Search Results

Search found 18728 results on 750 pages for 'setup deployment'.

Page 304/750

  • How to avoid compressing compressed files

    - by Gzorg
    Most compression programs compress all files by default. But when archiving a folder that contains already-compressed files, there is no need to compress them a second time: archives, packed setup programs, JPEGs, movies, MP3s, and so on. Are there any compression programs that let me give an arbitrary list of file types to be stored uncompressed while the others are still compressed? It looks like WinRAR can't. I expect this would be doable with tar + gzip/bzip2 and some scripting in various ways. Edit: WinRAR can
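
    Not part of the original question, but a minimal sketch of the kind of selective archiving being asked about: Info-ZIP's zip has an -n flag that stores files with the listed suffixes instead of compressing them (the archive and folder names here are only examples):

        zip -r -n ".jpg:.mp3:.avi:.zip:.rar" backup.zip my_folder/

    With tar + gzip/bzip2 this trick is not available in one step, because the whole tar stream is compressed as a single unit; a script would have to split already-compressed files out into a separate, uncompressed archive.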

    Read the article

  • MS SQL server: Single or multiple instances?

    - by Hugo Riley
    How costly (CPU- or memory-wise) is it to have multiple instances of SQL Server 2005 instead of only one instance with prefixed databases? A company has three application providers. Each will install one application, and each requires two or three databases. Should they all use the same instance, or should every provider use its own named instance? Is there any strong reason for one setup or the other?
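
    Not part of the original question, but a sketch of what the two layouts look like from a client's point of view (server, instance, and database names below are placeholders): with a single instance the providers are separated only by database name, while with named instances each provider gets its own server process.

        REM single default instance, databases prefixed per provider
        sqlcmd -S DBSERVER -d ProviderA_Billing

        REM one named instance per provider
        sqlcmd -S DBSERVER\PROVIDERA -d Billing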

    Read the article

  • Cloning a VM to add a new MySQL Slave

    - by Ben Holness
    I am in the process of adding a new slave to a replicated MySQL setup. All of the slave nodes are virtual machines. If I clone one of the nodes to a new VM, then start it with no networking, stop MySQL, change the server-id in my.cnf to a new id and then restart MySQL and networking, will it all work correctly, or will MySQL get confused because it used to be a different server id? OS: Ubuntu 10.10. VM platform: VMware 5. MySQL server version: 5.1.49-1ubuntu8.1-log (Ubuntu).
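
    Not part of the original question, but a minimal sketch of the steps described, assuming the Ubuntu/Debian my.cnf location and a new server-id of 3 (both are examples). On MySQL 5.1 the server-id in my.cnf is the main per-server identity to worry about; the per-server replication UUID only appeared in later versions.

        # on the clone, while networking is still disabled
        sudo service mysql stop
        sudo sed -i 's/^server-id.*/server-id = 3/' /etc/mysql/my.cnf
        sudo service mysql start

        # once networking is back, confirm replication picked up where the cloned node left off
        mysql -u root -p -e 'SHOW SLAVE STATUS\G'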

    Read the article

  • Getting VirtualBox to use dual monitors

    - by fnord_ix
    I am running Kubuntu Hardy Heron with a dual-monitor setup, and have VirtualBox on it running Windows XP in seamless mode. My problem is that I can't get VirtualBox to extend to the second monitor. Has anyone been able to achieve this, or does anyone know if it can be achieved?
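
    Not part of the original question, but in case it helps: more recent VirtualBox releases can expose more than one virtual monitor to the guest, configured per VM (the VM name below is a placeholder, and the guest needs the Guest Additions installed for seamless mode to span the extra monitor):

        VBoxManage modifyvm "Windows XP" --monitorcount 2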

    Read the article

  • apt-get commands pausing at 'Waiting for headers'

    - by Matt
    I have a VM running Ubuntu Server 9.10 with a basic web server setup. Whenever I run an apt command it pauses for around a minute at 'Waiting for headers...'. It eventually clears and continues as normal, but it is a bit of an annoyance. Everything else on the server seems to run fine. Any ideas?
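
    Not from the original post, but a hedged first thing to try: clear out any partially downloaded index files and turn off the incremental package-list diffs, which are a common cause of long pauses at 'Waiting for headers' (the config file name below is arbitrary):

        sudo rm -rf /var/lib/apt/lists/partial/*
        echo 'Acquire::PDiffs "false";' | sudo tee /etc/apt/apt.conf.d/99no-pdiffs
        sudo apt-get update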

    Read the article

  • Unable to locate existing XP system partition during Windows 7 upgrade install

    - by glenneroo
    I have Windows XP 32-bit installed on this computer. I just purchased Windows 7 64-bit as a downloadable ISO upgrade version, which I promptly burned to DVD and used to attempt an upgrade installation. Here is the error message I am getting: Firstly, where are these "Setup log files" located? Second, does this mean I need to find compatible (64-bit?) drivers for the mainboard and put them on a floppy?

    Read the article

  • Can multiple Windows users connect to a Mac Mini OS X Server and run applications in parallel?

    - by ilight
    I want to validate the following scenario: I have multiple users who have to use design applications like Adobe Photoshop, Illustrator, etc., and maybe some Mac-specific applications like iWork, and they need to be working in these applications in parallel. Can I set up a Mac Mini running OS X Server, create separate user accounts for these users, and have them log in to the OS X Server remotely and simultaneously from their Windows machines and use any application they want? In short, can they share the server's resources and applications from their Windows machines?

    Read the article

  • Is forwarding spam dangerous for my domain's reputation?

    - by Memiux
    I have Postfix with SpamAssassin and forward the mail (including spam) to gmail.com. My problem is that when I send "legitimate" email to gmail.com it is marked as spam. I've done everything the guidelines say, like signing with DKIM, setting up SPF for my domains, requiring authentication for outbound mail, etc. Now I wonder what I'm doing wrong?
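
    Not part of the original question, but for reference, the SPF setup mentioned above boils down to a single DNS TXT record on the sending domain; the domain and address below are placeholders:

        example.com.   IN TXT   "v=spf1 mx a ip4:203.0.113.10 -all"

    Note that a correct SPF record on your own domain does not cover spam you forward on behalf of other domains, which is usually the part that hurts a forwarding host's reputation.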

    Read the article

  • Nginx not working properly on subdomains

    - by javipas
    I've been trying to set up a SugarCRM instance. I've got a domain whose main site (www.domain.com) is on one server, and I've created a subdomain (sugar.domain.com), but I want this subdomain to be hosted on another server. This second server has nginx installed, and there's a working WordPress blog on a virtualhost there, so I need to set up a second site. To do this I've created the directory structure, and I've created an /etc/nginx/sites-enabled/sugar.domain.com configuration file containing the following:

        server {
            listen 80;
            server_name sugar.domain.com *.domain.com;
            access_log /var/www/sugar/log/access.log;
            error_log /var/www/sugar/log/error.log info;
            location / {
                root /var/www/sugar;
                index index.php;
            }
            location ~ .php$ {
                fastcgi_split_path_info ^(.+\.php)(.*)$;
                fastcgi_pass backend;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME /var/www/sugar/$fastcgi_script_name;
                include fastcgi_params;
                fastcgi_param QUERY_STRING $query_string;
                fastcgi_param REQUEST_METHOD $request_method;
                fastcgi_param CONTENT_TYPE $content_type;
                fastcgi_param CONTENT_LENGTH $content_length;
                fastcgi_intercept_errors on;
                fastcgi_ignore_client_abort on;
                fastcgi_read_timeout 180;
            }
            ## Disable viewing .htaccess & .htpassword
            location ~ /\.ht {
                deny all;
            }
        }
        upstream backend {
            server 127.0.0.1:9000;
        }

    As far as I know I need the *.domain.com parameter in server_name, but something is wrong here: I either get a 403 Forbidden error, or I get PHP code back (I can read the PHP source in the browser like normal text) that somehow is not executed. I've tried setting permissions to 755 inside the /var/www/sugar/ directory, and I've also set the owner and group with chown -R www-data:www-data /var/www/sugar/. The thing is, I don't know if my mistake is in the nginx site configuration, in my folder permissions, or somewhere else. Could it be because the main domain (www.domain.com) is hosted on another server? Do they necessarily have to be together?
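
    Not from the original post, but two quick checks that usually narrow this class of problem down (hostname and port are taken from the configuration above): confirm that something is actually listening on the 127.0.0.1:9000 backend that fastcgi_pass points at, and see which status code nginx itself returns for the subdomain:

        sudo netstat -lntp | grep ':9000'
        curl -s -o /dev/null -w '%{http_code}\n' -H 'Host: sugar.domain.com' http://127.0.0.1/index.php

    Raw PHP source being sent to the browser generally means the request never reached the FastCGI backend at all.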

    Read the article

  • Is it possible to install one instance of Trac for multiple independent projects?

    - by grigy
    I want to set up an SVN/Trac environment for multiple projects, something like GitHub. It will host multiple projects with multiple developers in each. For simplicity, the developers can be independent across projects. I want to set up this environment for every project automatically, from a registration page. What would you recommend? In particular, is it possible to do this with Trac?
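
    Not part of the original question, but a minimal sketch of what per-project automation with Trac could look like, assuming one Trac environment and one Subversion repository per project (all paths and names below are placeholders); a registration page could simply shell out to something like this:

        svnadmin create /var/svn/projectA
        trac-admin /var/trac/projectA initenv "Project A" sqlite:db/trac.db

    Depending on the Trac version, the repository then gets attached to the environment in trac.ini or via trac-admin's repository commands.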

    Read the article

  • Solr security help

    - by Camran
    I have Solr set up with Jetty on my Ubuntu server. Right now, from any computer, I can type my_ip:8983/solr/ and the page shows up for anybody. How can I restrict this so that only I can access that port and the Solr admin? Thanks
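
    Not from the original post, but a hedged sketch of one way to lock the port down with iptables, assuming you reach the server from a single known address (203.0.113.5 below is a placeholder):

        sudo iptables -A INPUT -p tcp --dport 8983 -s 127.0.0.1 -j ACCEPT
        sudo iptables -A INPUT -p tcp --dport 8983 -s 203.0.113.5 -j ACCEPT
        sudo iptables -A INPUT -p tcp --dport 8983 -j DROP

    Alternatively, if only local applications need Solr, binding the Jetty that ships with Solr to the loopback interface (typically by setting jetty.host to 127.0.0.1) keeps the port off the network entirely.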

    Read the article

  • Node.js Production Server and Ubuntu Users

    - by baffonero
    I'm setting up a production server on Ubuntu 10.04 using this technology stack: Node.js; nginx to serve static content; MongoDB; Redis; Upstart for running the applications as services; Monit for monitoring the Node applications and the nginx server. The server will host only 5 applications of this type, nothing else. How would you set up the Ubuntu users? Is it a good idea to create a user per application? Would you install the software (node, mongo, ...) as root or as user(s)? Thanks in advance
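
    Not part of the original question, but a minimal sketch of the one-user-per-application layout under the stated stack (every name and path below is a placeholder): the runtimes are installed system-wide by root, each application gets a locked-down system user, and the Upstart job drops privileges to that user. Upstart on 10.04 predates the setuid stanza, so the job does the privilege drop itself:

        sudo adduser --system --group --home /srv/app1 --shell /bin/false app1

        # /etc/init/app1.conf
        description "app1 node service"
        start on (local-filesystems and net-device-up IFACE!=lo)
        stop on runlevel [016]
        respawn
        exec su -s /bin/sh -c 'exec /usr/local/bin/node /srv/app1/server.js >> /var/log/app1.log 2>&1' app1

    Monit can then watch each Node process the same way it watches nginx.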

    Read the article

  • Networking is broken in my VMware 10.0 VMs after upgrading to Windows 8.1

    - by Michael Geary
    I have a VMware Workstation 10.0 installation with several virtual networks including the default host-only and NAT networks. After upgrading to Windows 8.1, the NAT network was not working. I booted an Ubuntu VM with the default network setup that was previously working, and it sat for a long time during startup saying it was waiting for the network. After it finally started up, an ifconfig showed no IP address for eth0. How can I fix the broken network?
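
    Not part of the original question, and only a hedged guess: one inexpensive first check after an in-place Windows upgrade is whether the Workstation networking services are still running on the host, and to restart them if not. The display names below are the ones a typical Workstation 10 install uses and may differ on other setups (run from an elevated command prompt):

        net stop "VMware DHCP Service"
        net stop "VMware NAT Service"
        net start "VMware NAT Service"
        net start "VMware DHCP Service"

    If that doesn't help, "Restore Defaults" in the Virtual Network Editor recreates the VMnet adapters that the upgrade may have removed.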

    Read the article

  • Copy an existing OS X install from another drive

    - by Kevin Burke
    I just bought a new solid-state drive, and I'd like to copy all of the files and setup from my current Mac OS X hard drive onto it. What is the best way to do this? I have a 1 TB external hard drive, my drive backed up with Time Machine, and Snow Leopard as a DMG, but no external mount for the SSD, and no install DVD (it's at my parents' house, promise). I'm familiar with the command line and with booting Mac OS X from a hard drive.
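
    Not part of the original question, but a minimal command-line sketch of one route, assuming the SSD is already installed internally and the machine is booted from another startup volume; the device identifiers below are placeholders, so check them with diskutil first:

        diskutil list
        sudo asr restore --source /dev/disk0s2 --target /dev/disk1s2 --erase

    asr (Apple Software Restore) block-copies one volume onto another, which preserves the installed system and settings; Disk Utility's Restore tab performs the same operation with a GUI.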

    Read the article

  • VPC SSH port forward into private subnet

    - by CP510
    OK, so I've been racking my brain for DAYS on this dilemma. I have a VPC setup with a public subnet and a private subnet. The NAT is in place, of course. I can SSH into an instance in the public subnet, as well as into the NAT. I can even SSH to the private instance from the public instance. I changed the SSHD configuration on the private instance to accept both port 22 and an arbitrary port number, 1300. That works fine. But I need to set it up so that I can connect to the private instance directly using the 1300 port number, i.e. ssh -i keyfile.pem [email protected] -p 1300, and 1.2.3.4 should route it to the internal server 10.10.10.10. Now I heard iptables is the tool for this, so I went ahead and researched and played around with some routing. These are the rules I have set up on the public instance (not the NAT). I didn't want to use the NAT for this, since AWS apparently pre-configures the NAT instances when you set them up, and I heard using iptables can mess that up.

        *filter
        :INPUT ACCEPT [129:12186]
        :FORWARD ACCEPT [0:0]
        :OUTPUT ACCEPT [84:10472]
        -A INPUT -i lo -j ACCEPT
        -A INPUT -i eth0 -p tcp -m state --state NEW -m tcp --dport 1300 -j ACCEPT
        -A INPUT -d 10.10.10.10/32 -p tcp -m limit --limit 5/min -j LOG --log-prefix "SSH Dropped: "
        -A FORWARD -d 10.10.10.10/32 -p tcp -m tcp --dport 1300 -j ACCEPT
        -A OUTPUT -o lo -j ACCEPT
        COMMIT
        # Completed on Wed Apr 17 04:19:29 2013
        # Generated by iptables-save v1.4.12 on Wed Apr 17 04:19:29 2013
        *nat
        :PREROUTING ACCEPT [2:104]
        :INPUT ACCEPT [2:104]
        :OUTPUT ACCEPT [6:681]
        :POSTROUTING ACCEPT [7:745]
        -A PREROUTING -i eth0 -p tcp -m tcp --dport 1300 -j DNAT --to-destination 10.10.10.10:1300
        -A POSTROUTING -p tcp -m tcp --dport 1300 -j MASQUERADE
        COMMIT

    So when I try this from home, it just times out. No connection-refused messages or anything. And I can't seem to find any log messages about dropped packets. My security groups and ACL settings allow communication on these ports in both directions, in both subnets and on the NAT. I'm at a loss. What am I doing wrong?
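
    Not from the original question, but a hedged observation about what this kind of DNAT relay additionally depends on: the public instance has to be allowed to forward packets at all, and on EC2 its source/destination check has to be disabled before it may pass traffic on behalf of another host (that check is toggled per instance in the AWS console). Checking and enabling forwarding looks like this:

        sysctl net.ipv4.ip_forward              # 0 means DNAT'd packets are silently dropped
        sudo sysctl -w net.ipv4.ip_forward=1    # enable forwarding for the running kernel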

    Read the article

  • How to create a ghost environment?

    - by manwood
    I want to create a ghost or image of a fresh Windows XP development environment with all the various bits of software installed and ready to go, so that when the OS gets clogged up or the main disk fails I can simply restore the image rather than having to run through the entire install and setup process all over again. What is the best way to go about doing this? Cheers.
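
    Not part of the original question, but a minimal sketch of one free way to do it, assuming the machine can be booted from a Linux live CD and an external drive is mounted at /mnt/backup (device names below are placeholders):

        # capture the whole XP disk into a compressed image
        sudo dd if=/dev/sda bs=4M | gzip > /mnt/backup/xp-dev-image.img.gz

        # restore it later onto the same (or an identically sized) disk
        gunzip -c /mnt/backup/xp-dev-image.img.gz | sudo dd of=/dev/sda bs=4M

    Dedicated imaging tools such as Clonezilla (free) or the commercial Norton Ghost do the same job with smarter handling of partitions and empty space.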

    Read the article

  • Do managed/unmanaged switches really need their own cabinets/racks?

    - by Googooboyy
    Hi guys, I'm new to managed switches and I recently engaged an IT consultant to set up my new office network. For the record, the new office network spans two floors, so he recommended using ONE managed switch housed in a rack/cabinet, with the rest being unmanaged switches. Anyway, my main question is: does a managed switch really need to be housed in a rack, or in its own cabinet, to function efficiently? Also, what are the pros and cons of having a rack/cabinet?

    Read the article

  • rsync not writing files

    - by Cyrcle
    I'm trying to set up rsync to back up a remote directory to my local drive. I cd to the directory that I want to pull the files into, then I enter: rsync -vrtW [email protected]:~/public_html I enter the password, then it starts running. I get all the files listed, but none of them actually transfer. What am I missing? Thanks
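
    Not from the original post, but a hedged observation: rsync invoked with a source and no destination only lists the source files, which matches the behaviour described above. Adding the current directory as the destination (the user and host below are placeholders) makes it actually copy:

        rsync -vrtW user@remotehost:~/public_html .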

    Read the article
