Search Results

Search found 30217 results on 1209 pages for 'website performance'.

  • Can't make virtual host work

    - by sica07
    I have to create a virtual host on a server which previously hosted a single website (domain name). Now I'm trying to add a second domain on this server (using the same nameserver). What I've done so far: initially there was no virtual host, so I made one for the second domain:

        NameVirtualHost *:80
        <VirtualHost *:80>
            DocumentRoot /var/www/blabla
            ServerName www.blabla.com
            ServerAlias blabla.com
            <Directory /var/www/blabla>
                Order deny,allow
                Allow from all
                AllowOverride All
            </Directory>
        </VirtualHost>

    Because nothing happened, I changed the DocumentRoot of the Apache server to /var/www (initially it was the document root of the first website, /var/www/html) and created a virtual host for the first domain too:

        <VirtualHost *:80>
            DocumentRoot /var/www/html
            ServerName www.first.com
            ServerAlias first.com
            <Directory /var/www/html>
                Order deny,allow
                Allow from all
                AllowOverride All
            </Directory>
        </VirtualHost>

    In this case first.com is working OK, but blabla.com is not. When I ping blabla.com I get an "unknown host" response. What am I doing wrong? Do I have to modify something in the DNS settings too? Thank you.
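
    The "unknown host" from ping suggests the problem is DNS rather than Apache: no A record exists for the new name yet, so requests never reach the virtual host at all. A quick check from any shell (dig ships with bind-utils/dnsutils; the nameserver name below is a placeholder):

        # Does the new name resolve at all?
        dig +short blabla.com            # should print the server's public IP
        # Ask your own nameserver directly (hypothetical name):
        dig +short blabla.com @ns1.example-dns.com
        # Compare with the domain that already works:
        dig +short first.com

    If the first command prints nothing, add an A record for blabla.com (and www.blabla.com) pointing at the same IP first.com resolves to, then wait for propagation.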

  • Disable CrossFire on an ATI HD 5970

    - by cdecker
    I'm trying to create a computation cluster using ATI GPUs. The problem is that I have to disable CrossFire to get maximum performance out of the ATI Radeon HD 5970 I bought to get started, but no matter what I do I can't disable it. I'd like to run an OpenCL application on the two cores in parallel, but right now they just interfere with each other. Any idea how to do this under Ubuntu Linux, with ATI Catalyst 10.10?
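
    A few things worth trying from a terminal. Note the --crossfire flag is an assumption here: some later Catalyst releases expose it, but it may not exist in 10.10, so check aticonfig --help first:

        # Confirm the driver sees both halves of the 5970:
        sudo aticonfig --list-adapters
        # The GUI toggle lives in Catalyst Control Center:
        sudo amdcccle
        # Assumed flag on newer Catalyst versions (verify it exists on yours):
        sudo aticonfig --crossfire=off

    Even with CrossFire enabled, the OpenCL runtime often enumerates the two cores as separate devices, so explicitly selecting both devices via clGetDeviceIDs in your application may sidestep the problem entirely.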

  • Need help with some IIS7 web.config compression settings

    - by Pure.Krome
    Hi folks, I'm trying to configure my IIS7 compression settings in my web.config file. I'm trying to enable HTTP 1.0 requests to be gzipped. MSDN has all the info about it here. Is it possible to have this config info in my own website's web.config file? Or do I need to set it at an application level? Currently, I have this code in my web.config:

        <system.webServer>
            <urlCompression doDynamicCompression="true"
                            dynamicCompressionBeforeCache="true" />
            <httpCompression cacheControlHeader="max-age=86400"
                             noCompressionForHttp10="False"
                             noCompressionForProxies="False"
                             sendCacheHeaders="true" />
            ... other stuff snipped ...
        </system.webServer>

    It's not working :( HTTP 1.1 requests are getting compressed, just not 1.0. That MSDN page above says that it can be used in:

        Machine.config
        ApplicationHost.config
        Root application Web.config
        Application Web.config
        Directory Web.config

    So, can we set these settings on a per-website basis, programmatically, in a web.config file? (This is an Application Web.config file...) What have I done wrong? cheers :)

    EDIT: I was asked how I know HTTP 1.0 is not getting compressed. I'm using the Failed Request Tracing Rules, which report back:

        DYNAMIC_COMPRESSION_START
        DYNAMIC_COMPRESSION_NOT_SUCESS
            Reason: 3
            Reason: NO_COMPRESSION_10
        DYNAMIC_COMPRESSION_END
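
    One likely explanation: <httpCompression> is normally locked at the server level (defined in applicationHost.config with delegation denied), so a site or application web.config silently cannot change noCompressionForHttp10; only <urlCompression> is delegated by default. On the assumption that's the lock being hit, a sketch of setting it server-wide with appcmd from an elevated prompt instead:

        %windir%\system32\inetsrv\appcmd.exe set config ^
            -section:system.webServer/httpCompression ^
            /noCompressionForHttp10:"False" /commit:apphost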

  • Can a SATA hard drive image be restored onto an SSD?

    - by Bryan Parker
    I'm currently using a SATA hard drive on my primary dev machine, but I'm planning to upgrade to an SSD at some point soon. I use TrueImage on a regular basis to make backups, and to upgrade my hard drive without reinstalling everything. Will I be able to restore the image to an SSD and boot from it? Will there be a performance hit or other issues to watch out for?

  • Monitoring MySQL SELECT/WRITE/UPDATE/SLOW queries in Nagios

    - by imaginative
    There are ways to get performance graphs with several monitoring software packages out there, such as ZenOSS. There's a plugin available that will graph MySQL SELECT/WRITE/SLOW queries in a nice RRD-style graph. I'm curious whether there is a way to get similar graphs in Nagios 3.0. I know Nagios has tools like pnp and can integrate RRD, but is there something readily available that can plug in to monitor those MySQL specifics?
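
    If nothing off-the-shelf fits, a homegrown check that emits the counters as Nagios perfdata is only a few lines, and pnp can then graph it like any other service. A minimal sketch, with placeholder credentials and no thresholds:

        #!/bin/bash
        # check_mysql_queries.sh - emit MySQL query counters as Nagios perfdata
        H=localhost; U=nagios; P=secret   # placeholder credentials
        get() { mysql -h"$H" -u"$U" -p"$P" -NBe \
            "SHOW GLOBAL STATUS LIKE '$1'" | awk '{print $2}'; }
        sel=$(get Com_select); ins=$(get Com_insert)
        upd=$(get Com_update); slow=$(get Slow_queries)
        # The 'c' suffix marks each value as an ever-increasing counter
        echo "MYSQL OK - ${sel} selects, ${slow} slow | select=${sel}c insert=${ins}c update=${upd}c slow=${slow}c"
        exit 0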

  • Set up site folders on Apache and PHP

    - by Cobus Kruger
    I'm trying to set up my first Apache server on my Windows PC at home, and I'm having real trouble finding out which configuration settings go where. I downloaded and installed XAMPP, which seemed to get everything nicely set up, and I can see a working website on http://localhost. So far so good.

    The point of this is to develop a website, of course, and to make my life easier (irony?) I wanted to let the website root point to my Eclipse project folder. So I opened httpd-vhosts.conf, uncommented a VirtualHost block and changed its DocumentRoot to my local path. Now when I try to load http://localhost I get a 403 (Access denied) error. So where do I configure permissions for my folder? And is that all I need to let my site run from the folder specified, or am I going to have to clear another hurdle?

    Update: I tried to simplify things a little, so I reinstalled XAMPP and got back to a working http://localhost. Then I confirmed that httpd-vhosts.conf is included in httpd.conf and made the following changes to httpd-vhosts.conf:

        Uncommented the line NameVirtualHost *:80
        Added the virtual host shown below.

    Restarted Apache and saw the expected page on http://localhost:

        <VirtualHost *:80>
            DocumentRoot "C:/xampp/htdocs/"
            ServerName localhost
            ErrorLog "logs/dummy-host2.localhost-error.log"
            CustomLog "logs/dummy-host2.localhost-access.log" combined
        </VirtualHost>

    I then created a new folder named C:\testweb, added an index.html file, and changed the DocumentRoot line shown above. For all intents and purposes I would then expect the two configurations to be equivalent. But this setup gives me an error 403. Even though the C:\testweb folder already had the same permissions as the C:\xampp\htdocs folder, I went further and gave the Everyone group full control of C:\testweb, and got exactly the same problem. So what did I miss?
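
    Most likely the missing piece is a <Directory> block: XAMPP's httpd.conf grants access to C:/xampp/htdocs but denies everything else by default, so a DocumentRoot outside it needs its own grant regardless of Windows file permissions. A sketch in the Apache 2.2 syntax that era of XAMPP shipped with (on Apache 2.4, use "Require all granted" instead):

        <VirtualHost *:80>
            DocumentRoot "C:/testweb"
            ServerName localhost
            <Directory "C:/testweb">
                Order allow,deny
                Allow from all
                AllowOverride All
            </Directory>
        </VirtualHost>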

  • RAID options for a LAMP web server

    - by jetboy
    I'm due to set up a LAMP web server with four drives and a RAID controller. The drives are 146 GB SAS, and the machine has two quad-core processors and 16 GB RAM. There will be very few write operations to the MySQL database, and I'll be using as much caching as possible to reduce disk I/O. The question is: would I be better off splitting the drives into two RAID 1 arrays, separating sequential and random disk I/O, or would I get better overall performance putting them all in a single RAID 1+0 array?

  • High Apache CPU usage, but low nginx - Configured correctly?

    - by Buckers
    We've just moved one of our websites over to a brand new high-spec Linux server (1x Intel Xeon E3-1230 v2 @ 3.30GHz, 8GB DDR3 ECC, 2x 128GB SATA SSD RAID1). The server has been configured to use nginx, but we're not sure if it's working correctly. The site always loads very fast for us (http://www.onedirection.net), but Plesk often sends us reports that the Apache CPU usage percentage reaches high levels, yet when we look at the nginx percentage it's always very low. We've come from a Windows background, so we are very new to Linux, but shouldn't nginx run INSTEAD of Apache? Here's a screenshot from Plesk showing the CPU usage: http://www.pixelkicks.co.uk/_download/plesk.JPG The website gets around 20,000 visitors per day, and we use W3 Total Cache to get it running as fast as possible. MySQL has been optimised well. Memory usage is only 2GB of the 8GB. Does this look right? How can we tell that nginx is doing most of the work? Thanks, Chris.
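
    Under Plesk, nginx normally runs in front of Apache as a reverse proxy, serving static files itself and forwarding the rest, so seeing CPU time on both is expected, and with a PHP-heavy site Apache doing most of the work is normal. A quick way to confirm the layering from a shell (7080 is Plesk's usual Apache back-end port, but yours may differ):

        # Who is bound to which port? Expect nginx on :80, Apache behind it:
        netstat -tlnp | grep -E ':80 |:7080'
        # A static file's Server header should come back as nginx:
        curl -sI http://www.onedirection.net/robots.txt | grep -i '^Server:'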

  • How to tune Windows Server 2008 R2 to handle many connections?

    - by invisal
    I have been trying to figure out how to solve this problem for a few days now. First of all, I am running a website with an average of 350,000 page views daily. Previously, both ads management (tracking the clicks and impressions each ad has served) and content were served from a single server with the following spec:

    Server 1
        OS: Windows 2008 R2 64-Bit
        CPU: Intel® Core™ i5 - 4 cores
        RAM: 8 GB
        Storage: 2 x 1 TB hard drives
        Bandwidth: 10 TB per month

    To improve our website speed, I decided to move the ads management script to another dedicated server, because we have 15 to 30 ads on each page.

    Server 2
        OS: Windows 2008 R2 64-Bit
        CPU: Intel® Core™ i5 - 4 cores
        RAM: 4 GB
        Storage: 2 x 300 GB hard drives
        Bandwidth: 10 TB per month

    The problem: Server 1 could handle both the content and the ads system. Now that I have moved the ads system to Server 2, Server 2 can barely serve the ads system alone.

    Test: first, I moved 75% of the ads to Server 2 and pinged the server (ping -t xxxxx) for 10 minutes; it followed a pattern similar to this:

        Reply from xxxxx bytes=32 time=290ms TTL=116
        Reply from xxxxx bytes=32 time=289ms TTL=116
        Reply from xxxxx bytes=32 time=320ms TTL=116
        Reply from xxxxx bytes=32 time=286ms TTL=116
        Reply from xxxxx bytes=32 time=286ms TTL=116
        Reply from xxxxx bytes=32 time=348ms TTL=116
        Reply from xxxxx bytes=32 time=284ms TTL=116

    Then I moved 100% of the ads to Server 2 and pinged again for 10 minutes:

        Reply from xxxxx bytes=32 time=290ms TTL=116
        Request timed out
        Reply from xxxxx bytes=32 time=320ms TTL=116
        Reply from xxxxx bytes=32 time=286ms TTL=116
        Request timed out
        Request timed out
        Reply from xxxxx bytes=32 time=284ms TTL=116

    Attempts so far:
        Increased MaxUserPort and TcpNumConnections
        Restarted the server
        Increased IIS Max Instances and Instance MaxRequests

    Server resources:
        Only 10%-15% of the network connection is used
        Only 10%-15% of the CPU is used
        Only 25% of the memory is used
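
    For reference, a sketch of the registry side of those attempts, with commonly suggested values rather than prescriptions (a reboot is required for the Tcpip parameters to take effect):

        reg add HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters ^
            /v MaxUserPort /t REG_DWORD /d 65534 /f
        reg add HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters ^
            /v TcpTimedWaitDelay /t REG_DWORD /d 30 /f
        :: On Server 2008 R2 the dynamic port range is better set via netsh:
        netsh int ipv4 set dynamicport tcp start=10000 num=55535

    Given that CPU, memory and bandwidth all sit well below capacity while pings time out, port or connection exhaustion looks like a more plausible bottleneck than raw server capacity.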

  • Script Editor in Snow Leopard painfully slow after adding apps to Library

    - by Kio Dane
    I have four different Macs that I use from time to time, and on each of them I notice a constant: adding more items to AppleScript Editor's Library window slows the performance of mundane operations (opening a dictionary, switching between the Library window and an editor window, scrolling in the Library window, etc.). In Leopard I noticed little to no latency when opening a dictionary in Script Editor, but Snow Leopard's AppleScript Editor kills my productivity by making me wait during most UI interactions with the Library window.

  • Keep IIS7 Failed Request Tracing as a sysadmin only diagnostic tool?

    - by Kev
    I'm giving some of our customers the ability to manage their sites via IIS Feature Delegation and IIS Manager for Remote Administration. One feature I'm unsure about permitting access to is Failed Request Tracing, for the following reasons:

        Customers will forget to turn it off.
        The server will take a performance hit (especially if 500 sites all have it turned on).
        The server will become littered with old FRTs.
        There is the potential to leak sensitive information about how the server is configured, thus providing useful information to would-be intruders.

    Should we just keep this as a troubleshooting tool for our own admins?
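
    If you do end up delegating it, the litter can at least be capped per site: the site-level traceFailedRequestsLogging element accepts a maxLogFiles limit, after which IIS discards the oldest traces. A sketch from applicationHost.config; the site name, id and cap are illustrative:

        <site name="example-site" id="42">
            <traceFailedRequestsLogging enabled="true"
                directory="%SystemDrive%\inetpub\logs\FailedReqLogFiles"
                maxLogFiles="20" />
        </site>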

  • WGet or cURL: Mirror Site from http://site.com And No Internal Access

    - by alharaka
    I have tried wget -m, wget -r, and a whole bunch of variations. I am getting some of the images on http://site.com, one of the scripts, and none of the CSS, even with the fscking -p parameter. The only HTML page is index.html, and there are several more referenced, so I am at a loss. curlmirror.pl on the cURL developers' website does not seem to get the job done either. Is there something I am missing? I have tried different levels of recursion with only this URL, but I get the feeling I am missing something. Long story short, a school allows its students to submit web projects, and they want to know how they can collect everything for the instructor who will grade it, instead of him going to all the externally hosted sites.

    UPDATE: I think I figured out the issue. I thought the links to the other pages were in the index.html page that downloaded. I was way off. Turns out the footer of the page, which has all the navigation links, is handled by a JavaScript file Include.js, which reads JLSSiteMap.js and some other JS files to do page navigation and the like. As a result, wget does not pick up the other dependencies, because a lot of this crap is handled in scripts rather than on the web pages themselves. How can I handle such a website? This is one of several problem cases. I assume little can be done if wget cannot parse JavaScript.
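
    For the static portion, a fuller flag set than -m or -r alone is worth a try; a sketch using only standard wget options (site.com is the placeholder from the question):

        # Mirror the pages plus every image/CSS/JS a page references,
        # rewriting links so the copy browses locally:
        wget --mirror --page-requisites --convert-links \
             --adjust-extension --no-parent http://site.com/

    That still won't capture resources pulled in at runtime by Include.js and JLSSiteMap.js, though: wget does not execute JavaScript, so script-driven navigation has to be crawled with a headless browser or fetched by hand.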

  • What applications can be used in a Red Hat/CentOS cluster?

    - by Sandra
    Hi, When I look at the Red Hat cluster manuals, they only explain how to install it, not what applications can use it. I am new to clusters, so I don't know these things =) Let's say I want a 3-node high-performance cluster; what applications would work with it? Also, how does an application talk to the cluster? Does the application need to have been written to support clusters? Sandra

  • Alternative filesystems for an SSD

    - by freedrull
    I am tired of watching fsck check my filesystem when my eeepc 901 shuts down abruptly due to a crash. I know that with a journaling filesystem I won't have to wait for a check. However, I am well aware of the poor I/O performance of this SSD, so I can imagine a journaling filesystem being even more frustrating, since there will be constant writes to the journal. I will buy a new laptop without such a crummy SSD someday, but is there anything I can do now, on the software side of things?
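
    One low-risk middle ground is ext4 with mount options that cut down on small writes while keeping the journal (so no more post-crash fsck). A sketch of an /etc/fstab line; the UUID is a placeholder, and commit=60 trades up to a minute of recent data in a crash for fewer journal flushes:

        # /etc/fstab
        UUID=xxxx-xxxx  /  ext4  noatime,commit=60,errors=remount-ro  0  1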

  • New drivers, now switchable graphics won't work

    - by Glenn Andersson
    I have an ASUS notebook with dual AMD/ATI graphics cards. Previously I was able to switch individual programs in the Vision Control Center between high performance and energy saving. But today I updated the drivers to version 12.10 using AMD Mobility, and now every time I try to configure switchable graphics, it either shuts down or does not come up at all. I do have a new menu item called Global Switchable Graphics settings, though. Any thoughts on this?

  • Intel i7 vs Xeon quad-core processors?

    - by jasondavis
    I know Xeon processors have been around for a long time and are mostly used in servers, but I am curious: why do people not use Xeons in high-performance desktops? As far as I know, the best desktop processors out there now are the i7 line. The i7s and Xeons are both quad-core processors; what is the main difference between them? I just saw that the Mac Pros are using quad-core Xeons instead of i7s.

  • Apache doesn't immediately notice a change in the document root

    - by Tom
    We use Capistrano for website deployments, and our Apache document root is a symlink to a particular code release. The deployment procedure switches the symlink from the old release to the new release as the final step of the deployment. We are migrating our webservers from real servers running RHEL 5.6 to Amazon EC2 virtual machines running Ubuntu 11.10, and the new servers are suffering from a problem where Apache doesn't immediately notice the change to its document root when the symlink is switched. It can take a second or so (and I think I've even seen it take a couple of minutes). It's kind of like Apache has cached the physical path of the symlink for some time. Does anyone know some Apache settings I could look at to get it to "scan" for changes to its served files more quickly?

    Thoughts:

        I read that the disks on virtual machines are much slower (since they are network-attached storage). Perhaps the filesystem cache somehow works differently too? If so, is there anything that can be done?

        The website runs PHP code. Perhaps there are some PHP config differences between RHEL and Ubuntu? I checked realpath_cache_ttl, but both servers have it commented out:

            ; Duration of time, in seconds for which to cache realpath information for a given
            ; file or directory. For systems with rarely changing files, consider increasing this
            ; value.
            ; http://www.php.net/manual/en/ini.core.php#ini.realpath-cache-ttl
            ;realpath_cache_ttl = 120

        We do use the APC opcode cache but don't think it's the issue, due to experimentation. The PHP code is in different file paths for each deployment, and we ensure stat=1.

        Here is a similar question that is very interesting: 294107 - but it doesn't provide an answer for me.

    One solution would be to reload Apache every time we modify the document root symlink. I'll do this if we can't find another solution.
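
    Two things worth checking. First, a commented-out realpath_cache_ttl does not disable the cache: PHP falls back to the default of 120 seconds, which matches the delays described, so explicitly lowering it (or clearing the cache on deploy) is worth an experiment. Second, how the symlink is replaced matters: ln -sfn unlinks and recreates the link, whereas a rename is atomic. A sketch of the usual swap, with placeholder paths:

        # Build the new link under a temporary name, then rename it over the
        # old one -- rename() is atomic, so no request sees a dangling root:
        ln -s /var/www/releases/20120101 /var/www/current_tmp
        mv -T /var/www/current_tmp /var/www/current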

  • How does SVN work with Apache?

    - by ajsie
    I've got Ubuntu installed with LAMP. I'm using WebDAV to upload/download files to/from the Ubuntu web server after I have edited the PHP source files in NetBeans. However, I wonder what the best practice is for editing source files and committing these changes to the new website. Since we are 2-3 developers, I guess we have to use SVN, but I have never used it before, so I wonder how it works. Should I install it and then select /var/www (Apache's web root) as the repository folder? Then when I check in, will all the changes apply immediately? Could someone please explain the following steps: how to download the source files, edit them, upload them, and see the new changes on the website? I have only worked with a local Apache before, and it was only me. Now there will be some more programmers, so I have to set up a decent, central environment for this, and I need to know how NetBeans, SVN, WebDAV and Apache all work together. Thanks!
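
    A sketch of the usual layout: the repository lives outside the web root, each developer works in their own checkout, and the live site is itself a working copy that gets updated after commits (paths and URLs are illustrative):

        # One-time, on the server -- repository outside /var/www:
        sudo svnadmin create /srv/svn/mysite
        svn import /var/www file:///srv/svn/mysite/trunk -m "initial import"

        # Each developer, on their own machine:
        svn checkout svn+ssh://server/srv/svn/mysite/trunk mysite
        # ...edit in NetBeans (it has built-in Subversion support), then:
        svn commit -m "describe the change"

        # Deploy by updating the live working copy:
        cd /var/www && svn update    # often automated via a post-commit hook

    So changes do not apply immediately on check-in; they appear when the live copy is updated, which is exactly the control you want with several developers.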

  • A separate user for each task?

    - by Mark Tomlin
    I just got a VPS server the other day. I'm new to server administration, but not that new to Ubuntu (11.04); I use it in my living room as the HTPC, and I had a previous VPS that I used on and off for a TeamSpeak server. This one I'm setting up for long-term use. So I would like to know the best practice when it comes to the websites and tasks I have the server performing. I understand that it could be beneficial to separate each website into its own user group or under its own username. I would set up nginx so that it could read all of the users' directories (and thus each website) but could not touch anything else. The same goes for TeamSpeak: should I make a user for it so that it operates within its own confined area, or is this overkill? I do have root access on the server, and my current plan is to run about 4 websites and a TeamSpeak server. My stack is Linux (Ubuntu 11.04 LTS), nginx, and PHP 5.4.3 (using the built-in PDO SQLite 3 driver for the database). Should PHP have its own user group, or is it OK to place it in with nginx?
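
    Assuming PHP is served through PHP-FPM (the usual companion to nginx), a common pattern is one system user per site plus a matching FPM pool, so PHP for one site runs as that site's user and cannot read its neighbours, while nginx only needs read access to the document roots. A sketch with placeholder names and paths:

        # Create an unprivileged user/group for one site:
        sudo adduser --system --group --home /srv/site1 site1

    and the matching pool definition (e.g. /etc/php5/fpm/pool.d/site1.conf):

        [site1]
        user = site1
        group = site1
        listen = /var/run/php5-fpm-site1.sock
        listen.owner = www-data
        listen.group = www-data

    The same logic applies to TeamSpeak: a dedicated user costs nothing and confines the process, so it is not overkill.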

  • Tomcat service start issue

    - by Srinivasan
    Hi, I have installed Tomcat as a service successfully. When I start it from Control Panel - Administrative Tools - Services, it throws an error like this:

        The Apache Tomcat 4.1 service on Local Computer started and then stopped.
        Some services stop automatically if they have no work to do, for example,
        the Performance Logs and Alerts service.

    What is the error?
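
    A first step worth trying: the Windows service wrapper usually records the real startup failure in Tomcat's log directory rather than in that generic message. The paths below assume a default Tomcat 4.1 install and are illustrative; adjust them to yours:

        :: The wrapper and Tomcat logs often name the underlying exception:
        type "C:\Program Files\Apache Group\Tomcat 4.1\logs\stderr.log"
        :: A missing or wrong JVM location is a common cause -- verify:
        echo %JAVA_HOME%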

  • There are lots of "Core i" CPUs, but Dell only offers a few -- who builds systems with the others?

    - by Jesse
    PassMark shows many varieties of Core i3, i5, and i7 CPUs. Some of them, even at similar prices, are much faster than others. But Dell only offers a few options, and they're not the fast ones. For example, Dell offers the Core i5 650 (benchmark), which costs $220 and doesn't come close to the performance of the Core i3-2100 (benchmark), which costs $120. Does anyone sell systems with the faster, cheaper chips?
