Search Results

Search found 6231 results on 250 pages for 'slow diver'.

  • Throttle connections to web service if load gets too high?

    - by Joseph Turian
    I have a web site that communicates via XMLRPC with an XMLRPC server web service. (The web service is written in Python using xmlrpclib.) I believe that xmlrpclib will block while it is handling one request. So if there are three users with an xmlrpclib request ahead of you, your response takes four times as long. How do I handle it if I receive too many XMLRPC requests and the web service gets bogged down and has slow response time? If I am getting slashdotted, my preferred behavior is that the first users get good response times and everyone else is told to come back later. I think this is superior to giving everyone terrible response times. How do I create this behavior? Is this called load-balancing? I am not actually balancing though, until I have multiple servers.
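
    One way to get the "first users get good response times, everyone else is told to come back later" behaviour is to handle each request in its own thread and shed load once too many calls are in flight. The sketch below is a minimal Python 3 illustration under stated assumptions (the asker's service uses Python 2's xmlrpclib; MAX_IN_FLIGHT, the 503 fault code and slow_method are made up for the example), not the service's actual code:

        import functools
        import threading
        from socketserver import ThreadingMixIn
        from xmlrpc.server import SimpleXMLRPCServer
        from xmlrpc.client import Fault

        MAX_IN_FLIGHT = 3                          # assumed limit on concurrent calls
        _slots = threading.Semaphore(MAX_IN_FLIGHT)

        class ThreadedXMLRPCServer(ThreadingMixIn, SimpleXMLRPCServer):
            """Serve each XML-RPC request in its own thread instead of serially."""
            pass

        def shed_load(func):
            """Run func if a slot is free; otherwise fail fast so the client can retry later."""
            @functools.wraps(func)
            def wrapper(*args):
                if not _slots.acquire(blocking=False):
                    raise Fault(503, "Server busy, please come back later")
                try:
                    return func(*args)
                finally:
                    _slots.release()
            return wrapper

        @shed_load
        def slow_method(x):
            return x * 2                           # stand-in for the real, expensive call

        if __name__ == "__main__":
            server = ThreadedXMLRPCServer(("0.0.0.0", 8000), allow_none=True)
            server.register_function(slow_method)
            server.serve_forever()

    With this in place, callers beyond the limit get an immediate "busy" fault they can retry, instead of queueing behind everyone else.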

  • How backward compatible are the HSDPA mobile networks

    - by Chris Kimpton
    Hi, I have got this Huawei wifi device, which has been unlocked for other networks. It works fine in the UK on Vodafone (as well as 3). We are trying to get it to work with the Claro network in Jamaica. It connects and stays connected, but fails to get a 3G connection, only the slow EDGE one. Claro support say it's because Claro currently does not support the 2100MHz frequency for 3G, which is what the device uses. Does that sound correct? They say I need a device that can use the 850MHz frequency. My understanding was that the device supports everything up to 2100MHz, including their 850MHz band... I am thinking that maybe the APN is incorrect, but I have set it to the only value I can find on the net, namely: internet.ideasclaro.com.jm Thanks in advance, Chris

  • site timing out when under heavy load

    - by naunu
    My client sends out eblasts at 8am Monday/Wednesday/Friday. Between 8:15 and 8:45 the site becomes extremely slow and many users' sessions time out. My setup: MediaTemple VE, 2GB dedicated RAM (3GB burst), Ubuntu 9.10, Apache2-mpm-worker, PHP5.3-fcgi, MySQL 5. I recently tried to remedy the problem by switching from apache2-mpm-prefork to mpm-worker, but am still having the same issues. My Apache settings are: Timeout 100 KeepAlive On MaxKeepAliveRequests 100 <IfModule mpm_worker_module> StartServers 12 MinSpareThreads 25 MaxSpareThreads 96 ThreadLimit 96 ThreadsPerChild 25 MaxClients 225 MaxRequestsPerChild 0 </IfModule> The site is only getting ~10,000 page views during the 8am-9am hour, which I don't think should be stressing the server too badly. Maybe it is an error with the PHP settings, or bandwidth per unit time, or the site has outgrown the server? Any suggestions would be very helpful - as you can see I've given it a good go before looking for help (installed mpm-worker). Also, can anyone suggest some free load testing software, or a tutorial on mod_status? Thank you
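
    On the load-testing part of the question: ApacheBench (ab) ships with Apache itself. As a rough illustration of what such a tool does, here is a minimal Python sketch that fires concurrent GET requests and reports timings; the URL, worker count and request count are placeholders, not values taken from the setup above:

        import time
        from concurrent.futures import ThreadPoolExecutor
        from urllib.request import urlopen

        URL = "http://example.com/"          # placeholder: page to test
        WORKERS = 25                         # concurrent clients, roughly one ThreadsPerChild's worth
        REQUESTS = 500

        def fetch(_):
            start = time.time()
            with urlopen(URL, timeout=30) as resp:
                resp.read()                  # pull the whole body, like a real visitor would
            return time.time() - start

        with ThreadPoolExecutor(max_workers=WORKERS) as pool:
            timings = list(pool.map(fetch, range(REQUESTS)))

        print(f"avg {sum(timings) / len(timings):.3f}s   max {max(timings):.3f}s")

    Watching mod_status while a test like this runs shows how many worker threads are actually busy.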

  • Best Solution for Load Balancing geographically distributed NFS File Access?

    - by DairyKnight
    I'm trying to find an optimum solution for accessing the NFS file share in my company. We have a central file server in North America which has 30GB~50GB of updated data every day, and it's very slow for our Europe and Asia branches to access directly. Therefore, I'm trying to set up two replica servers in those continents. I'm currently using rsync, but wonder if there exists a better solution that acts more like a distributed RAID, which allows users to transparently access files whether synced or not, and dispatches the request to the remote server if a file is not yet synced. I'm now looking into DRBD, but it seems not to have the functionality of auto-dispatching requests. Does anyone know if there's a better solution?

  • Advantage of using nexenta vs. OpenSolaris

    - by jotango
    I am currently building a NAS for about 24 TB of storage: video files, slow access, long-term storage. No performance issues. I am currently undecided between buying a JBOD case and installing OpenSolaris (because of ZFS), or purchasing a Nexenta license. The difference is about $12,500 for licenses over three years. What would you see as the main advantage in purchasing a Nexenta license, besides the support? Did Nexenta really enhance the basic OpenSolaris, or is it just a lot of marketing speak? No one really wanted to answer that question.

  • Problems with show/hide jQuery

    - by Michael
    I am hoping someone can help... This should be easy but I am lost.

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
        <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
        <title>Show Hide Sample</title>
        <script src="js/jquery.js" type="text/javascript"></script>
        <script type="text/javascript">
        $(document).ready(function(){
            $('#content1').hide();
            $('a').click(function(){
                $('#content1').show('slow');
            });
            $('a#close').click(function(){
                $('#content1').hide('slow');
            })
        });
        </script>
        <style>
        body{font-size:12px; font-family:"Trebuchet MS"; background: #CCF}
        #content1{
            border:1px solid #DDDDDD;
            padding:10px;
            margin-top:5px;
            width:300px;
        }
        </style>
        </head>
        <body>
        <a href="#" id="click">Test 1</a>
        <div class="box">
            <div id="content1">
                <h1 align="center">Hide 1</h1>
                <P> Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Nullam pulvinar, enim ac hendrerit mattis, lorem mauris vestibulum tellus, nec porttitor diam nunc tempor dui. Aenean orci. Sed tempor diam eget tortor. Maecenas quis lorem. Nullam semper. Fusce adipiscing tellus non enim volutpat malesuada. Cras urna. Vivamus massa metus, tempus et, fermentum et, aliquet accumsan, lectus. Maecenas iaculis elit eget ipsum cursus lacinia. Mauris pulvinar.</p>
                <p><a href="#" id="close">Close</a></p>
            </div>
        </div>
        <a href="#" id="click">Test 2</a>
        <div class="box">
            <div id="content1">
                <h1 align="center">Hide 2</h1>
                <p> Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Nullam pulvinar, enim ac hendrerit mattis, lorem mauris vestibulum tellus, nec porttitor diam nunc tempor dui. Aenean orci. Sed tempor diam eget tortor. Maecenas quis lorem. Nullam semper. Fusce adipiscing tellus non enim volutpat malesuada. Cras urna. Vivamus massa metus, tempus et, fermentum et, aliquet accumsan, lectus. Maecenas iaculis elit eget ipsum cursus lacinia. Mauris pulvinar.</p>
                <p><a href="#" id="close">Close</a></p>
            </div>
        </div>
        </body>
        </html>

  • secure synchronization of large amount of data

    - by goncalopp
    I need to automatically mirror a large amount (terabytes) of files between two Unix machines over a slow link (1 Mbps). This needs to be done frequently, but the data doesn't change too much (delta transmission doesn't saturate the link). The usual solution would be rsync, but there's an additional requirement: it's undesirable, from a security standpoint, for either the source or destination machine to have (keyless) ssh keys to the other, or any kind of filesystem access. All communication between the two machines should thus be initiated (and mediated) through a third machine. I've asked a separate question about rsync in particular here. Are there other obvious solutions I'm missing?
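
    As a sketch of the "third machine mediates" idea: the relay host, which is the only box allowed to hold SSH keys for both ends, pulls from the source into a local staging copy and then pushes on to the destination, so source and destination never authenticate to each other. The host names and paths below are placeholders, not the asker's setup:

        import subprocess

        SOURCE  = "source.example.com:/export/data/"    # placeholder source host:path
        STAGING = "/var/staging/data/"                  # local copy held on the relay machine
        DEST    = "dest.example.com:/export/data/"      # placeholder destination host:path

        def run(cmd):
            print(" ".join(cmd))
            subprocess.run(cmd, check=True)             # stop the mirror run on any rsync error

        # Pull deltas from the source into staging, then push the same deltas onward.
        run(["rsync", "-az", "--delete", SOURCE, STAGING])
        run(["rsync", "-az", "--delete", STAGING, DEST])

    The trade-off is that every change is transferred twice (source to relay, then relay to destination).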

  • Why does storage's performance change at various queue depths?

    - by Mxx
    I'm in the market for a storage upgrade for our servers. I'm looking at benchmarks of various PCIe SSD devices, and in the comparisons I see that IOPS change at various queue depths. How can that be, and why is that happening? The way I understand things is: I have a device with a (theoretical) maximum of 100k IOPS. If my workload consistently produces 100,001 IOPS, I'll have a queue depth of 1, am I correct? However, from what I see in benchmarks, some devices run slower at lower queue depths, then speed up at a depth of 4-64, and then slow down again at even larger depths. Isn't queue depth a property of the OS (or perhaps the storage controller), so why would it affect IOPS?
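
    For intuition on why benchmarks sweep queue depth at all: sustained IOPS, outstanding I/Os and per-I/O latency are tied together by Little's Law (IOPS ≈ queue depth / latency), so a device can only reach its headline number if enough requests are outstanding at once. A tiny worked sketch with made-up latencies, not benchmark data:

        def iops(queue_depth, latency_seconds):
            # Little's Law in steady state: throughput = concurrency / latency
            return queue_depth / latency_seconds

        print(iops(1, 100e-6))    # one outstanding I/O at 100 µs each  -> 10,000 IOPS
        print(iops(32, 200e-6))   # 32 outstanding, latency doubles     -> 160,000 IOPS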

  • Windows XP-Physical memory dumping

    - by Raghav Bali
    I have Windows XP Professional installed on my desktop. It shows the following error: a "physical memory dumping" blue screen. This ain't a new problem; I have been facing it ever since I bought this system. Initially the maintenance guy said it was a faulty hard drive, and I have got it replaced 3 times already in the past year. The system gets utterly slow after around 2-3 months of usage, then these errors crop up and I have to reinstall my Windows to keep them away. But this time it's been only a week and the blue screen has come up 3 times. What can be the actual cause of the error? Mine is an assembled machine: a Core 2 Duo with a Gigabyte motherboard, 1 GB of RAM, and a 160 GB Seagate HDD. Please help me; it's a seriously annoying problem. Edit: A new error recently popped up - what should I do now?

  • Separating user resources - Windows Server 2008 (Terminal Server)

    - by Christopher Wilson
    At the moment I am running a Windows Terminal Server 2008 for around 10 clients that use the server to run programs and access data. Is there any way to separate the resources of each user so that they do not impact each other? For example - User 1: opens a program; User 2: notices a slowdown. I have looked into using Windows System Resource Manager but do not know if it provides what I need, or whether there are any other 3rd-party tools that provide this functionality. Any answer is appreciated. Server specs: HP ProLiant ML110 G7; Processor: Intel® Xeon® E3-1220 (4 core, 3.1 GHz, 8MB, 80W, 1333/t); RAM: 12GB DDR3 ECC; 1TB HDD

  • DriverPack Solution adds a whole bunch of programs

    - by Markasoftware
    I am using DriverPack Solution 12.3 to update my drivers. I used it and installed 55 drivers. I noticed that there are now several dozen new entries in Add/Remove Programs under the names Windows Driver Package - Intel hdc, Windows Driver Package - Intel System, and Windows Driver Package - Intel USB, with the DriverPack Solution logo next to them. When I hit uninstall for any of these, it says all devices using this driver will be removed, and I am scared of uninstalling anything I do not know about. I have no system restore points from before I updated my drivers. Will these extra driver packages slow down my computer or take up extra disk space? Can I uninstall them safely? I am using Windows 8, so refreshing the computer is an option.

  • Can someone recommend a Compact Flash card to be used as a boot disk/fixed disk?

    - by Hamish Downer
    I have an early Acer Aspire One netbook, and the flash drive is really slow at writing. I've taken it apart to add more RAM, but I've pretty much stopped using it. I've read about people replacing the SSD with a Compact Flash card and a CF-to-ZIF adapter, but I've also read about some Compact Flash cards where the manufacturer has permanently disabled the boot flag to stop people doing this kind of mod (can't find the link any more though). I have also just found some info about CF cards that says "Most CompactFlash cards by default identify themselves as removable media instead of fixed disk", and that this is an issue for Windows. So my most specific question is: can someone recommend a Compact Flash card that does allow the boot flag to be set and can be set as a fixed disk? Please say whether you've done it yourself, or just heard about it from someone else. Beyond that, is this generally a problem?

  • How to speed up rsync?

    - by Jakobud
    I'm running rsync to sync a directory onto my external USB HDD. It's about 150 gigs of data, 50,000+ files I would guess. It's running its first sync at the moment, but it's copying files at a rate of only 1-5 MB/s. That seems incredibly slow for a USB 2.0 enclosure. There are no other transfers happening on the drive either. Here are the options I used: rsync -avz --progress /mysourcefolder /mytargetfolder I'm running Ubuntu Server 9.10.

  • Performance effects of compressing Program Files on Windows / NTFS

    - by SRobertJames
    What are the performance effects of compressing Program Files on Windows NTFS? On a fast, multicore machine, the overhead of decompression is minimal. Machines are generally disk-bound, and if you can reduce the disk load by compression, you often speed things up. (Microsoft says that the built-in compression of Windows Search indexes actually improves speed for this reason.) On the other hand, Windows' virtual memory is complicated. Perhaps if files are compressed, they can't be paged out simply. And there may be other issues. In short: on a fast, multicore machine with a relatively slow disk, what performance effects will compressing Program Files have?

  • Best practice for system clock sync on KVM host

    - by Tauren
    I have an Ubuntu 9.10 server running as a KVM host with ntpd installed on it. The host system has the correct system time. At the moment I only have a single KVM guest, also Ubuntu 9.10 server. I do not have ntpd installed on it, and I just discovered the clock is about 6 minutes slow. It wasn't that way when it was installed about a month ago. I thought that I only needed to keep the host clock synchronized and that the guests used the host clock, but maybe that is a memory from using OpenVZ; I believe the reasoning was that only the host could modify the physical system clock. Is running ntpd on both the host and all the guests the correct thing to do? Or is there something else that is preferred? How should I keep the guest clocks in sync?

  • How to convert a series of MP3s to M4B in a batch

    - by Artem Tikhomirov
    Hello. I have a batch of MP3-based audiobooks. Some of them are divided into files according to the book's own structure (chapters and so on); some of them were just divided into equal-length parts. So: I've bought an iPhone, and I want to convert them all to the M4B format. How could I convert them in a batch? I mean, how could I set up a process once for each book, and then, after a couple of weeks, receive a totally converted library? The only capable program for such conversion I've found was Audiobook Builder for the Mac, but it is pretty slow and does not support batching at all. Solutions for any platform, please.
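
    Not what the asker used, but as a sketch of how a batch pipeline could look if ffmpeg is available: walk a library folder, concatenate each book's MP3s with ffmpeg's concat demuxer, and encode to AAC in an .m4b container. The folder layout, bitrate and naming below are assumptions:

        import subprocess
        from pathlib import Path

        LIBRARY = Path("~/audiobooks_mp3").expanduser()   # assumed layout: one sub-folder per book
        OUTPUT  = Path("~/audiobooks_m4b").expanduser()

        OUTPUT.mkdir(exist_ok=True)
        for book_dir in sorted(p for p in LIBRARY.iterdir() if p.is_dir()):
            mp3s = sorted(book_dir.glob("*.mp3"))
            if not mp3s:
                continue
            # ffmpeg's concat demuxer wants a small text file listing the inputs in order.
            list_file = book_dir / "files.txt"
            list_file.write_text("".join(f"file '{m.as_posix()}'\n" for m in mp3s))
            target = OUTPUT / f"{book_dir.name}.m4b"
            subprocess.run([
                "ffmpeg", "-f", "concat", "-safe", "0", "-i", str(list_file),
                "-c:a", "aac", "-b:a", "64k",             # re-encode to AAC at an audiobook-ish bitrate
                str(target),
            ], check=True)
            print("converted", target)

    If a particular ffmpeg build does not map the .m4b extension to the MP4/iPod muxer, adding -f ipod makes the container explicit.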

  • Automatic deployment of VNC server to remote terminals (PC's) via Remote Desktop

    - by BradyKelly
    We have several remote, unmanned terminals where I require a VNC server, as using Remote Desktop prevents others using the terminals. Often the connection to one of these is extremely slow, and manually using Remote Desktop to perform the VNC installation is painstaking. What I would like to do is build a package that I could copy onto the remote terminal using Remote Desktop, and then have the package executed to install and configure VNC when the terminal restarts, as they all automatically restart nightly. The terminals are all running Windows XP. Also, out of the many VNC variants out there, which would suit this application?

  • How do I enable greedy MigrationHeuristic in Karmic?

    - by Matthew
    My laptop performs very poorly for 2d graphics. For example, Docky's zoom effect and compiz effects are very choppy. In Jaunty, I was able to fix this by adding the following line under the "Device" section in my xorg.conf: Option "MigrationHeuristic" "greedy" In Karmic, there is no xorg.conf by default, so I copied my old one (from Jaunty). However, everything is still slow. Here is my xorg.conf: Section "Device" Identifier "Configured Video Device" Option "MigrationHeuristic" "greedy" EndSection Section "Monitor" Identifier "Configured Monitor" EndSection Section "Screen" Identifier "Default Screen" Monitor "Configured Monitor" Device "Configured Video Device" EndSection From googling about, it sounds like "MigrationHeuristic" is only an option for the "EXA" mode, while Karmic has switched the intel driver to "UXA". So I tried adding this line under the "Device" section: Option "AccelMethod" "exa" But this didn't help.

  • Linux (Kubuntu 9.10): Strange DNS problem [seems to be IPv6 issue]

    - by Homer J. Simpson
    Hi, I'm experiencing strange problems with my Kubuntu 9.10 when doing DNS requests from various applications. The requests are extremely slow, so loading any pages in Firefox or Konqueror, doing package installations in Kpackagemanager, and using other apps is really painful, while for example Opera doesn't have any problems, and ping is normally fast as well, including its DNS lookups. I checked the proxy settings of both the affected applications and the general system and there are none, so to me it doesn't seem as if there were something in between. Does anybody have an idea of what to check for possible problem sources, or how to solve this? I'm behind a DSL home router which does the DHCP (and works well with my other computer). Any kind of advice would be really helpful. Edit: It seems to be some kind of IPv6 problem, as I could get it to work by disabling IPv6 explicitly in Firefox. Is there a general solution to this?

  • Replace Whole Site by FTP

    - by Sam Machin
    Hi there, I've got a set of tools which periodically (about once a day) generate a complete set of static HTML pages for a site, with the associated folder structure etc. I then need to put those files onto the production server. My problem is that the server runs IIS (6, I think) and I only have regular FTP access. I need a way to automate the process of publishing the new site, and it needs to be a total replacement of the files each time it's published, e.g. delete the whole folder & contents, then put the new ones up. My source server is an Ubuntu machine and I've got total control at that end. I have tried using CurlFtpFS but it seems to be too slow for what I'm trying to do and locks up. Rgds, Sam
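
    A hedged sketch of how the "wipe and re-upload over plain FTP" step could be scripted from the Ubuntu box with Python's standard ftplib; the host, credentials, the /site remote folder and the local output directory are placeholders:

        import os
        from ftplib import FTP, error_perm

        def remove_remote_tree(ftp, path):
            """Recursively delete everything under a remote directory."""
            for name in ftp.nlst(path):
                if name.rsplit("/", 1)[-1] in (".", ".."):
                    continue
                full = name if name.startswith(path) else f"{path}/{name}"
                try:
                    ftp.delete(full)                 # succeeds if it is a plain file
                except error_perm:
                    remove_remote_tree(ftp, full)    # otherwise treat it as a directory
                    ftp.rmd(full)

        def upload_tree(ftp, local_dir, remote_dir):
            """Recreate local_dir under remote_dir and upload every file."""
            for root, _dirs, files in os.walk(local_dir):
                rel = os.path.relpath(root, local_dir)
                remote = remote_dir if rel == "." else f"{remote_dir}/{rel.replace(os.sep, '/')}"
                try:
                    ftp.mkd(remote)
                except error_perm:
                    pass                             # directory already exists
                for fname in files:
                    with open(os.path.join(root, fname), "rb") as fh:
                        ftp.storbinary(f"STOR {remote}/{fname}", fh)

        if __name__ == "__main__":
            ftp = FTP("ftp.example.com")             # placeholder host
            ftp.login("username", "password")        # placeholder credentials
            remove_remote_tree(ftp, "/site")         # clear out the previous generation
            upload_tree(ftp, "output/html", "/site") # then push the freshly generated pages
            ftp.quit()

    Run from cron right after the page generator finishes, this gives the delete-then-replace cycle using nothing but standard FTP commands.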

  • Video acceleration problem with Windows 7 games and PPTX files

    - by Jordan 1GT
    I have a Dell XPS M1330 which originally ran Vista, but I upgraded to Windows 7. When I try to run a Windows 7 game like Spider Solitaire, I receive the following message: "The game is running in software rendering mode. Hardware acceleration is either disabled or not supported by your video card driver, which could slow down game performance. Make sure you have the latest video card driver installed and that hardware acceleration is turned on." I confirmed that hardware acceleration is turned on. When I go to Dell's site, I'm told there is no later video driver. When I run the game it is very choppy. I have a .pptx file which is doing strange things in normal view, and I suspect it may be related to the same video acceleration problem.

  • How to speed up adding a column to a large table in SQL Server

    - by Chris
    I want to add a column to a SQL Server table with about 10M rows. I think this query would eventually finish adding the column I want: alter table T add mycol bit not null default 0 but it's been going for several hours already. Is there any shortcut to get a "not null default 0" column inserted into a large table? Or is this inherently really slow? This is SQL Server 2000. Later on I have to do something similar on SQL Server 2008.

  • Backup Exec 12.5 or 2010? [closed]

    - by Chris Thorpe
    Backup Exec 2010 has just dropped, and I'm about to implement a new BEWS infrastructure, complete with CALs and new central servers. When I specced this up last year, I ignored 2010 and focused on Backup Exec 12.5, since it's a mature product. In previous experience, major releases of BE had numerous technical issues and seemed to improve significantly at the first service pack. However, our refresh cycle on the backup infrastructure is slow, the main driver usually being lack of support for some new server type (in this case, ESX has driven our current upgrade need). With this in mind, I'm wondering if Backup Exec 2010 should be my first choice, as it'll last longer under current support than 12.5, which will approach EOL soon. Has anyone got any perspective they could add to this? Right now, I'm leaning towards biting the bullet and going with 2010.

  • Windows Server 2008 R2 slows internet speed

    - by Tone
    I just installed Windows Server 2008 R2 as my main file server on my home network. I've noticed that often when I start my day my internet connection speed is slow. I'll go to the Speakeasy speed test and it'll be at about 25% of its normal speed. When I restart my Server 2008 machine it goes back up to normal. It will stay normal until Server 2008 has been running for a while. Any ideas? Edit: I had installed CollabNet Subversion within the past week, which installs/sets up some other stuff for web access; I've just uninstalled it. I'll report back tomorrow on whether that fixed my problem.

  • Building an optimal custom machine for SQL Server

    - by Chad Grant
    Getting the hardware in the mail any day. Hardware related to my question: 10x 15.5k RPM SAS Seagate Cheetahs; 2x Adaptec 5405 PCIe RAID cards; the motherboard has integrated SAS RAID. I was thinking I would build two RAID 10 arrays, one for data and one for logs, with the remaining 2 drives in a RAID 0 for TempDB. Will probably throw in a drive for the OS. Does putting the SQL Server application/exes on a RAID array make a difference, and is there any impact from leaving the OS on a relatively slow disk compared to the RAID arrays? I have 5-6 DBs, combined < 50 gigs, with a relatively good/constant load. Estimating 60-70% reads vs writes. Planning on using log shipping as well, if that matters. Any advice or suggestions?
