Search Results

Search found 6231 results on 250 pages for 'slow diver'.

  • Where should executables be run from? Network share or client?

    - by user74150
    We have an executable that is used by 50+ client machines on a network and upgraded regularly. Is it acceptable to put the executable on a network share and have the client machines run it from there via a shortcut on their desktop? That way when we upgrade the .exe we can simply replace the one file with a new one and all clients will be accessing the new one. How will a slow or unstable network handle this? If this is not acceptable, what would be the best way to keep all clients updated with the latest .exe?
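
    One common alternative is a tiny launcher on each desktop that copies the shared .exe to the local disk and runs it from there, so a network drop can delay an update but never kill a running instance. A hypothetical batch sketch, assuming Vista-or-later clients (where %LOCALAPPDATA% is defined) and made-up share and file names:

        @echo off
        rem Hypothetical paths: substitute your own share and application name.
        set SRC=\\fileserver\apps\ourapp.exe
        rem /d copies only when the source copy is newer; /y suppresses prompts.
        xcopy /d /y "%SRC%" "%LOCALAPPDATA%\" >nul
        start "" "%LOCALAPPDATA%\ourapp.exe"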

  • How to run sed on over 10 million files in a directory?

    - by Sandro
    I have a directory that has 10144911 files in it. So far I've tried the following:

        for f in `ls`; do sed -i -e 's/blah/blee/g' $f; done

    Crashed my shell (the ls is in backticks, but I couldn't figure out how to get them to display).

        ls | xargs -0 sed -i -e 's/blah/blee/g'

    Too many args for sed.

        find . -name "*.txt" -exec sed -i -e 's/blah/blee/g' {} \;

    Couldn't fork any more: no more memory. Any other ideas on how to build this kind of command? The files don't need to communicate with each other. ls | wc -l seems to work (very slowly), so it must be possible.
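
    A sketch of the usual fix, assuming GNU findutils and sed: plain ls never NUL-terminates its output (which is why ls | xargs -0 handed sed one giant argument list), but find can, and xargs -0 then batches the names safely:

        # NUL-terminated names survive any filename; xargs splits them into
        # batches under the argument-length limit, one sed process per batch.
        find . -maxdepth 1 -type f -print0 | xargs -0 sed -i -e 's/blah/blee/g'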

  • Large File Uploads? SWFUpload?

    - by Ethabelle
    So, we offer video services and have run into an issue with people uploading large files. I realized that our developer was using plain PHP HTTP uploads to handle this, and that was causing the slow times and breakdowns. Now they keep coming at me wanting to use SWFUpload, pointing out that YouTube uses it, but I'm adamantly against it because, well, Flash. However, I don't really know of a better solution that works across all browsers. So I was wondering: is SWFUpload, which hasn't been updated in a year, really a viable solution?

  • Is there any way to make Mac OS X Spotlight only index the file names and not the contents?

    - by aalaap
    I do understand that the point of Spotlight is to look inside files, but it also returns file-name matches, and that's what I need most of the time. Besides, Spotlight is running so absurdly slowly on my system (Snow Leopard on an '08 iMac) that it's just unusable. I downloaded Canary, and Spotlight wasn't able to find the app file for 15 minutes; it was already in the Downloads stack, but as far as Spotlight goes, the file didn't exist. Hence, I would like to know of a way to make Spotlight index only the file names, which would perhaps make it a bit faster. I'm looking to mimic the behaviour of Windows applications such as AvaFind or Search Everything. Edit: let me highlight the fact that I am looking for an AvaFind or Search Everything replacement for Mac OS X. Try one of them on a Windows machine and you'll understand my disappointment with Spotlight and the other popular search tools on OS X.
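
    For name-only queries from a shell, mdfind can already search by name without waiting on content matches; a quick sketch using the poster's example app:

        # Match on file name only; -onlyin restricts the search to one folder.
        mdfind -name "Canary"
        mdfind -onlyin ~/Downloads -name "Canary"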

  • How can I speed up my Windows Server 2008 VPN Connection?

    - by Pure.Krome
    So I've installed the VPN service that comes with Windows Server 2008. It works perfectly, etc. When a client remote-desktops to one of the private servers at the office via VPN, it's pretty slow. Now, how long is a piece of string? So before I get all the obligatory checks, I'll list the things from Mr. Obvious:

        - Our modem/router (fritz!box) has a graph of incoming and outbound bandwidth. Both directions are barely being used while a client is RDP'd in via VPN.
        - Our office internet connection runs at 21.9 Mbit/s download, 1.3 Mbit/s upload.

    I feel like it's maxing out at... modem speeds?? Are there any tricks I can use to confirm this and possibly even fix it?
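
    One way to confirm where the ceiling is, assuming iperf can be installed at both ends: measure raw throughput through the tunnel and compare it with the same test run against the server's public address:

        # On a server inside the office:
        iperf -s

        # On the remote client; 10.0.0.1 is a placeholder for the server's
        # VPN-side address. Run again against the public IP and compare.
        iperf -c 10.0.0.1 -t 30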

  • Maximum number of files in one ext3 directory while still getting acceptable performance?

    - by knorv
    I have an application writing to an ext3 directory which over time has grown to roughly three million files. Needless to say, reading the file listing of this directory is unbearably slow. I don't blame ext3; the proper solution would have been to let the application code write to sub-directories such as ./a/b/c/abc.ext rather than using only ./abc.ext. I'm changing to such a sub-directory structure, and my question is simply: roughly how many files should I expect to store in one ext3 directory while still getting acceptable performance? What's your experience? Or, in other words: assuming that I need to store three million files in the structure, how many levels deep should the ./a/b/c/abc.ext structure be? Obviously this is a question that cannot be answered exactly, but I'm looking for a ballpark estimate.
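
    As a worked example of the fan-out arithmetic: two levels of two hex characters each gives 16^4 = 65,536 leaf directories, so three million files average about 46 per directory. A bash sketch of that bucketing, assuming GNU coreutils:

        #!/bin/bash
        # Place each file under ./xx/yy/ using the first four hex characters
        # of an MD5 hash of its name.
        name="abc.ext"
        h=$(printf '%s' "$name" | md5sum | cut -c1-4)
        mkdir -p "${h:0:2}/${h:2:2}"
        mv "$name" "${h:0:2}/${h:2:2}/$name"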

  • Upgrading my home network to Gigabit Ethernet and Wireless-N turns out slower than before

    - by Raheel Khan
    My home network has three desktops, three laptops and some NAS drives. All the desktops and NAS drives support Gigabit LAN and all the laptops support Wireless-N, but I was running a 100BaseT switch. I recently purchased a Gigabit Ethernet switch and a Wireless-N ADSL modem-router. After upgrading, I noticed that wireless file-transfer speeds from laptop to NAS and vice versa became terribly slow, possibly even slower than before the upgrade. Transfer speeds from desktop to NAS (wired) have improved, though. As an example, copying a 50GB file from laptop to NAS was estimated at 15 hours! Is there something I can do to improve this? Also, should I consider buying a dedicated wireless access point for speed, rather than using the wireless modem-router?

  • Send documents to printer without waiting for Vista to handle queue

    - by Greenleader
    We have a print server attached to our old printer, but Vista keeps its own print queue as well, which presents a problem. I want to bypass the Vista queue and send everything straight to the print server, so that it, and not Vista, manages the queue. The problem appears when a second document is printed from the same computer after the first one: Vista is still waiting for word that the first job has finished even five minutes after it REALLY finished. How do I send jobs straight to the print server and stop Vista from slowing things down by trying to manage the queue itself?

  • How to test web application performance from other continent?

    - by Thomas Einwaller
    We are hosting our web application http://timr.com on a server located in Germany. The server handles a high traffic load very well, and everything works as desired in terms of performance and load times. However, we sometimes get complaints from our overseas users (US, South America) that they experience slow page-load times. What would be the best way to test the performance of a web application "as if you were on another continent"? I want to make sure that the distance between the server and the user is not a problem.
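
    A cheap first measurement, assuming shell access to a machine (or a rented VM) in the target region: curl's timing variables split a request into DNS, connect and first-byte phases, which makes latency-bound slowness obvious:

        # Fetch once, discard the body, print the phase timings.
        curl -o /dev/null -s -w 'dns %{time_namelookup}s  connect %{time_connect}s  first-byte %{time_starttransfer}s  total %{time_total}s\n' http://timr.com/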

  • Which is better for running Ubuntu and other Linux OSes, Chromebook or Windows, why? [on hold]

    - by Serge
    I'm learning programming and would like to switch to a Linux OS, perhaps Ubuntu, to continue with it. My current machine is getting pretty old and slow, Windows is my least favorite option for this kind of work, and I can manage to get something new right around the price range of the nicest Chromebook on the market right now. However, I have compared the specs of the HP Chromebook 14 with those of regular PC laptops that cost roughly the same, and the latter consistently have approximately the same, and sometimes higher, specs (like processor speed). Yet using Chromebooks for this purpose is pretty widespread nowadays. Is this because they were initially built for a Linux OS (and is that really THAT crucial?), or are there other major factors at play here?

  • Hyper-V core NIC speeds and registry changes

    - by gary
    Good afternoon. On a Dell PE T610 I have Hyper-V core running, with 2 x Broadcom BCM5709C NetXtreme II GigE NICs installed. I have noticed that copying large files (17GB, for example) from a physical server on the network to the Hyper-V host's local drive [not a VM guest] is very slow in comparison to copying between physical servers:

        Copying a 17GB file, physical to Hyper-V host: 30 minutes
        Copying a 17GB file, physical to physical: 15 minutes

    Can someone tell me exactly which registry nodes I should disable on the Hyper-V NICs to improve performance. So far I have gone to

        HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E972-E325-11CE-BFC1-08002BE10318}

    and set the following to 0 on both physical NICs:

        *LSOv1IPv4
        *LSOv2IPv6
        *TCPUDPChecksumOffloadIPv4
        *TCPUDPChecksumOffloadIPv6

    Should I also disable *TCPConnectionOffloadIPv4 and *TCPConnectionOffloadIPv6? Many thanks in advance
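
    A related note: the *TCPConnectionOffload* values govern TCP Chimney Offload, which on Server 2008 can also be toggled globally rather than per NIC; a sketch, run from an elevated prompt:

        rem Disable TCP Chimney Offload globally, then confirm the change.
        netsh int tcp set global chimney=disabled
        netsh int tcp show global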

  • Password Manager that can sync a Blackberry and Mac OS X.

    - by pdhoven
    I use a BlackBerry Bold and a MacBook Pro, and I am looking for a way to keep a password manager synchronized between the two devices. All the commercial ones I have found won't work between a BlackBerry and a Mac. The closest I came was KeePass: I like the application on the BlackBerry, but I could not get the sync to the Mac working reliably, and I had to run the PC application under Mono on the Mac, which was pretty slow. I am happy to pay for a good solution.

  • Ignoring GET parameters in Varnish VCL

    - by JamesHarrison
    Okay: I've got a site set up which has some APIs we expose to developers, in the format

        /api/item.xml?type_ids=34,35,37&region_ids=1000002,1000003&key=SOMERANDOMALPHANUM

    In this URI, type_ids is always set; region_ids and key are optional. The important thing to note is that the key variable does not affect the content of the response. It is used for internal tracking of requests so we can identify people who make slow or otherwise unwanted requests. In Varnish, we have a VCL like this:

        if (req.http.host ~ "the-site-in-question.com") {
            if (req.url ~ "^/api/.+\.xml") {
                unset req.http.cookie;
            }
        }

    We just strip cookies out and let the backend do the rest as far as caching times are concerned (this is a hack-around, since Rails/Authlogic sends session cookies with API responses). At present, though, distinct developers are effectively hitting different caches, since &key=SOMEALPHANUM is considered part of the Varnish hash for storage. This is obviously not a great solution, and I'm trying to work out how to tell Varnish to ignore that part of the URI.
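
    Since the hash is built from req.url, one common approach is to strip the parameter in vcl_recv before hashing; an untested sketch:

        sub vcl_recv {
            if (req.url ~ "^/api/.+\.xml") {
                # Drop "key=<value>" plus one adjoining separator, keeping the
                # separator that introduced it for any following parameter.
                set req.url = regsub(req.url, "([?&])key=[^&]*&?", "\1");
                # Trim the separator left dangling when key was the last parameter.
                set req.url = regsub(req.url, "[?&]$", "");
            }
        }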

  • Are Motherboards for the Acer Aspire One AOA150 Netbook Compatible with the AOA110?

    - by Mindstormscreator
    I have an Acer Aspire One ZG5 AOA110-1588 netbook, and the motherboard doesn't have a port for a SATA 2.5 inch hard drive; it only supports this slow 8GB SSD type drive. Through research I've discovered that the AOA150 motherboards do have a SATA slot, and the bottom plate of these laptops have an appropriate protrusion for the drive to fit in (for example, compare this to this). The AOA110 and AOA150 models are very similar in appearance and specs. I've even seen tutorials that involve soldering a SATA connector onto the AOA110's motherboard, essentially creating an AOA150 motherboard (right?) So, could I just swap out the motherboard in my netbook with the MBS0506001? (I'd post another link to the actual board but can't because of the spam prevention...) I assume I would also need to purchase and replace the bottom cover with a larger one and possibly get a hard drive caddy as well...? Thanks!

  • Secure external connection to SQL Server (from third party software)

    - by Bart
    I have a SQL Server 2008 R2 Express instance running on a server in an internal LAN. A few databases are used by some third-party software to store data, and the application connects to the database with a SQL Server login. Now I need to access this database from an external PC, using a local installation of the same software. In this particular case a VPN connection is not the solution I am looking for. I do have access to an external Linux server, so I tried SSH tunneling from the Windows server to the Linux server, and then tunneling back from the Linux server to the external client, but this works very, very slowly. What are my other options for allowing this external connection in a safe way?
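
    For reference, the relay pair described usually looks like the sketch below, with placeholder hostnames and relay port; adding -C turns on compression, which sometimes helps on slow links:

        # Step 1, from the internal Windows server (e.g. with PuTTY's plink):
        # publish the local SQL port on the Linux box as port 11433.
        ssh -N -R 11433:localhost:1433 user@linux-server.example.com

        # Step 2, from the external PC: pull that port back down locally.
        ssh -C -N -L 1433:localhost:11433 user@linux-server.example.com

    The client then connects to localhost,1433. Note that SQL Express listens on a dynamic TCP port by default (and TCP/IP may be disabled entirely), so check the instance's settings in SQL Server Configuration Manager first.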

  • Troubleshooting source of heavy resource-usage on a windows server 2008 running multiple sites

    - by batman_man
    Hi, I am running about 10 ASP.NET websites on a hosted virtual server. The server runs Windows Server 2008, and each website is backed by its own database running on SQL Server 2008 on the same box. Lately the box has seemed really slow. The only kind of investigation I could think of was looking in Task Manager, where I can see w3wp.exe and sqlservr.exe jumping to 40% CPU usage every 5-10 seconds. What steps can I take to determine which of my websites is consuming these resources, and which database is getting hit the most? I do have SSMS installed on the machine. As you can tell, my sysadmin skills are very, very limited, so any help would be much appreciated.
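
    On the SQL side, a starting point from SSMS is the plan-cache DMVs, which attribute cumulative CPU to individual queries; a sketch (needs VIEW SERVER STATE permission, and dbid comes back NULL for some ad-hoc statements):

        SELECT TOP 10
               DB_NAME(st.dbid)           AS database_name,
               qs.execution_count,
               qs.total_worker_time       AS total_cpu_microseconds,
               SUBSTRING(st.text, 1, 200) AS query_text
        FROM   sys.dm_exec_query_stats qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
        ORDER BY qs.total_worker_time DESC;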

  • In gmail can we make is:unread the default view?

    - by JK-
    In Gmail, can we make is:unread the default view? That is, when we load Gmail, can we configure it so that is:unread is already applied, showing only unread messages by default? Or can we at least configure Gmail to offer a one-click link that shows only unread mail? The reason, of course, is that it is slow and inefficient to type is:unread into the Gmail search box every time I open it. It would be much more convenient for Gmail to load this view by itself when I open it.
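
    One workaround, assuming the standard Gmail web interface: searches are addressable through the URL fragment, so a bookmark pointing at the search can serve as the one-click unread view:

        https://mail.google.com/mail/u/0/#search/is:unread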

  • Blocking internet poker applications

    - by Matthew Savage
    I 'look after' the wireless internet for a cafe where I live, and we've noticed a substantial slowdown in internet speeds when certain users are playing internet poker. I've put in filters to block any HTTP traffic referencing gambling, poker and so on, but I also want to be able to block the applications themselves (i.e. poker clients), which don't use HTTP. I've tried searching for a list of poker clients and the ports they use, but have had no real luck. Does anyone know what these might be?
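
    An alternative to chasing individual clients' ports is a default-deny policy on the gateway: allow only what a cafe guest needs and drop everything else. A sketch, assuming a Linux gateway whose clients' traffic crosses the FORWARD chain:

        # Allow replies to established connections, DNS, and web traffic ...
        iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT
        iptables -A FORWARD -p udp --dport 53  -j ACCEPT
        iptables -A FORWARD -p tcp --dport 53  -j ACCEPT
        iptables -A FORWARD -p tcp --dport 80  -j ACCEPT
        iptables -A FORWARD -p tcp --dport 443 -j ACCEPT
        # ... and drop the rest. Clients tunnelling over 443 still get through.
        iptables -P FORWARD DROP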

  • How much processor speed and cores do I need for these tasks?

    - by ajay
    I am planning to buy a new laptop, as I find my current one very slow. My question here is specifically about RAM size and CPU power. I will mostly be doing development (not many games), and I would be dabbling in distributed computing and in multithreaded, data-intensive parallelizable tasks on multiple cores: for example, concurrent programming in Scala/Java/Clojure, where I want to actually see the parallelization pay off. Furthermore, I want the RAM to be enough. From a developer-machine standpoint, do you think 4GB of RAM and a 2.53GHz dual-core processor would be enough? I'm basically looking at this model: http://store.apple.com/us/configure/MC118LL/A?mco=MTM3NDcyODk (link dead)

  • site timing out when under heavy load

    - by naunu
    My client sends out eblasts at 8am Monday/Wednesday/Friday. Between 8:15 and 8:45 the site becomes extremely slow and many users' sessions time out. My setup:

        - Mediatemple VE, 2GB dedicated RAM (3GB burst)
        - Ubuntu 9.10
        - apache2-mpm-worker
        - PHP 5.3 (FastCGI)
        - MySQL 5

    I recently tried to remedy the problem by switching from apache2-mpm-prefork to mpm-worker, but am still having the same issues. My Apache settings are:

        Timeout 100
        KeepAlive On
        MaxKeepAliveRequests 100
        <IfModule mpm_worker_module>
            StartServers         12
            MinSpareThreads      25
            MaxSpareThreads      96
            ThreadLimit          96
            ThreadsPerChild      25
            MaxClients          225
            MaxRequestsPerChild   0
        </IfModule>

    The site is only getting ~10,000 page views during the 8am-9am hour, which I don't think should stress the server too badly. Maybe it is an error in the PHP settings, or bandwidth per unit time, or the site has outgrown the server? Any suggestions would be very helpful; as you can see, I've given it a good go before asking for help (installed mpm-worker). Also, can anyone suggest some free load-testing software, or a tutorial on mod_status? Thank you
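
    On the last two questions: ApacheBench (ab) ships with Apache and is free, and on Ubuntu mod_status just needs enabling; a quick sketch (the URL is a placeholder):

        # 500 requests, 50 at a time; ab prints a latency distribution at the end.
        ab -n 500 -c 50 http://www.example.com/

        # Enable mod_status and reload, then browse to /server-status on the host.
        sudo a2enmod status
        sudo /etc/init.d/apache2 reload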

  • Connecting Windows XP to Windows 7 directly using cable

    - by TPR
    These are the problems I am encountering:

        - XP can access Windows 7, but not the other way around (which is fine, because I don't currently need it the other way).
        - File transfer is far too slow: around 0.031 MB/s, even though netperf and NetCPS report around 8-9 MB/s.

    I disabled the firewall on both computers. Both are in the same workgroup, and I left the homegroup on Windows 7. Windows 7 sees the connection as an unidentified network. The addresses are 10.1.1.2 (XP) and 10.1.1.1 (Windows 7), subnet mask 255.255.255.0, with default gateway and DNS left empty on both. Both computers are connected to the internet over wireless (the home network), and to each other by cable! If anybody has any pointers, do let me know. I have no problem making this setup work when both computers run Windows 7; this time one of them is XP, and that seems to be the problem.

  • Can someone recommend a Compact Flash card to be used as a boot disk

    - by Hamish Downer
    I have an early Acer Aspire One netbook, and its flash drive is really slow at writing. I've taken it apart to add more RAM, but I've pretty much stopped using it. I've read about people replacing the SSD with a Compact Flash card and a CF-to-ZIF adapter, but I've also read about some Compact Flash cards where the manufacturer has permanently disabled the boot flag to stop people doing this kind of mod (I can't find the link any more, though). So my most specific question is: can someone recommend a Compact Flash card that does allow the boot flag to be set? Please say whether you've done it yourself or just heard about it from someone else. Beyond that, is this generally a problem?

  • Making a Live Thumb drive boot with Persistent files, settings AND *drivers* that load on boot?

    - by Luke Stanley
    I have seen https://wiki.ubuntu.com/LiveUsbPendrivePersistent but it's a mess. What methods support persistent drivers, as well as files and settings, without wrecking the lifespan of the flash drive? I'd like to hear your personal recommendations on, say, Portable Linux, usb-creator, Remastersys + UNetbootin, etc. Backstory: I have an Inspiron 1525 whose hard drive has been slowly dying. I want to switch to a live USB/CD/DVD system until I can get it repaired, but my laptop's internal wifi device requires a network connection by some other means before Xubuntu will let it work, and then I have to enter my wifi key again, and THEN I have to reinstall Skype, etc. I'd be damned every time I have to shut the laptop down. I'm OK with writing a shell script to install apps and copy settings as required, but a good persistent install should make this old hat, and a script doesn't take care of drivers. The last time I tried making an ISO with Remastersys it didn't seem to copy all the required settings.
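
    For stock Ubuntu/Xubuntu live sticks, persistence is a casper-rw overlay: installed packages (and therefore installed drivers), settings and files all land in it. A minimal sketch, assuming the stick's FAT partition is mounted at /media/usb:

        # Create a 2GB overlay file named exactly casper-rw ...
        dd if=/dev/zero of=/media/usb/casper-rw bs=1M count=2048
        mkfs.ext3 -F /media/usb/casper-rw
        # ... then boot the stick with "persistent" on the kernel command line.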

  • Performance difference between MacBook Pro (2.8 GHz) vs Air (1.7 GHz)?

    - by jonathanconway
    I'm comparing these two Apple laptops.

    MacBook Pro (13", 2011 model):

        - 2.8GHz dual-core Intel Core i7 processor with 4MB shared L3 cache
        - 4GB (two 2GB SO-DIMMs) of 1333MHz DDR3 SDRAM
        - AMD Radeon HD 6770M graphics processor with 1GB of GDDR5 memory on the 2.4GHz configuration

    MacBook Air (13", 2011 model):

        - 1.7GHz dual-core Intel Core i5 with 3MB shared L3 cache
        - 4GB of 1333MHz DDR3 onboard memory
        - Intel HD Graphics 3000 processor with 384MB of DDR3 SDRAM shared with main memory

    There's definitely a gap between them in terms of CPU speed and graphics, but what practical difference would this make on a day-to-day basis? On the one hand, I love the sleek, thin appearance of the Air. On the other hand, I don't want a machine that's going to be dog-slow when doing tasks such as running virtual machines, dual-booting to Windows and running multiple instances of Visual Studio, and maybe some light gaming. Is there going to be a major difference that makes the MacBook Pro a more attractive purchase?

  • Best Solution for Load Balancing geographically distributed NFS File Access?

    - by DairyKnight
    I'm trying to find an optimal solution for access to the NFS file share in my company. We have a central file server in North America with 30-50GB of updated data every day, and it's very slow for our Europe and Asia branches to access directly. I'm therefore trying to set up two replica servers on those continents. I'm currently using rsync, but I wonder if a better solution exists that acts more like a distributed RAID: one that lets the user transparently access a file whether or not it has been synced yet, dispatching the request to the remote server when the file is not yet local. I'm now looking into DRBD, but it doesn't seem to have this auto-dispatching functionality. Does anyone know of a better solution?
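
    For reference, the rsync baseline being replaced might look like the sketch below (host and paths are placeholders); -z compresses on the wire and --partial lets interrupted transfers of large files resume instead of restarting:

        rsync -az --partial --delete /export/data/ replica-eu.example.com:/export/data/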
