Search Results

Search found 6231 results on 250 pages for 'slow diver'.

Page 144/250

  • For a particular domain, how can I cache its JSON responses locally?

    - by Chris
    I'm coding the frontend of a web app that uses XHR to grab JSON data from a 3rd party. The 3rd-party service is slow, and because of its API design I need to make a lot of API requests every time I refresh the page to test new code. It's making the development loop painful. The requests are GETs, POSTs and PUTs, even though I'm pretty sure none of them change state. I want to go to localhost for the JSON rather than to this 3rd-party API, simply to make my development process faster.
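
    One way to get that behaviour is a tiny local caching proxy that the frontend points at instead of the 3rd party. The sketch below (in Python; UPSTREAM is a hypothetical stand-in for the real API base URL) caches every response on disk keyed by method, path and body, which is only safe because these particular POSTs and PUTs don't change state:

        import hashlib, pathlib, urllib.request
        from http.server import BaseHTTPRequestHandler, HTTPServer

        UPSTREAM = "https://api.example.com"      # placeholder for the 3rd-party API
        CACHE = pathlib.Path("api-cache")
        CACHE.mkdir(exist_ok=True)

        class CachingProxy(BaseHTTPRequestHandler):
            def _handle(self):
                body = self.rfile.read(int(self.headers.get("Content-Length") or 0))
                # Cache key covers method + path + body, so distinct requests
                # never collide and repeat requests never hit the slow upstream.
                key = hashlib.sha1((self.command + " " + self.path + " ").encode() + body).hexdigest()
                entry = CACHE / key
                if not entry.exists():
                    req = urllib.request.Request(UPSTREAM + self.path,
                                                 data=body or None, method=self.command)
                    with urllib.request.urlopen(req) as resp:
                        entry.write_bytes(resp.read())
                payload = entry.read_bytes()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(payload)))
                self.end_headers()
                self.wfile.write(payload)

            do_GET = do_POST = do_PUT = _handle   # all three verbs share the cache

        HTTPServer(("localhost", 8080), CachingProxy).serve_forever()

    Pointing the XHR base URL at http://localhost:8080 then serves every repeated request from disk.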

    Read the article

  • Tunneling through SSH for 1521 port access?

    - by A T
    I am developing locally on my computer, using my own Apache server with PHP configured. My database, however, is on a remote Oracle 11g Database Server. We were also given a separate remote server for hosting our .html and .php files, but only FTP access has been provided there, and development is far too slow when every change has to be pushed over FTP. So I decided to develop locally but still use the remote DB server. Unfortunately that gives me an error. I'm not sure how, or where, to integrate tunnelling. Do I add something to the oci_connect HOST in my PHP file, or do I encapsulate my whole environment over SSH?
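
    A minimal sketch of the tunnelling step, assuming SSH access to the database host (user and host names below are placeholders): forward local port 1521 to the DB server, then let oci_connect talk to localhost as if the database were local.

        import subprocess

        # ssh -N opens no remote shell; -L forwards local port 1521 to
        # port 1521 on the DB host for as long as this process runs.
        tunnel = subprocess.Popen([
            "ssh", "-N",
            "-L", "1521:localhost:1521",
            "user@db.example.com",        # hypothetical SSH account on the DB server
        ])
        # While the tunnel is up, the PHP side connects as if local, e.g.:
        #   oci_connect('scott', 'tiger', 'localhost:1521/ORCL');
        tunnel.wait()

    The oci_connect line is illustrative; the point is that the HOST part becomes localhost once the tunnel is running.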

    Read the article

  • How can I turn off flash fill automatically in Excel 2013?

    - by user3480643
    Flash fill breaks a lot of things in older Excel documents; it causes maddeningly slow transfers from cell to cell after updating. I am trying to find a way to turn off "flash fill" in Excel 2013 automatically before rolling the product out to the rest of the staff in my company. Is there (preferably) a registry key that I can apply, or a switch that I can include during the install, that will turn this option off? The setting I am looking to turn off is the Flash Fill option in Excel's options dialog. I haven't been able to find any documentation online about turning it off, other than this one page from MS: http://office.microsoft.com/en-ie/excel-help/turn-flash-fill-on-HA104043292.aspx

    Read the article

  • Can a mapped network drive be reconnected from the command line?

    - by Stephen Jennings
    On a daily basis I find myself in the Windows command prompt needing to access a network drive that is mapped but disconnected. I have yet to find a command that will reconnect the drive without unmapping and remapping it (which leads to a password-guessing game, since I don't own these computers). I would also like to script this, so that every night the drive is reconnected if it has somehow become disconnected. The fastest solution I currently have is to type "start." to open Explorer, press Alt-D to focus the address bar, type the drive letter I want and press Enter, wait for the drive contents to display, and finally close Explorer and go back to the command prompt. I know it's a minor inconvenience, but I'm often doing this through a slow VNC or PCAnywhere connection where doing anything through the GUI is awful, so I'm wondering if there's a better solution.
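
    The Explorer workaround can at least be scripted: simply touching the drive is the same nudge that typing its letter into the address bar gives. A sketch (the drive letter is a placeholder) that could run nightly from Task Scheduler:

        import os

        for drive in ("Z:\\",):               # hypothetical mapped drive letters
            try:
                os.listdir(drive)             # accessing the root asks Windows to
                print(drive, "available")     # reconnect, as the Explorer trick does
            except OSError as err:
                print(drive, "still unavailable:", err)

    This mirrors the manual fix rather than replacing it, but it avoids the GUI round-trip over VNC.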

    Read the article

  • Keeping folder of files in sync over 3 machines

    - by Wizzard
    Morning. I've got 3 machines that have user content on them, which I need to keep in sync. This is a 3-way sync: user content can be added on any one machine and needs to be moved to the other two. Currently I run rsync, but we just don't handle deletes. I've looked at something like Gluster, but that seems a little over the top. Is there any other software out there to do a 3-way sync, or a good network file system...? These are web servers, so we don't want a slow, IO-hungry process.

    Read the article

  • How do I calculate the proper inode and block sizes for a Linux filesystem?

    - by Donatello
    I have an old ReiserFS filesystem which I'm going to convert to ext3. The problem is determining the proper block and inode sizes for the partition. It is 44 GB and has to hold 3,000,000+ files of between 1 kB and 10 kB; how can I figure out the best ratio of inodes to block size? Below is what I tried; it seems OK but makes copying files incredibly slow.

        mkfs.ext3 -t ext3 -c -c -b 1024 -i 4096 -I 128 -v -j -O sparse_super,filetype,has_journal /dev/sdb1

    Thanks.
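
    A back-of-the-envelope check of the inode side, assuming -i means bytes-per-inode (one inode is created per -i bytes of partition): with -i 4096 the 44 GB partition already gets roughly 11.5 million inodes, nearly four times the 3 million needed, so inode count is unlikely to be the bottleneck. The doubled -c -c (a read-write bad-block scan) slows down mkfs itself, and the 1 kB block size is a plausible reason copying is slow, since it multiplies per-block overhead compared with the 4 kB default.

        partition_bytes = 44 * 1024**3            # 44 GB partition
        files_expected  = 3_000_000               # 1-10 kB files, so ~5 kB average

        print(partition_bytes // 4096)            # inodes at -i 4096: ~11.5 million
        print(partition_bytes // files_expected)  # bytes of disk per expected file: ~15.7 kB
        # One inode per ~15 kB would cover the 3M files exactly; -i 8192
        # (~5.8M inodes) keeps ~2x headroom while wasting less metadata space.

    These numbers are a sizing sketch, not a benchmark; the block-size trade-off still needs testing on the real workload.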

    Read the article

  • How do I stop my IIS App Pool making a request to wpad.mydomain.com?

    - by Programming Hero
    As part of some performance troubleshooting, I've monitored the slow startup of a "cold" App Pool (one without an active worker process) in IIS. When using a built-in account, the App Pool starts in sub-second time. When using a custom local account the App Pool takes 30+ seconds to start processing requests. The service appears to be making requests to wpad.mydomain.com, an address it does not have access to, which causes it to wait 30 seconds for a response before eventually timing out. As a workaround, I've added the hostname to the server's hosts file, to direct the traffic to the local machine, which returns much faster (1-2 seconds). What do I need to do to stop IIS making this request when this identity is used for the App Pool?

    Read the article

  • Transparent, Unicode X terminal not tied to a Desktop Environment?

    - by jamuraa
    I've been looking for this for a while now and I just haven't been able to find one. The last few that I used were:

    aterm - fast, with good transparency support, but as far as I can tell it doesn't support Unicode at all. The dependency graph is reasonable.

    gnome-terminal - good, with transparency support plus Unicode, but it pulls in about everything in GNOME, and I don't use anything else in GNOME. It was also somewhat slow (noticeable lag in updating at times) and wouldn't use the fonts I wanted.

    Eterm - same story as aterm: good dependencies and transparency, but no Unicode.

    Does anyone have suggestions, or will I be stuck with gnome-terminal's dependencies and slowness?

    Read the article

  • Syncing 1TB+ to an iSCSI device, software needed

    - by mojah
    Hi, I need to sync a local disk to an iSCSI mount on Windows (Server 2003), and I'm struggling to find software that can do it in a reasonable timeframe. Notes on the current 1TB disk:

    - 800GB currently in use
    - Contains a folder with several hundred thousand subfolders, which in turn have several thousand files, ...

    So I'm trying to find a piece of software that can handle such large file lists and give me a good estimate of when the copy will finish. I've tried DeltaCopy (the rsync GUI client for Windows, http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp), but it's intolerably slow and doesn't give a useful estimate of the time remaining. Does anyone know alternative software for Windows that would do this well?
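
    If nothing off-the-shelf fits, the incremental part is simple enough to sketch: copy only files whose size or modification time changed, and print running progress so a multi-hour job at least shows how far along it is. Paths below are placeholders, and a real run would want error handling:

        import os, shutil

        SRC, DST = r"D:\data", r"E:\iscsi\data"   # hypothetical source and target

        todo = []
        for root, _dirs, files in os.walk(SRC):
            for name in files:
                src = os.path.join(root, name)
                dst = os.path.join(DST, os.path.relpath(src, SRC))
                src_stat = os.stat(src)
                changed = True
                if os.path.exists(dst):
                    dst_stat = os.stat(dst)
                    changed = (dst_stat.st_size != src_stat.st_size
                               or dst_stat.st_mtime < src_stat.st_mtime)
                if changed:
                    todo.append((src, dst))

        for done, (src, dst) in enumerate(todo, 1):
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)                # copy2 preserves timestamps
            print(f"{done}/{len(todo)} changed files copied", end="\r")

    The first pass over the metadata also gives a count up front, which is exactly the "how long will this take" estimate DeltaCopy wasn't providing.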

    Read the article

  • How can I speed up my macro in Excel 2003?

    - by user144872
    I have a macro that copies data from one cell to another and uses a VLOOKUP formula, among other things. My spreadsheet contains nearly 2000 rows. When I run it in Excel 2003, Excel starts to slow down as the macro processes rows 500 and above. It gets even worse when it reaches the 1000th row. It takes more than 5 hours to complete. In Excel 2007, however, the macro runs for only half an hour. Can anyone help me find a good solution?

    Read the article

  • Searching Multiple Terms

    - by nevets1219
    I know that grep -E 'termA|termB' files lets me search multiple files for termA OR termB. What I would like instead is to search for termA AND termB. They do not have to be on the same line, as long as the two terms exist within the same file; essentially a "search within results" feature. I know I can pipe the results of one grep into another, but that seems slow when going over many files: grep -l "termA" * | xargs grep -l "termB" | xargs grep -E -H -n --color "termA|termB" Hopefully the above isn't the only way to do this. It would be extra nice if it worked on both Windows (I have cygwin) and Linux. I don't mind installing a tool to perform this task.
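
    A single-pass sketch of the AND search, for comparison with the triple-grep pipeline: each file is read once, and a file is printed only when it contains both terms. Being plain Python, it runs the same way under cygwin and on Linux.

        import os, sys

        # usage: python both.py termA termB startdir
        term_a, term_b, root = sys.argv[1], sys.argv[2], sys.argv[3]

        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, errors="replace") as fh:
                        text = fh.read()        # fine for ordinary text/source files
                except OSError:
                    continue                    # unreadable file: skip it
                if term_a in text and term_b in text:
                    print(path)

    This only lists matching files; piping the list into the final colorizing grep from the question would reproduce the highlighted output.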

    Read the article

  • Our company has 100,000s of photos; how can we store and browse/find them efficiently?

    - by tobefound
    We currently store our photos in a structure like this:

    folder\1\10000 - 19999.JPG|ORF|TIF (10 000 files)
    folder\2\20000 - 29999.JPG|ORF|TIF (10 000 files)
    etc...

    They are stored on 4 different 2TB D-Link NASes, attached and shared on our office network (\\nas1, \\nas2, and so on). The problems: 1) When a client (Windows only, Vista and 7) browses, say, the \\nas1\folder\1\ folder, performance is quite poor; the list takes a long time to generate in the Explorer window, even with icons turned off. 2) Initial access to the NAS itself is sometimes slow. SAN disks are too expensive for us, even with iSCSI interface/switch technology. I've read a lot of tech pages saying that storing 100 000+ files in a single folder shouldn't be a problem, but we don't dare go there now that we're seeing problems at the 10K level. All input greatly appreciated, /T
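
    Since the pain is in browsing the folders, one direction is to stop browsing them: walk each share once, record photo number and path in a small index, and let people look a photo up by number. A sketch using SQLite (share paths are placeholders):

        import os, sqlite3

        db = sqlite3.connect("photos.db")
        db.execute("CREATE TABLE IF NOT EXISTS photos (num TEXT, path TEXT)")

        for share in (r"\\nas1\folder", r"\\nas2\folder"):   # hypothetical share list
            for dirpath, _dirs, files in os.walk(share):
                db.executemany(
                    "INSERT INTO photos VALUES (?, ?)",
                    ((os.path.splitext(name)[0], os.path.join(dirpath, name))
                     for name in files))
        db.commit()

        # Lookup by number is then instant, regardless of folder sizes:
        for num, path in db.execute("SELECT * FROM photos WHERE num = ?", ("12345",)):
            print(num, path)

    The walk itself is slow once (it has the same directory-listing cost as Explorer), but afterwards finding a photo no longer touches the big folders at all.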

    Read the article

  • Installed Ubuntu on a USB drive, now it's read-only and can't be formatted

    - by weiszam
    I installed Ubuntu (ubuntu-13.04-desktop-amd64) on an 8 GB pendrive. Because it ran pretty slowly, I wanted to change it back to a normal pendrive, but this is what happened: Windows 7 can't open the USB drive, can't view files or anything, and can't access its attributes. When I select format, it says it can't format the drive because it is write-protected. I tried the same from Ubuntu, and from a booted GParted live disc; when I view the partitions, they can't be deleted either. What should I do to get it formatted?

    Read the article

  • Unexpected "waiting for localhost"?

    - by Tenaar
    So I ran into something today that kind of worried me. Lately my computer has been kind of slow, and I'm dealing with that, but today when I opened Facebook in Google Chrome I noticed a message in the bottom left corner while the site was loading that said "Waiting for localhost". It was brief; I only managed to notice it because my computer is slower than it used to be, which caused Chrome to hang just long enough for me to read it. As I'm quite confident that Facebook isn't running on my localhost, I'm wondering what could make Chrome wait for localhost while I'm loading webpages from external servers. Is there malware of some kind that I should be worrying about? Unfortunately I have no other information to go on, and I have no idea how to investigate further or whether it generated any logs. I'd appreciate any help in figuring this out!

    Read the article

  • Default profile for large

    - by user63434
    Hi, I am setting up a master image to clone to a fleet of identical Windows 7 clients. I log in as administrator and install all the programs, change the desktop settings, etc., but my local administrator profile is 244 MB in size, and it will become the default profile of the local machine when I sysprep. We have a 2003 server, and I want to use a mandatory profile for all users who log in to the domain, which means I need to copy this profile to the server. Loading a 244 MB profile is going to be very slow, since it is removed from the client at logoff, so the next login takes a long time again. Is there anything I can do? Can I copy just the bare minimum files from the default profile to the server? I'm not sure which parts I need; I've read that I must copy My Documents and My Documents/Pictures so that folder redirection will work. What else do I need to copy to the server? I also have Firefox Xmarks sync, MS Word, etc. Thanks.

    Read the article

  • Specific apache + mysql settings for a light-weight site

    - by Good Person
    I have a small website with Joomla and Moodle set up. Both seem very slow. The server (CentOS release 5.5 (Final)) is a virtual dedicated server with about 2 GB of RAM. I don't expect ever to have more than 10-15 people on at the same time (and even that is high). What settings could I change in Apache, MySQL, or even the OS to increase the performance of my site? I'm not concerned about running out of resources if I get too many visitors. If you need more specific data, leave a comment and I'll edit the question.

    Read the article

  • Generating strongly biased random numbers for tests

    - by nobody
    I want to run tests with randomized inputs and need to generate 'sensible' random numbers: numbers that match well enough to pass the tested function's preconditions, but hopefully wreak havoc deeper inside its code. math.random() (I'm using Lua) produces uniformly distributed random numbers. Scaling these up will give far more big numbers than small numbers, and there will be very few integers.

    I would like to skew the random numbers (or generate new ones using the old function as a randomness source) in a way that strongly favors 'simple' numbers but still covers the whole range, i.e. extends up to positive/negative infinity (or ±1e309 for double). This means: numbers up to, say, ten should be most common; integers should be more common than fractions; numbers ending in 0.5 should be the most common fractions, followed by 0.25 and 0.75, then 0.125, and so on.

    A different description: fix a base probability x such that the probabilities sum to one, and define the probability of a number n as x^k, where k is the generation in which n is constructed as a surreal number¹. That assigns x to 0, x² to -1 and +1, x³ to -2, -1/2, +1/2 and +2, and so on. This gives a nice description of something close to what I want (it skews a bit too much), but it is near-unusable for computing random numbers: the resulting distribution is nowhere continuous (it's fractal!), I'm not sure how to determine the base probability x (I think for infinite precision it would be zero), and computing numbers based on it by iteration is awfully slow (spending near-infinite time to construct large numbers).

    Does anyone know of a simple approximation that, given a uniformly distributed randomness source, produces random numbers very roughly distributed as described above? I want to run thousands of randomized tests; quantity/speed is more important than quality. Still, better numbers mean fewer inputs get rejected. Lua has a JIT, so performance can't be reasonably predicted: jumps based on randomness will break every prediction, and many calls to math.random() will be slow too. This means a closed formula would be better than an iterative or recursive one.

    ¹ Wikipedia has an article on surreal numbers, with a nice picture. A surreal number is a pair of two surreal numbers, i.e. x := {n|m}, and its value is the number in the middle of the pair, i.e. (for finite numbers) {n|m} = (n+m)/2 (as a rational). If one side of the pair is empty, that's interpreted as increment (or decrement, if the right side is empty) by one. If both sides are empty, that's zero. Initially there are no numbers, so the only number one can build is 0 := { | }. In generation two one can build the numbers {0| } =: 1 and { |0} =: -1; in three we get {1| } =: 2, {|1} =: -2, {0|1} =: 1/2 and {-1|0} =: -1/2 (plus some more complex representations of known numbers, e.g. {-1|1} = 0). Note that e.g. 1/3 is never generated by finite numbers because it is an infinite fraction; the same goes for floats, where 1/3 is never represented exactly.
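
    A sketch of one cheap approximation (in Python for illustration, though the question uses Lua): build sign, integer part, and a dyadic fraction separately, each from coin flips, so the probability of an outcome halves with each extra 'generation' of complexity, loosely mirroring the x^k idea. The loops run for two iterations on average, so construction cost stays O(1) in expectation rather than near-infinite:

        import random

        def simple_random():
            # Geometric exponent: P(k) = 2**-(k+1). Half the samples take k = 0,
            # but very large magnitudes keep a nonzero probability.
            k = 0
            while random.random() < 0.5 and k < 1023:
                k += 1
            integer_part = random.randrange(2 ** k)   # < 2**1023, still a finite double

            frac = 0.0
            if random.random() < 0.5:                 # half of all outputs are integers
                d = 1                                 # depth 1 -> .5, depth 2 -> .25/.75, ...
                while random.random() < 0.5 and d < 52:
                    d += 1
                frac = random.randrange(1, 2 ** d, 2) / 2 ** d   # odd numerator: exactly depth d

            sign = -1.0 if random.random() < 0.5 else 1.0
            return sign * (integer_part + frac)

    This is not the surreal-number distribution exactly (the question notes that one skews too much anyway), but it has the requested shape: 0 and small integers dominate, .5 beats .25 and .75, and huge magnitudes (up to 2^1023) remain reachable.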

    Read the article

  • How Can I Speed Up Terminal.app or iTerm on Mac OSX?

    - by pmaiorana
    Every time I launch either iTerm or Terminal after not using it for a few hours, it takes anywhere from 10 to 20 seconds to return a prompt. The screen is blank, and although I can type, I can't actually run any commands. If I quit either application, subsequent launches (if done relatively soon thereafter) are quite fast; the slowness only seems to occur if the app hasn't been running for a few hours. I'm running Mac OS X 10.5.7 on a MacBook Pro. I have the exact same setup on another computer, with no slowdowns. Any ideas how to speed things up again?

    Read the article

  • Queue emails under Linux

    - by md1337
    I have a slow, distant mail relay server, and the web application I'm using locks up when sending e-mail to that server, until the e-mail is sent; after that, the page comes back and the application is snappy again. So I'm trying to set up a deferred mail queue locally on the (Linux) application server, so that the application uses it instead of the distant mail server. My rationale is that e-mails would get queued up locally until the distant server processes them, but at least the application wouldn't lock up. I have installed postfix and pointed the relayhost setting at the distant mail server, but performance has not improved: postfix appears to just forward my SMTP instructions in real time rather than really queueing them. What can I do? Thanks!
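
    A minimal way to check whether local queueing is working, assuming postfix listens on localhost:25 (addresses below are placeholders): if this returns immediately even while the distant relay is slow, the local queue is doing its job, and the fix is to make the application submit to localhost rather than to the remote server directly.

        import smtplib

        with smtplib.SMTP("localhost", 25, timeout=10) as smtp:
            smtp.sendmail("app@localhost", ["ops@example.com"],      # hypothetical addresses
                          "Subject: queue test\r\n\r\nQueued locally?\r\n")
        print("accepted locally; delivery to the relay happens asynchronously")

    If the sendmail call itself blocks for as long as the remote relay takes, the application is still talking to the relay in real time (via its own SMTP client or some other path) rather than handing mail to the local postfix queue.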

    Read the article

  • Which OS should I boot into for virtualization?

    - by acidzombie24
    This might be a silly question. I use Windows 7 99% of the time; I run Linux 10% of the time and XP 5% of the time. I am thinking about getting an Intel® Core™ i7-2600 processor, which has hardware support for virtualization. I don't think I want more than one partition (I may have a swap partition). Which OS should I make my primary (and only) partition? I suspect Windows 7, since I am always using it and going through a Linux layer would slow it down. Does it matter much which OS I use if I have hardware support for virtualization? At the moment I am using VMware Player; I suspect the software doesn't affect performance?

    Read the article

  • Optimizing JS Array Search

    - by The.Anti.9
    I am working on a browser-based media player written almost entirely in HTML5 and JavaScript. The backend is written in PHP, but it has a single job: filling the playlist on the initial load. The rest is all JS. There is a search bar that refines the playlist, and I want it to refine as the person types, like most media players do. The only problem is that this is very slow and laggy: there are about 1000 songs in the whole program, and there are likely to be more as time goes on.

    The initial playlist load is an AJAX call to a PHP page that returns the results as JSON. Each item has 4 attributes: artist, album, file, url. I loop through each object and add it to an array called playlist. At the end of the loop a copy of playlist is created, backup, so that I can refine the playlist variable when people refine their search but still repopulate it from backup without making another server request.

    The method refine() is called when the user types a key into the search box. It flushes playlist and searches each property (not including url) of each object in the backup array for a match against the search string. If any property matches, it appends the row to the table that displays the playlist and pushes the object onto playlist for access by the actual player. Code for the refine() method:

        function refine() {
            $('#loadinggif').show();
            // Note: the header declares 4 columns, while each row below emits 5 cells.
            $('#library').html("<table id='libtable'><tr><th>Artist</th><th>Album</th><th>File</th><th>&nbsp;</th></tr></table>");
            playlist = [];
            for (var j = 0; j < backup.length; j++) {
                var sfile = String(backup[j].file);
                var salbum = String(backup[j].album);
                var sartist = String(backup[j].artist);
                // The query is re-read and re-lowercased on every iteration, and
                // String.search() treats it as a regular expression.
                if (sfile.toLowerCase().search($('#search').val().toLowerCase()) !== -1 ||
                        salbum.toLowerCase().search($('#search').val().toLowerCase()) !== -1 ||
                        sartist.toLowerCase().search($('#search').val().toLowerCase()) !== -1) {
                    playlist.push(backup[j]);
                    var num = playlist.length - 1;
                    // Every matching row triggers its own DOM append.
                    $("<tr></tr>").html("<td>" + num + "</td><td>" + sartist + "</td><td>" +
                        salbum + "</td><td>" + sfile + "</td><td><a href='#' onclick='setplay(" +
                        num + ");'>Play</a></td>").appendTo('#libtable');
                }
            }
            $('#loadinggif').hide();
        }

    As I said before, this is very slow and laggy for the first couple of letters typed. I am looking for ways to refine this to make it much faster and smoother.

    Read the article

  • Could 11.5 million 401s be causing bottlenecks?

    - by roviuser
    I'm going to preface this with a warning: my knowledge about servers and networking is VERY limited, and if you provide technical answers I probably won't understand much until I research them further. I'm trying to expand my knowledge and learn, though. If the information I'm able to provide here is insufficient to answer the question, I understand, and it can be closed. We have a SharePoint 2007 system that is extremely slow, mostly from heavy use. We've been told that the main speed bottleneck is access to the SQL databases. However, they do provide a statistics dashboard, so I did some poking around and noticed that we have 11.5 million or more 401 (access denied) errors every month. Could this be causing major speed/performance decreases? Authentication for SharePoint uses Active Directory.

    Read the article

  • Do control groups improve system performance?

    - by qdii
    According to this website, enabling cgroups in the kernel can boost performance by sharing resources in a better way. In particular, the conclusion states: "Nevertheless, with a little trial and error, cgroups can help you improve the efficiency of your systems' resource usage and avoid downtime due to overusage of a single service." Kernel seeds, however, recommends deactivating them altogether: "Consider these [kernel] settings poison. They remain nothing but system slow-downs. They are all off by default [in the proposed kernel config file]." Who should I trust?

    Read the article

  • Why are browsers so heavy?

    - by Kaivosukeltaja
    Back in 1998 I had a computer with a 233 MHz Pentium MMX CPU and a graphics card with no 3D acceleration. It was able to run games like Quake II at a decent FPS rate. My current computer has tons more performance and a mid-class GPU, yet it struggles to reach 20 FPS when rendering a single model inside a skybox with WebGL. Even regular pages with lots of 2D CSS animations bring many modern computers to their metaphorical knees. As a web developer I understand there's a lot going on in a web page, but not what makes it that heavy. Modern browsers compile JavaScript to native machine code before running it, and rendering into a canvas element shouldn't trigger DOM rebuilds, so theoretically it should be a lot faster than it is. What am I missing here, and is it possible to avoid or minimize whatever is making browsers slow, so as to build more efficient websites?

    Read the article

  • Wifi Signal Strengths

    - by Phorce
    I hope I have asked this question in the right place. Basically, I'm with BT (in the UK) and I have problems getting a strong signal from my router in certain rooms of the house, which makes the internet really slow; in some cases it won't work at all. I've dealt with Virgin Media (UK) before, where just changing the channel number worked fine, but I don't know whether I'd get the same outcome on BT. Could anyone suggest other things to try alongside the channel change in order to (hopefully) get a stronger signal, without having to install Ethernet cable?

    Read the article
