Search Results

Search found 3747 results on 150 pages for '500'.

Page 57/150

  • How to remove a class conditionally?

    - by Wazdesign
    $("li").hoverIntent({ sensitivity: 3, interval: 200, over: addOver, timeout: 500, out: removeOver }); function addOver(){ $(this).addClass('over').children('a:first').addClass('active');} function removeOver(){ $(this).removeClass('over').children('a:first').removeClass('active');} alert ('version 2 menu'); }); On mouseout I want to remove class of " li a" if it already dont have class. I want to remove the class of the children.

    Read the article

  • how to write a script that logs into an application and checks a page

    - by josh
    Is it possible to write a script that will log into an application using a username/password? The username/password are not passed in through POST (they don't come in the URL). The basic steps I am looking for are: visit a URL, enter the username/password, click a button, click a link, then get the raw HTML to make sure it does not contain a 500 error. Is this possible in any language? Please point me to some examples as well.
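    A minimal sketch of the kind of script described, in Python with the requests library; the login URL, form field names, and the page being checked are placeholder assumptions, not details from the question:

        import requests

        session = requests.Session()

        # Log in by posting the credentials (field names depend on the real form).
        login = session.post("https://example.com/login",
                             data={"username": "uname", "password": "pwd"})
        login.raise_for_status()

        # "Click" the button/link by requesting the page it leads to.
        page = session.get("https://example.com/reports/summary")

        # Grab the raw HTML and make sure the server did not answer with a 500.
        html = page.text
        assert page.status_code != 500, "page returned a server error"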

    Read the article

  • string combinations

    - by vbNewbie
    I would like to generate combinations of words. For example, if I had the following list: {cat, dog, horse, ape, hen, mouse}, then the result would be n(n-1)/2 groups: cat dog horse ape hen mouse (cat dog) (dog horse) (horse ape) (ape hen) (hen mouse) (cat dog horse) (dog horse ape) (horse ape hen) etc. Hope this makes sense... everything I found involves permutations. The list I have is about 500 words long.
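    If the goal is every run of adjacent words (pairs, triples, and so on), a short Python sketch of the enumeration might look like this; for n items it produces n(n-1)/2 groups, matching the formula above:

        # Enumerate every contiguous group of 2 or more adjacent words.
        words = ["cat", "dog", "horse", "ape", "hen", "mouse"]
        n = len(words)

        groups = [words[i:i + size]
                  for size in range(2, n + 1)      # group length: 2 .. n
                  for i in range(n - size + 1)]    # sliding start position

        for group in groups:
            print("(" + " ".join(group) + ")")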

    Read the article

  • Problem with iPhone interoperability and Flash

    - by asksuperuser
    Many Fortune 500 companies have incorporated Flash in their websites. What is the solution for porting them to all mobiles, given that Apple refuses to support Flash? So what options does one have? Redevelop everything at awful cost, using different APIs for different mobiles? Why does the iPhone refuse to incorporate Flash? What's the purpose of such a restriction?

    Read the article

  • php download file slows

    - by hobbywebsite
    OK, first off, thanks for your time; I wish I could give more than one point for this question. Problem: I have some music files on my site (.mp3) and I am using a PHP file to increment a database counter for the number of downloads and to point to the file to download. For some reason this method starts at 350kb/s then slowly drops to 5kb/s, at which point the file says it will take 11hrs to complete. BUT if I go directly to the .mp3 file, my browser brings up a player and I can right click and "save as", which works fine: the download completes in 3mins. (Yes, both during the same time, for those thinking it's my connection or ISP; and it's not my server either.) The only things I've been playing around with recently are the php.ini and .htaccess files. So without further ado, the PHP file, php.ini, and .htaccess:

    download.php

        <?php
        include("config.php");
        include("opendb.php");
        $filename = 'song_name';
        $filedl = $filename . '.mp3';
        $query = "UPDATE songs SET song_download=song_download+1 WHERE song_linkname='$filename'";
        mysql_query($query);
        header('Content-Disposition: attachment; filename=' . basename($filedl));
        header('Content-type: audio/mp3');
        header('Content-Length: ' . filesize($filedl));
        readfile('/music/' . $filename . '/' . $filedl);
        include("closedb.php");
        ?>

    php.ini

        register_globals = off
        allow_url_fopen = off
        expose_php = Off
        max_input_time = 60
        variables_order = "EGPCS"
        extension_dir = ./
        upload_tmp_dir = /tmp
        precision = 12
        SMTP = relay-hosting.secureserver.net
        url_rewriter.tags = "a=href,area=href,frame=src,input=src,form=,fieldset="
        ; Defines the default timezone used by the date functions
        date.timezone = "America/Los_Angeles"

    .htaccess

        Options +FollowSymLinks
        RewriteEngine on
        RewriteCond %{HTTP_HOST} !^(www.MindCollar.com)?$ [NC]
        RewriteRule (.*) http://www.MindCollar.com/$1 [R=301,L]
        <IfModule mod_rewrite.c>
        RewriteEngine On
        ErrorDocument 404 /errors/404.php
        ErrorDocument 403 /errors/403.php
        ErrorDocument 500 /errors/500.php
        </IfModule>
        Options -Indexes
        Options +FollowSymlinks
        <Files .htaccess>
        deny from all
        </Files>

    Thanks for your time.

    Read the article

  • uploading problem

    - by Syom
    I can't upload large files, even though in php.ini I've set:

        max_execution_time = 3600
        max_input_time = 600
        memory_limit = 100M
        post_max_size = 100M
        file_uploads = On
        upload_max_filesize = 100M

    It returns a 500 error - [an error occurred while processing this directive]. Could you say why? Thanks.

    Read the article

  • MooTools: Fade out element?

    - by JamesBrownIsDead
    I have an Element object that I'm currently calling .hide() on. Instead, I'd like to fade the entire Element (and its children) out completely (fully transparent, i.e. hidden) as a transition effect over maybe 500 ms or 1000 ms. Can Fx.Tween be used for this? Is this possible, and does the MooTools framework have an effect like this in its UI library?

    Read the article

  • Edit the axes of an image in MATLAB?

    - by ZaZu
    Hello, I would like to edit the axes in my series of displayed images. As you can see in my image, the y-axis runs from 0 to ~500 from top to bottom; can I invert that? I also want to mirror the image being shown so that it runs from left to right... or, if possible, let the x-axis run from right to left? Thanks!

    Read the article

  • Query.fetch(limit=2000) only moves cursor forward by 1000 entities?

    - by Liron
    Let's say I have 2500 MyModel entities in my datastore, and I run this code:

        query = MyModel.all()
        first_batch = query.fetch(2000)
        len(first_batch)  # 2000
        next_query = MyModel.all().with_cursor(query.cursor())
        next_batch = next_query.fetch(2000)

    What do you think len(next_batch) is? 500, right? Nope, it's 1500. Apparently the query cursor never moves forward by more than 1000, even when the query itself returns more than 1000 entities. Should I do something different, or is it just an App Engine bug?
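    One workaround, sketched with the same old db API the question uses, is to fetch in batches of at most 1000 and re-prime the query with the cursor after each batch (the batch size and loop structure are illustrative):

        results = []
        query = MyModel.all()
        while True:
            batch = query.fetch(1000)   # stay at or below the 1000-entity cursor limit
            if not batch:
                break
            results.extend(batch)
            # Continue from where the previous fetch left off.
            query = MyModel.all().with_cursor(query.cursor())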

    Read the article

  • how to remove .php from a certain file (with apache .htaccess)

    - by user2015253
    I want to be able to call a certain file (only this one!) without its .php extension. BUT: it uses GET parameters, and they should remain! I tried it in my .htaccess with RewriteEngine on followed by either RewriteRule ^file(.*)$ file.php$1 or RewriteRule ^file(.+)$ file.php$1, but it doesn't work (the first gives a 500 error, the second a 404). Example of what I want to call in the browser: file?param=asd&foo=bar - it should be served as: file.php?param=asd&foo=bar

    Read the article

  • javascript problem in IE8

    - by Pankaj
    This code is not working in IE8:

        window.open(url, "find_users", "resizable=yes,scrollbars=yes,menubar=no,toolbar=no,location=no,status=yes,height=300,width=500");

    I am getting an "Object Expected" error only in IE8; it works fine in all other browsers.

    Read the article

  • PHP running as a FastCGI application (php-cgi) - how to issue concurrent requests?

    - by Sbm007
    Some background information: I'm writing my own webserver in Java, and a couple of days ago I asked on SO how exactly Apache interfaces with PHP so I can implement PHP support. I learnt that FastCGI is the best approach (since mod_php is not an option). So I have looked at the FastCGI protocol specification and have managed to write a working FastCGI wrapper for my server. I have tested phpinfo() and it works; in fact all PHP functions seem to work just fine (posting data, sessions, date/time, etc.).

    My webserver is able to serve requests concurrently (i.e. user1 can retrieve file1.html at the same time as user2 requests some_large_binary_file.zip). It does this by spawning a new Java thread for each user request (terminating when completed or when the client connection is cancelled). However, it cannot deal with 2 (or more) FastCGI requests at the same time. What it does is queue them up, so when request 1 is completed it immediately starts processing request 2. I tested this with 2 PHP pages, one containing sleep(10) and the other phpinfo().

    How would I go about dealing with multiple requests? I know it can be done (PHP under IIS runs as FastCGI and it can deal with multiple requests just fine). Some more info: I am coding under Windows and my batch file used to execute php-cgi.exe contains:

        set PHP_FCGI_CHILDREN=8
        set PHP_FCGI_MAX_REQUESTS=500
        php-cgi.exe -b 9000

    But it does not spawn 8 children; the service simply terminates after 500 requests. I have done research, and from Wikipedia: "Processing of multiple requests simultaneously is achieved either by using a single connection with internal multiplexing (i.e. multiple requests over a single connection) and/or by using multiple connections."

    Now clearly the multiple-connections approach isn't working for me: every time a client requests something that involves FastCGI, my server creates a new socket to the FastCGI application, but the requests are not handled concurrently (they are queued up instead). I know that internal multiplexing of FastCGI requests over the same connection is accomplished by issuing each FastCGI request with a different request ID (also see the last 3 paragraphs of 'The Communication Protocol' heading in this article).

    I have not tested this, but how would I go about implementing it? I take it I need some kind of FastCGI Java thread which contains a Map of some sort and a static function I can use to add requests to it. Then in the thread's run() function it would have a while loop, and for every cycle it would check whether the Map contains new requests; if so, it would assign them a request ID, write them to the FastCGI stream, and then wait for input, etc. As you can see, this becomes quite complicated. Does anyone know the correct way of doing this? Or any thoughts at all? Thanks very much. Note: if required, I can supply the code for my FastCGI wrapper.

    Read the article

  • Are all CSS font-weight property values useful?

    - by jitendra
    font-weight can take these values:

        font-weight: normal
        font-weight: bold
        font-weight: bolder
        font-weight: lighter
        font-weight: 100
        font-weight: 200
        font-weight: 300
        font-weight: 400
        font-weight: 500
        font-weight: 600
        font-weight: 700
        font-weight: 800
        font-weight: 900

    So far I've only used font-weight:bold, and sometimes font-weight:normal to override bold. What are the uses of the others?

    Read the article

  • MATLAB : frequency distribution

    - by Arkapravo
    I have 500 raw numeric observations (ranging from 1 to 25,000) in a text file, and I wish to make a frequency distribution in MATLAB. I did try the histogram (hist); however, I would prefer a frequency distribution curve rather than blocks and bars. Any help is appreciated!

    Read the article

  • What is a GC hole?

    - by tianyi
    I wrote a long-lived TCP connection socket server in C#. A spike in memory happens in my server. I used .NET Memory Profiler (a tool) to detect where the memory leaks. The profiler indicates the private heap is huge, and the memory looks something like this (the numbers are not real; what I want to show is that the GC0 and GC2 "Holes" are very, very large, while the data size is normal):

        Managed heaps - 1,500,000KB
          Normal heap - 1,400,000KB
            Generation #0 - 600,000KB
              Data - 100,000KB
              "Holes" - 500,000KB
            Generation #1 - xxKB
              Data - 0KB
              "Holes" - xKB
            Generation #2 - xxxxxxxxxxxxxKB
              Data - 100,000KB
              "Holes" - 700,000KB
          Large heap - 131,072KB
            Large heap - 83KB
            Overhead/unused - 130,989KB
            Overhead - 0KB

    However, what is a GC hole? I read an article about it: http://kaushalp.blogspot.com/2007/04/what-is-gc-hole-and-how-to-create-gc.html The author said: "The code snippet below is the simplest way to introduce a GC hole into the system."

        //OBJECTREF is a typedef for Object*.
        {
            PointerTable *pTBL = o_pObjectClass->GetPointerTable();
            OBJECTREF aObj = AllocateObjectMemory(pTBL);
            OBJECTREF bObj = AllocateObjectMemory(pTBL);
            //WRONG!!! "aObj" may point to garbage if the second
            //"AllocateObjectMemory" triggered a GC.
            DoSomething(aObj, bObj);
        }

    "All it does is allocate two managed objects, and then does something with them both. This code compiles fine, and if you run simple pre-checkin tests, it will probably 'work.' But this code will crash eventually. Why? If the second call to AllocateObjectMemory triggers a GC, that GC discards the object instance you just assigned to aObj. This code, like all C++ code inside the CLR, is compiled by a non-managed compiler, and the GC cannot know that aObj holds a root reference to an object you want kept live."

    I can't understand what he explained. Does the sample mean that aObj becomes a dangling (wild) pointer after the GC, roughly as if one allocated a pointer with malloc, freed it, and then passed the freed pointer to a function? I hope somebody can explain it.

    Read the article

  • Combining FileStream and MemoryStream to avoid disk accesses/paging while receiving gigabytes of data?

    - by w128
    I'm receiving a file as a stream of byte[] data packets (the total size isn't known in advance) that I need to store somewhere before processing it immediately after it's been received (I can't do the processing on the fly). The total received file size can vary from as small as 10 KB to over 4 GB.

    One option for storing the received data is to use a MemoryStream, i.e. a sequence of MemoryStream.Write(bufferReceived, 0, count) calls to store the received packets. This is very simple, but it will obviously result in an out-of-memory exception for large files.

    An alternative is to use a FileStream, i.e. FileStream.Write(bufferReceived, 0, count). This way, no out-of-memory exceptions will occur, but what I'm unsure about is the performance cost of disk writes (which I don't want to occur as long as plenty of memory is still available). I'd like to avoid disk access as much as possible, but I don't know of a way to control this.

    I did some testing, and most of the time there seems to be little performance difference between, say, 10,000 consecutive calls of MemoryStream.Write() vs FileStream.Write(), but a lot seems to depend on buffer size and the total amount of data in question (i.e. the number of writes). Obviously, MemoryStream size reallocation is also a factor.

    Does it make sense to use a combination of MemoryStream and FileStream, i.e. write to the memory stream by default, but once the total amount of data received is over e.g. 500 MB, write it to a FileStream; then read in chunks from both streams for processing the received data (first process the 500 MB from the MemoryStream, dispose of it, then read from the FileStream)?

    Another solution is to use a custom memory stream implementation that doesn't require continuous address space for its internal array allocation (i.e. a linked list of memory streams); this way, at least in 64-bit environments, out-of-memory exceptions should no longer be an issue. Con: extra work, more room for mistakes.

    So how do FileStream vs MemoryStream reads/writes behave in terms of disk access and memory caching, i.e. the data size/performance balance? I would expect that as long as enough RAM is available, FileStream would internally read/write from memory (cache) anyway, and virtual memory would take care of the rest. But I don't know how often FileStream will explicitly access the disk when being written to. Any help would be appreciated.
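    For what it's worth, the memory-first, spill-to-disk-after-a-threshold idea described above is exactly what Python's tempfile.SpooledTemporaryFile implements; a rough sketch of the pattern, shown only to illustrate the design (receive_packets and process are hypothetical stand-ins, and this is not a .NET answer):

        import tempfile

        # Buffer in memory until ~500 MB has been written, then roll over to disk.
        with tempfile.SpooledTemporaryFile(max_size=500 * 1024 * 1024) as buf:
            for packet in receive_packets():   # hypothetical source of byte chunks
                buf.write(packet)
            buf.seek(0)                        # rewind before reading it back
            process(buf)                       # hypothetical processing step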

    Read the article

  • Cache headers in Rails

    - by Dimitar Vouldjeff
    Hi, I'm trying to add cache headers to my static files (.css, .js), but the only way I have found is some .htaccess configuration that makes the page throw a 500 error. So my question is whether there is an easier way to add those headers? Thanks in advance.

    Read the article

  • How do I detect server status in a port scanner java implementation

    - by akz
    I am writing a port scanner in Java and I want to be able to distinguish the following 4 cases: port is open; port is open and the server banner was read; port is closed; server is not live. I have the following code:

        InetAddress address = InetAddress.getByName("google.com");
        int[] ports = new int[]{21, 22, 23, 80, 443};
        for (int i = 0; i < ports.length; i++) {
            int port = ports[i];
            Socket socket = null;
            try {
                socket = new Socket(address, port);
                socket.setSoTimeout(500);
                System.out.println("port " + port + " open");
                BufferedReader reader = new BufferedReader(
                        new InputStreamReader(socket.getInputStream()));
                String line = reader.readLine();
                if (line != null) {
                    System.out.println(line);
                }
                socket.close();
            } catch (SocketTimeoutException ex) {
                // port was open but nothing was read from the input stream
                ex.printStackTrace();
            } catch (ConnectException ex) {
                // port is closed
                ex.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                if (socket != null && !socket.isClosed()) {
                    try {
                        socket.close();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }

    The problem is that I get a ConnectException both when the port is closed and when the server cannot be reached, just with different exception messages: "java.net.ConnectException: Connection timed out: connect" when the connection was never established, and "java.net.ConnectException: Connection refused: connect" when the port was closed. So I cannot distinguish the two cases without digging into the actual exception message. The same thing happens when I try a different approach to creating the socket. If I use:

        socket = new Socket();
        socket.setSoTimeout(500);
        socket.connect(new InetSocketAddress(address, port), 1000);

    I have the same problem, but with SocketTimeoutException instead: I get "java.net.SocketTimeoutException: Read timed out" if the port was open but there was no banner to read, and "java.net.SocketTimeoutException: connect timed out" if the server is not live or the port is closed. Any ideas? Thanks in advance!

    Read the article

  • Loading Java applet from WEB-INF/classes by JSP

    - by Bruce
    Hi guys, I've got a problem loading an applet from the WEB-INF/classes directory. The main class of the applet (MainApplet.class) is there in the package aaa, but when loading I get the exception java.lang.ClassNotFoundException. Where am I wrong? My JSP is in the Web Pages dir.

        <jsp:plugin type="applet" code="aaa/MainApplet.class" jreversion="1.6" width="700" height="500"

    Thanks in advance for the reply!

    Read the article

  • How to test custom handler500?

    - by Gr1N
    I wrote my own handler for server errors and defined it in the root urls.py: handler500 = 'myhandler'. I want to write a unit test to check that it works. For testing, I wrote a view that raises an error and defined it in a test URL configuration. When I request this view in the browser, I see my handler and receive status code 500, but when I run a test that makes a request to this view, I see the stack trace and my test fails. Do you have any ideas for testing handler500 with unit tests?
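    One way to sidestep the test client re-raising the exception is to call the handler view directly; a hedged sketch, assuming the handler is importable as myapp.views.myhandler (a placeholder path):

        from django.test import RequestFactory, SimpleTestCase

        from myapp.views import myhandler   # hypothetical import path

        class Handler500Tests(SimpleTestCase):
            def test_handler_returns_500(self):
                # Build a bare request and invoke the error handler directly.
                request = RequestFactory().get("/")
                response = myhandler(request)
                self.assertEqual(response.status_code, 500)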

    Read the article

  • Can MySQL reasonably perform queries on billions of rows?

    - by haxney
    I am planning on storing scans from a mass spectrometer in a MySQL database and would like to know whether storing and analyzing this amount of data is remotely feasible. I know performance varies wildly depending on the environment, but I'm looking for the rough order of magnitude: will queries take 5 days or 5 milliseconds? Input format Each input file contains a single run of the spectrometer; each run is comprised of a set of scans, and each scan has an ordered array of datapoints. There is a bit of metadata, but the majority of the file is comprised of arrays 32- or 64-bit ints or floats. Host system |----------------+-------------------------------| | OS | Windows 2008 64-bit | | MySQL version | 5.5.24 (x86_64) | | CPU | 2x Xeon E5420 (8 cores total) | | RAM | 8GB | | SSD filesystem | 500 GiB | | HDD RAID | 12 TiB | |----------------+-------------------------------| There are some other services running on the server using negligible processor time. File statistics |------------------+--------------| | number of files | ~16,000 | | total size | 1.3 TiB | | min size | 0 bytes | | max size | 12 GiB | | mean | 800 MiB | | median | 500 MiB | | total datapoints | ~200 billion | |------------------+--------------| The total number of datapoints is a very rough estimate. Proposed schema I'm planning on doing things "right" (i.e. normalizing the data like crazy) and so would have a runs table, a spectra table with a foreign key to runs, and a datapoints table with a foreign key to spectra. The 200 Billion datapoint question I am going to be analyzing across multiple spectra and possibly even multiple runs, resulting in queries which could touch millions of rows. Assuming I index everything properly (which is a topic for another question) and am not trying to shuffle hundreds of MiB across the network, is it remotely plausible for MySQL to handle this? UPDATE: additional info The scan data will be coming from files in the XML-based mzML format. The meat of this format is in the <binaryDataArrayList> elements where the data is stored. Each scan produces = 2 <binaryDataArray> elements which, taken together, form a 2-dimensional (or more) array of the form [[123.456, 234.567, ...], ...]. These data are write-once, so update performance and transaction safety are not concerns. My naïve plan for a database schema is: runs table | column name | type | |-------------+-------------| | id | PRIMARY KEY | | start_time | TIMESTAMP | | name | VARCHAR | |-------------+-------------| spectra table | column name | type | |----------------+-------------| | id | PRIMARY KEY | | name | VARCHAR | | index | INT | | spectrum_type | INT | | representation | INT | | run_id | FOREIGN KEY | |----------------+-------------| datapoints table | column name | type | |-------------+-------------| | id | PRIMARY KEY | | spectrum_id | FOREIGN KEY | | mz | DOUBLE | | num_counts | DOUBLE | | index | INT | |-------------+-------------| Is this reasonable?

    Read the article

  • HTML5 validator.w3.org

    - by danixd
    My site was valid until today; I'm wondering whether it is my site's fault or the validator's. I am getting this message: "The error encountered was: 500 Can't connect to localhost:8888 (connect: Connection refused)"

    Read the article

  • standard geographic tiling/binning method?

    - by monkut
    I'm trying to learn and understand more about mapping and displaying values on a map (GIS). At the moment I'm looking to take some values and assign them to a tile or bin on a map. Ideally I'd like the tile sizes to be uniform, like 100 meters, 500 meters, etc. Is there a standard method for creating uniform tile sizes, or what are the commonly accepted methods for this kind of data display? (Currently I'm using GeoDjango and its related toolset: GEOS, PROJ.4, etc.)
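    As a starting point, one common approach is simply to snap projected coordinates to a regular grid by integer division of the cell size; a minimal Python sketch (assumes x/y are already in a metre-based projected CRS such as UTM, which is an assumption, not something stated in the question):

        from collections import Counter

        def bin_points(points, cell_size=500):
            """Count points per uniform cell_size x cell_size metre cell."""
            counts = Counter()
            for x, y in points:
                cell = (int(x // cell_size), int(y // cell_size))
                counts[cell] += 1
            return counts

        # Example: three points, 500 m cells.
        print(bin_points([(10.0, 20.0), (480.0, 20.0), (900.0, 1200.0)]))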

    Read the article
