Search Results

Search found 59218 results on 2369 pages for 'time sheep'.

Page 260 of 2369

  • Why can I not send more than one request?

    - by Doug
        function stateChanged(idname) {
            xmlhttp.onreadystatechange = function() {
                if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
                    document.getElementById(idname).value = xmlhttp.responseText;
                }
            }
        }

        function openSend(php, idname) {
            stateChanged(idname);
            xmlhttp.open("GET", php, true);
            xmlhttp.send();
        }

        function showHint() {
            if (window.XMLHttpRequest) {
                xmlhttp = new XMLHttpRequest();
            } else {
                xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
            }
            openSend("time.php", "Time");
            openSend("date1.php", "Date1");
            openSend("date2.php", "Date2");
            return;
        }

    These two say aborted (in Firebug) and don't return a value. Why is that? Is it because I can't send more than one request?

        openSend("time.php", "Time");
        openSend("date1.php", "Date1");

    If I can't, how could I achieve three requests with only one invocation?

    Read the article

  • How can I filter a date of a DateTimeField in Django?

    - by Xidobix
    I am trying to filter a DateTimeField by comparing it with a date. I mean:

        MyObject.objects.filter(datetime_attr=datetime.date(2009,8,22))

    I get an empty queryset as an answer because (I think) I am not considering the time, but I want "any time". Is there an easy way to do this in Django? (The time in the datetime is set; it is not 00:00.)
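    A minimal sketch of two common ways to match "any time on that day", assuming the model and field names from the question (the __date lookup needs Django 1.9 or newer):

        import datetime

        # Newer Django: the __date lookup compares only the calendar date.
        MyObject.objects.filter(datetime_attr__date=datetime.date(2009, 8, 22))

        # Older Django: filter on the half-open range [midnight, next midnight).
        day = datetime.date(2009, 8, 22)
        start = datetime.datetime.combine(day, datetime.time.min)
        MyObject.objects.filter(datetime_attr__gte=start,
                                datetime_attr__lt=start + datetime.timedelta(days=1))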

    Read the article

  • SQL Group by Minute- Expanded

    - by Barnie
    I am working on something similar to this post: TS SQL - group by minute. However, mine pulls from a message queue, and I need an accurate count of how much traffic the message queue is creating/sending, and at what time:

        SELECT * FROM MessageQueue mq

    My expanded version of this is the following:

    A) The user defines a start time and an end time (easy enough using DECLAREs @StartTime and @EndTime).
    B) Give the user the option of choosing the "grouping": will it be broken out by 1 minute, 5 minutes, 15 minutes, or 30 minutes (max)? (I had thought to do this with a CASE statement, but my tests fall apart on me.)
    C) Display the data to accurately show a count of what happened during the selected interval (grouping).

    This is where I am at so far:

        DECLARE @StartTime datetime
        DECLARE @EndTime datetime

        SELECT DATEPART(n, mq.cre_date)/5 as Time  --Trying to just sort by 5 minute intervals
              ,CONVERT(VARCHAR(10), mq.Cre_Date, 101)
              ,COUNT(*) as results
        FROM dbo.MessageQueue mq
        WHERE mq.cre_date BETWEEN @StartDate AND @EndDate
        GROUP BY DATEPART(n, mq.cre_date)/5  --Trying to just sort by 5 minute intervals
               , eq.Cre_Date

    This is the output I would like to achieve:

        [Time]  [Date]       [Message Count]
        1300    06/26/2012   5
        1305    06/26/2012   1
        1310    06/26/2012   100

    Read the article

  • How to make the Eclipse IDE build faster

    - by Solitaire
    Hi all.. I am using the Eclipse IDE for development. The IDE takes too much time to build and hangs when the build progress reaches 78%. It shows "refreshing workspace" several times, which eats up a lot of time. Please tell me how to disable the unwanted "refreshing workspace" and other time-consuming activities, and make the build faster. Thanks

    Read the article

  • Grep for 2 words after pattern found

    - by Dileep Ch
    The scenario: I have a file that contains the string "the date and time is 2012-12-07 17:11:50". I searched and found the command

        grep 'the date and time is' 2012-12-07.txt | cut -d\ -f5

    but it just displays the 5th word, and I need the combination of the 5th and 6th, so I tried

        grep 'the date and time is' 2012-12-07.txt | cut -d\ -f5 -f6

    but that gives an error. Now, how do I get the 5th and 6th words with one command? I just need the output like:

        2012-12-07 17:11:50

    Read the article

  • Io exception: There is no process to read data written to a pipe.

    - by Srikanth
    I'm using Hibernate 3.2 + WebSphere 6.0 + Struts 1.3. After deploying, the application works fine. After some idle time, I get this type of error repeatedly and am not able to log in at all. I'm not using any connection pooling. I feel that after the idle time it is not able to connect to the database again. If I restart the server, everything works fine for some time... after that, same story. Please help me out.

    Read the article

  • ASP.NET Forms Authorization: how to reduce duration?

    - by eddo
    I've got a web page which implements cookie-based ASP.NET Forms Authentication. Once the user has logged in, he can edit some information using a form which is created from a partial view and returned to him as a dialog for editing. The action linked to the partial view is decorated as follows:

        [HttpGet]
        [OutputCache(Duration = 0, VaryByParam = "None")]
        [Authorize(Roles = "test")]
        public ActionResult changeTripInfo(int tripID, bool ovride = false) { ... }

    The problem I am experiencing is the latency between the request and the time when the dialog is shown to the user: it ranges between 800 and 1100 ms, which is not justified by the complexity of the form. Investigating with Glimpse shows that the time to process the AuthorizeAttribute (see snip) sums up to at least 650 ms, which is troubling me. Looking at the SQL Server log, the call which checks the user roles takes, as expected, virtually nothing (duration 0). How can I reduce this time? Am I missing some optimization?

    Read the article

  • How to deal with large result sets with Linq to Entities?

    - by user169867
    I have a fairly complex LINQ to Entities query that I display on a website. It uses paging, so I never pull down more than 50 records at a time for display. But I also want to give the user the option to export the full results to Excel or some other file format. My concern is that a large number of records could potentially be loaded into memory at one time to do this. Is there a way to process a LINQ result set one record at a time, like you could with a DataReader, so only one record is really kept in memory at a time? I'd appreciate any help. Thanks

    Read the article

  • How to change file contents in PHP?

    - by Misha Moroshko
    I save some info about users in a file (like the number of times the user passed the login page, the last visited time, and so on). I want to read this info from the file and update it (add 1 to the counter and change the last visited time). My question is: can I do it without opening the file twice? I open it the first time to read the contents, and then open it again to overwrite the contents with the updated ones. Thanks!
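    The question is about PHP, but the single-open read/modify/write pattern is the same in any language; a minimal Python sketch under that assumption (the file name and pipe-delimited format are made up for illustration):

        from datetime import datetime

        # Open once in "r+" mode: read the current contents, rewind,
        # truncate, then write the updated contents back.
        with open("user_stats.txt", "r+") as f:
            count, _last_visit = f.read().strip().split("|")
            f.seek(0)
            f.truncate()
            f.write("%d|%s" % (int(count) + 1,
                               datetime.now().strftime("%Y-%m-%d %H:%M:%S")))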

    Read the article

  • How to get the locale using PHP?

    - by Ramakrishnan
    How do I get the locale using PHP? I want to get the time zone based on the location; is there any way to get it in PHP? I need to convert the time to GMT and save it to the database, and then pull it back to the UI in the same time zone.
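    Whatever the language, the round trip is usually "convert to GMT/UTC for storage, convert back to the user's zone for display"; a Python sketch of that conversion, with an arbitrarily chosen zone standing in for whatever the locale lookup returns:

        from datetime import datetime
        from zoneinfo import ZoneInfo  # Python 3.9+

        user_zone = ZoneInfo("Asia/Kolkata")               # would come from the locale lookup
        local_time = datetime(2012, 6, 26, 18, 30, tzinfo=user_zone)

        utc_time = local_time.astimezone(ZoneInfo("UTC"))  # store this value in the database
        for_display = utc_time.astimezone(user_zone)       # convert back for the UI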

    Read the article

  • Extremely Difficult Problem with ASP.Net 4.0 WebForms app using Routing

    - by dudeNumber4
    I have a completed app running in a QA environment. Everything works fine under most circumstances. If you hit a plain URL (no identifying information in the URL), you see an intro page with a button (generated by an asp LinkButton control) that posts back and directs you to another page. The markup looks the same when it fails and when it doesn't. When such a URL is followed from, e.g., Word and the default browser is IE, the intro page loads fine, but clicking the button causes an error. When not debugging, this behavior occurs every time. While debugging, the error occurs only ~ 1 in 10 times (closing the browser instance and starting over every time). When the error occurs, the intro page Page_Load fires and IsPostBack is false. Somehow, instead of a post, a get is being issued. When I run fiddler to try to analyze the actual calls (can't use firebug because it never happens using Firefox), everything works every time. I don't know whether this issue has anything to do with routing, and I've no idea even what to look at next. The strange thing is, when I debug, the intro page doesn't fully load every time. Only about 1 in 3 times does it fully load even if I've just cleared browser cache. When I run it through fiddler, it fully loads and works fine every time.

    Read the article

  • What's the best/easiest way to compare two times in Objective-C?

    - by Andy
    I've got a string representation of a time, like "11:13 AM." This was produced using an NSDateFormatter and the stringFromDate: method. I'd like to compare this time to the current time, but when I use the dateFromString: method to turn the string back into a date, a year, month and day are added - which I don't want. I just need to know whether right now is earlier or later than the time stored in the string. What's going to be the best way to handle that? Thanks in advance for your help.
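    The question is about Objective-C, but the usual trick — attach today's date to the parsed clock time before comparing — is language-neutral; a quick Python sketch of that idea:

        from datetime import datetime

        stored = "11:13 AM"
        # Parse only the clock time, then combine it with today's date so the
        # comparison is against a full datetime rather than a default date.
        stored_today = datetime.combine(datetime.now().date(),
                                        datetime.strptime(stored, "%I:%M %p").time())
        print("now is earlier" if datetime.now() < stored_today else "now is later or equal")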

    Read the article

  • Suggestions for opening the Rails toolbox to design a challenge game?

    - by keruilin
    How would you suggest designing a challenge system as part of a food-eating game so that it's as automated as possible? All RoR tools, design patterns and logic are at your disposal (e.g., admin consoles, crontab, arch, etc.). Prize goes to whoever can suggest the simplest and most-automated design! Here are the requirements:

    - User has many challenges.
    - Badge has many challenges. (A unique badge is awarded for each challenge won.)
    - Only one challenge can run at a time.
    - Each challenge has a limited number of days that it runs. For example, one challenge can run 3 days, while another runs 7 days.
    - Challenges can be seasonal. For example, "Eat 13 Pumpkins" only runs during the Fall.
    - New challenges are added to the game on an ongoing basis. For example, a new challenge every week.
    - Each challenge has a certain probability of being selected to run. For example, the "Eat 10 Pies" challenge has a 10% chance of being selected to run. As each new challenge is added to the database, I want the probabilities of running to change dynamically. I want to avoid the scenario where I'm manually updating a database field just to change the probability from 10% to 5%, for example.
    - Challenges act like Easter eggs. Challenge icons pop up at different places on the webpage.
    - User is awarded a badge for successfully completing a challenge, but only when it's active.
    - There is some wait time between each challenge: between 1 and 7 days. The wait time is random, but the probability of the wait time being short is high and the probability of it being a long wait time is low.
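    The question is Rails-specific, but the "probabilities change dynamically as challenges are added" requirement usually reduces to storing a relative weight per challenge and normalizing at selection time; a language-agnostic sketch in Python of that weighted pick (challenge names and weights are invented):

        import random

        # Each challenge carries a relative weight instead of a fixed percentage;
        # the percentage is derived when a challenge is picked, so adding or
        # removing rows rebalances everything with no manual updates.
        challenges = [
            {"name": "Eat 10 Pies",     "weight": 10, "season": None},
            {"name": "Eat 13 Pumpkins", "weight": 5,  "season": "fall"},
            {"name": "Eat 3 Turnips",   "weight": 20, "season": None},
        ]

        def pick_challenge(current_season):
            eligible = [c for c in challenges
                        if c["season"] in (None, current_season)]
            roll = random.uniform(0, sum(c["weight"] for c in eligible))
            for c in eligible:
                roll -= c["weight"]
                if roll <= 0:
                    return c
            return eligible[-1]  # guard against floating-point edge cases

        print(pick_challenge("fall"))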

    Read the article

  • Preventing dictionary attacks on a web application

    - by Kevin Pang
    What's the best way to prevent a dictionary attack? I've thought up several implementations, but they all seem to have some flaw in them:

    - Lock out a user after X failed login attempts. Problem: easy to turn into a denial-of-service attack, locking out many users in a short amount of time.
    - Incrementally increase response time per failed login attempt on a username. Problem: dictionary attacks might use the same password but different usernames.
    - Incrementally increase response time per failed login attempt from an IP address. Problem: easy to get around by spoofing the IP address.
    - Incrementally increase response time per failed login attempt within a session. Problem: easy to get around by creating a dictionary attack that fires up a new session on each attempt.
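    For what it's worth, a minimal sketch of the "incremental response time" idea keyed on the username-and-source pair rather than either one alone (the in-memory dict and function names are purely illustrative; a real deployment would need shared storage and expiry):

        import time

        failed_attempts = {}  # (username, source_ip) -> consecutive failure count

        def throttled_login(username, source_ip, password_ok):
            key = (username, source_ip)
            failures = failed_attempts.get(key, 0)
            # Exponential backoff: 1s, 3s, 7s, ... capped at 30s after a failure.
            if failures:
                time.sleep(min(2 ** failures - 1, 30))
            if password_ok:
                failed_attempts.pop(key, None)
                return True
            failed_attempts[key] = failures + 1
            return False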

    Read the article

  • Understanding omission failure in distributed systems

    - by karthik A
    The following text says something I can't quite agree with: client C sends a request R to server S. The time taken by a communication link to transport R over the link is D. P is the maximum time needed by S to receive, process and reply to R. If omission failure is assumed, then if no reply to R is received within 2(D+P), C will never receive a reply to R. Why is the time here 2(D+P)? As I understand it, shouldn't it be 2D + P?

    Read the article

  • Making alarm clock with NSTimer

    - by Alex G
    I just went through trying to make an alarm clock app with local notifications, but that didn't do the trick because I needed an alert to appear instead of it going into Notification Center in iOS 5+. So far I've been struggling greatly with NSTimer and its functions, so I was wondering if anyone could help out. I want an alert to be displayed at exactly the time the user selects (through a UIDatePicker, time only). I have figured out how to get the time from the UIDatePicker, but I do not know how to properly set the firing of the NSTimer. This is what I have attempted so far; if anyone could help, it would be much appreciated. Thank you. Example (it keeps going into the function every second as opposed to at the certain time I told it to... not what I want):

        NSDate *timestamp;
        NSDateComponents *comps = [[[NSDateComponents alloc] init] autorelease];
        [comps setHour:2];
        [comps setMinute:8];
        timestamp = [[NSCalendar currentCalendar] dateFromComponents:comps];

        NSTimer *f = [[NSTimer alloc] initWithFireDate:timestamp
                                              interval:0
                                                target:self
                                              selector:@selector(test)
                                              userInfo:nil
                                               repeats:YES];
        NSRunLoop *runner = [NSRunLoop currentRunLoop];
        [runner addTimer:f forMode:NSDefaultRunLoopMode];

    Read the article

  • Many users, many CPUs, no delays. Good for cloud?

    - by Eric
    I wish to set up a CPU-intensive, time-sensitive query service for users on the internet. A usage scenario is described below. Is cloud computing the right way to go for such an implementation? If so, which cloud vendor(s) cater to this type of application? I ask specifically in terms of:

    1) Pricing.
    2) Latency resulting from:
       - slow CPUs, instance creation, JIT compiles, etc.
       - internal management and communication of processes inside the cloud (e.g. a queuing process and a calculation process)
       - communication between cloud and end user
    3) Ease of deployment.

    The usage scenario I am expecting:

    - A typical user sends a query (XML of size around 1K) once every 30 seconds on average.
    - Each query requires a numerical computation of average time 0.2 sec and max time 1 sec on a 1 GHz Pentium. The computation requires no data other than the query itself and is performed by the same piece of code each time.
    - The delay a user experiences between sending a query and receiving a response should be on average no more than 2 seconds and in general no more than 5 seconds.
    - A background save of the response to a DB should occur (not time critical).
    - There can be up to 30,000 simultaneous users - i.e., on average 1000 queries a second, each requiring an average 0.2 sec calculation, so that would necessitate around 200 CPUs.

    Currently I'm looking at GAE Java (for quicker deployment and less IT hassle) and EC2 (speed and price optimization) as options. Where can I learn more about the right way to set up such a system? Past threads, different blogs, books, etc.? BTW, if my terminology is wrong or confusing, please let me know. I'd greatly appreciate any help.

    Read the article

  • python filter can't output

    - by Jesse Siu
    I created a filter in Python for a log file with lines like:

        Sat Jun 2 03:32:13 2012 [pid 12461] CONNECT: Client "66.249.68.236"
        Sat Jun 2 03:32:13 2012 [pid 12460] [ftp] OK LOGIN: Client "66.249.68.236", anon password "[email protected]"
        Sat Jun 2 03:32:14 2012 [pid 12462] [ftp] OK DOWNLOAD: Client "66.249.68.236", "/pub/10.5524/100001_101000/100022/readme.txt", 451 bytes, 1.39Kbyte/sec

    The script is:

        import time

        lines = []
        f = open("/opt/CLiMB/Storage1/log/vsftp.log")
        line = f.readline()
        lines = [line for line in f]

        def OnlyRecent(line):
            if time.strptime(line.split("[")[0].strip(), "%a %b %d %H:%M:%S %Y") < time.time() - (60*60*24*2):
                return True
            return False

        print "\n".join(filter(OnlyRecent, lines))
        f.close()

    But when I run this script, it keeps running and doesn't show anything until I stop it. Why doesn't it show records from the last 2 days?
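    For what it's worth, the predicate compares a struct_time (from strptime) against a float (from time.time()); a minimal sketch of one way it could be written so both sides are epoch seconds, assuming the same log format:

        import time

        TWO_DAYS = 60 * 60 * 24 * 2

        def only_recent(line):
            # Convert the parsed timestamp to epoch seconds before comparing.
            stamp = time.mktime(time.strptime(line.split("[")[0].strip(),
                                              "%a %b %d %H:%M:%S %Y"))
            return stamp > time.time() - TWO_DAYS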

    Read the article

  • Searching for duplicate records within a text file where the duplicate is determined by only two fields

    - by plg
    First, Python newbie; be patient/kind. Next, once a month I receive a large text file (think 7 million records) to test for duplicate values. This is catalog information. I get 7 fields, but the two I'm interested in are a supplier code and a full orderable part number. To determine if the record is duplicated, I compress all special characters from the part number (except . and #) and create a compressed part number. The test for duplicates becomes the supplier code and compressed part number combination. This part is fairly straightforward. Currently, I am just copying the original file with 2 new columns (compressed part and duplicate indicator). If the part is a duplicate, I put a "YES" in the last field. Now that this is done, I want to be able to go back (or better yet, at the same time) to get the previous record where there was a supplier code/compressed part number match. So far, my code looks like this (compress the full part to a compressed part and check for duplicates on the supplier code and compressed part combination):

        import sys
        import re
        import time

        start = time.time()

        try:
            file1 = open("C:\Accounting\May Accounting\May.txt", "r")
        except IOError:
            print sys.stderr, "Cannot Open Read File"
            sys.exit(1)

        try:
            file2 = open(file1.name[0:len(file1.name)-4] + "_" + "COMPRESSPN.txt", "a")
        except IOError:
            print sys.stderr, "Cannot Open Write File"
            sys.exit(1)

        hdrList = "CIGSUPPLIER|FULL_PART|PART_STATUS|ALIAS_FLAG|ACQUISITION_FLAG|COMPRESSED_PART|DUPLICATE_INDICATOR"
        file2.write(hdrList + chr(10))

        lines_seen = set()
        affirm = "YES"
        records = file1.readlines()

        for record in records:
            fields = record.split(chr(124))
            if fields[0] == "CIGSupplier":
                continue  # If incoming file has a header line, skip it
            file2.write(fields[0] + "|")   # Supplier Code
            file2.write(fields[1] + "|")   # Full_Part
            file2.write(fields[2] + "|")   # Part Status
            file2.write(fields[3] + "|")   # Alias Flag
            file2.write(re.sub("[$\r\n]", "", fields[4]) + "|")   # Acquisition Flag
            file2.write(re.sub("[^0-9a-zA-Z.#]", "", fields[1]) + "|")   # Compressed_Part
            dupechk = fields[0] + "|" + re.sub("[^0-9a-zA-Z.#]", "", fields[1])
            if dupechk not in lines_seen:
                file2.write(chr(10))
                lines_seen.add(dupechk)
            else:
                file2.write(affirm + chr(10))

        print "it took", time.time() - start, "seconds."

        file2.close()
        file1.close()

    It runs in less than 6 minutes, so I am happy with this part, even if it is not elegant. Right now, when I get my results, I import them into Access and do a self join to locate the duplicates. Loading/querying/exporting a file this size in Access takes around an hour, so I would like to be able to export the matched duplicates to another text file or an Excel file. Confusing enough? Thanks.
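    One way to also capture the earlier matching record in the same pass, sketched with a dict that maps each supplier/compressed-part key to the first line where it appeared (the sample records and field layout here are illustrative):

        import re

        # In the real script these would come from file1.readlines().
        records = [
            "ACME|AB-123/X|ACTIVE|N|N\n",
            "ACME|AB123X|ACTIVE|N|N\n",   # same supplier + compressed part -> duplicate
        ]

        first_seen = {}   # key -> (line_number, record) of the first occurrence
        duplicates = []   # (first_line, first_record, dup_line, dup_record)

        for line_number, record in enumerate(records, 1):
            fields = record.split("|")
            if fields[0] == "CIGSupplier":
                continue
            key = fields[0] + "|" + re.sub("[^0-9a-zA-Z.#]", "", fields[1])
            if key in first_seen:
                duplicates.append(first_seen[key] + (line_number, record))
            else:
                first_seen[key] = (line_number, record)

        # 'duplicates' now pairs every duplicate with the record it first matched,
        # ready to be written straight to a text file without the Access self-join.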

    Read the article

  • Convert a form_tag select_datetime to SQL datetime

    - by Mitchell
    Hi, I am trying to make a simple search form that uses a startTime and endTime to specify a time range. The DB has a datetime field, time, that is compared against. So far, when I try to use params[:startTime] in the controller, I get an array of values which won't work with :conditions => ['time < ?', params[:endTime]]. Is there a simple solution to parse the form's datetime into an SQL datetime?

    Read the article

  • Why can't I rename a data frame column inside a list?

    - by Moreno Garcia
    I would like to rename some columns from CPU_Usage to the process name before I merge the dataframes in order to make it more legible.

        names(byProcess[[1]])
        # [1] "Time"      "CPU_Usage"
        names(byProcess[1])
        # [1] "CcmExec_3344"
        names(byProcess[[1]][2]) <- names(byProcess[1])
        names(byProcess[[1]][2])
        # [1] "CPU_Usage"
        names(byProcess[[1]][2]) <- 'test'
        names(byProcess[[1]][2])
        # [1] "CPU_Usage"
        lapply(byProcess, names)
        # $CcmExec_3344
        # [1] "Time"      "CPU_Usage"
        #
        # ... (removed several entries to make it more readable)
        #
        # $wrapper_1604
        # [1] "Time"      "CPU_Usage"

    Read the article

  • Profiling and Graphics.updateDisplay0()

    - by AD
    I am profiling my BlackBerry application (a game) with the options 'In method only' and 'Time including native method'. Under the 'All methods' node (after a decent run of the game), the single most time-consuming function is Graphics.updateDisplay0(), with over 50% of the run time spent there. I don't quite know what the purpose of that function is, or whether I can do anything to reduce its processing time, so if anyone can shed some light on the matter, I would be grateful. From what I understand of the 'In method only' option, it means there is some processing within that function itself. P.S. There is no information on that function in the documentation.

    Read the article

  • Setting timeout for embedded Lua

    - by skyeagle
    I have embedded Lua in a C/C++ application. I want to be able to set a timeout value to avoid getting trapped by badly written scripts that result in infinite loops (or even string searches that take a practically infinite time to complete). Basically, I want to set a time interval, and if the script fails to finish running by the end of that interval, I want to be able to kill the Lua script engine (gracefully, if possible). Does anyone know of a best-practice way to do this?

    Read the article

  • How do I find the install directory of a Windows Service, using C#?

    - by endian
    I'm pretty sure that a Windows service gets C:\winnt (or similar) as its working directory when installed using InstallUtil.exe. Is there any way I can access, or otherwise capture (at install time), the directory from which the service was originally installed? At the moment I'm manually entering that into the app.exe.config file, but that's horribly manual and feels like a hack. Is there a programmatic way, either at run time or install time, to determine where the service was installed from?

    Read the article
