Search Results

Search found 7077 results on 284 pages for 'concurrent processing'.


  • Web Search API which allows automated queries

    - by Spi1988
    I need to develop a Java desktop application which sends queries to a search engine in order to obtain only the very first, highest-ranked pages (for example, the first 4 pages only). Some heavy processing needs to be performed on the retrieved pages, so the time between one query and the next won't be less than a minute. I would like to know whether there is any web search API for Java suitable for my situation, i.e. one which allows the use of automated queries (since in my case the queries are generated programmatically, and not through user interaction). I have checked Google's AJAX Search API and also Yahoo's Search BOSS, however they only allow queries triggered by direct user interaction.
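
    One option is a search API that is explicitly designed for programmatic use: a JSON-over-HTTP search service keyed by an API key (Google's Custom Search and Bing's search APIs work this way, within a query quota). A minimal sketch of issuing such a query from Java; the endpoint, parameters and class names here are purely illustrative placeholders:

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.net.URLEncoder;

        public class SearchClient {
            // hypothetical endpoint -- substitute the real search API URL and key parameters
            private static final String ENDPOINT = "https://api.example-search.com/v1/search";

            public static String queryTopResults(String terms) throws Exception {
                URL url = new URL(ENDPOINT + "?q=" + URLEncoder.encode(terms, "UTF-8") + "&count=40");
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestProperty("User-Agent", "my-research-client/1.0");  // identify the client
                StringBuilder body = new StringBuilder();
                try (BufferedReader in = new BufferedReader(
                        new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                    for (String line; (line = in.readLine()) != null; ) {
                        body.append(line);
                    }
                }
                return body.toString();   // JSON (or XML) listing the top-ranked result URLs
            }
        }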

    Read the article

  • How to stop the execution of a Java program from the command line?

    - by Aakash
    My main field is .NET, but recently I have had to do something in Java. I have to create a shell utility in Java that runs in the background, reads a few database records after a specified interval and does further processing. It's a kind of scheduler. Now I have a few concerns: How do I make this work as a service? I want to execute it through a shell script and the utility should start running; of course the control should get back to the calling script. Secondly, eventually I may want to stop this process from running. How do I achieve this? I understand these are basic questions but I really have no idea where to begin and what options are best for me. Any help / advice please?
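
    A common pattern for this (a sketch, not the only option): the shell script starts the JVM in the background with nohup java -jar scheduler.jar & (the jar name is a placeholder) and saves $! to a PID file; a later stop script runs kill on that PID. The SIGTERM from kill runs the JVM's shutdown hooks, which gives the Java side a chance to finish cleanly. A minimal sketch of the Java side; class and method names are hypothetical:

        public class Scheduler {
            private static volatile boolean running = true;

            public static void main(String[] args) {
                final Thread worker = Thread.currentThread();
                // `kill <pid>` (SIGTERM) triggers shutdown hooks before the JVM exits;
                // this hook asks the polling loop to stop and waits briefly for it
                Runtime.getRuntime().addShutdownHook(new Thread(() -> {
                    running = false;
                    worker.interrupt();
                    try { worker.join(10_000); } catch (InterruptedException ignored) { }
                }));

                while (running) {
                    pollDatabaseAndProcess();            // hypothetical: read due records, process them
                    try { Thread.sleep(60_000); }        // the configured polling interval
                    catch (InterruptedException e) { /* woken up for shutdown */ }
                }
                // close connections / flush state here before the JVM exits
            }

            private static void pollDatabaseAndProcess() { /* ... */ }
        }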

    Read the article

  • SSI not producing output, not giving error either.

    - by bstullkid
    In the HTML file:

        <!--#exec cgi="/cgi-bin/test.pl"-->

    The Perl script:

        #!/usr/bin/perl
        print "Content-Type: text/html\n\n";
        print "<input type=\"hidden\" name=\"aname\" value=\"avalue\">\n";
        print "<img src=\"/cgi-bin/script.pl\" />";

    This does not give me an 'error processing directive' error, nor does it output my HTML in place of the tag. I'll also add that the SSI tag gets replaced with nothing.
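
    For #exec cgi to do anything, the page has to be parsed for server-side includes at all, and Includes (rather than IncludesNOEXEC, which blocks #exec) must be allowed for that directory; a plain .html file is usually not parsed unless a filter or XBitHack is configured for it. A minimal Apache sketch, assuming the page is served as .shtml:

        # in the relevant <Directory> block or in .htaccess (requires AllowOverride Options)
        Options +Includes
        AddType text/html .shtml
        AddOutputFilter INCLUDES .shtml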

    Read the article

  • regex and javascript, some matches disappear!

    - by dader51
    Here is the code:

        var reg = new RegExp(" hel.lo ", 'g');
        var str = " helalo helblo helclo heldlo ";
        var mat = str.match(reg);
        alert(mat);

    It alerts "helalo, helclo", but I expect it to be "helalo, helblo, helclo, heldlo". Only half of them match; I guess that's because of the space, which counts only once. So I tried to double every space before processing, but in some cases that's not enough. I'm looking for an explanation, and a solution. Thanks
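
    The guess is right: each match consumes both its leading and trailing space, so the trailing space is no longer available as the leading space of the next candidate, and every other occurrence is skipped. One fix is to assert the trailing space with a zero-width lookahead instead of consuming it; a small sketch:

        // the lookahead checks for the trailing space without consuming it,
        // so it can still serve as the leading space of the next match
        var reg = / hel.lo(?= )/g;
        var str = " helalo helblo helclo heldlo ";
        alert(str.match(reg));   // " helalo, helblo, helclo, heldlo"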

    Read the article

  • java.awt.HeadlessException thrown from HeadlessGraphicsEnvironment.getDefaultScreenDevice

    - by Omry
    I need to do some image processing on a Java server (Debian with java version "1.6.0_12"), and I am receiving java.awt.HeadlessException from my code:

        java.awt.HeadlessException
            at sun.java2d.HeadlessGraphicsEnvironment.getDefaultScreenDevice(HeadlessGraphicsEnvironment.java:64)
            at WaxOn.getDefaultConfiguration(WaxOn.java:341)

    Even when java.awt.headless is set to true (as evident by this code printing so):

        if (!java.awt.GraphicsEnvironment.isHeadless()) {
            logger.warn("Headless mode is not enabled");
        } else {
            logger.info("Headless mode");
        }

    This is the code that throws the exception:

        public static GraphicsConfiguration getDefaultConfiguration() {
            GraphicsEnvironment ge = GraphicsEnvironment.getLocalGraphicsEnvironment();
            GraphicsDevice gd = ge.getDefaultScreenDevice();
            return gd.getDefaultConfiguration();
        }

    Any idea how to solve this?
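
    getDefaultScreenDevice() is documented to throw HeadlessException whenever headless mode is on, so this is expected even though the flag is set. For pure image processing the GraphicsConfiguration can be taken from an offscreen BufferedImage instead of from a screen device; a sketch of a headless-safe replacement:

        import java.awt.Graphics2D;
        import java.awt.GraphicsConfiguration;
        import java.awt.image.BufferedImage;

        public static GraphicsConfiguration getDefaultConfiguration() {
            // never touches a screen device, so it also works with java.awt.headless=true
            BufferedImage scratch = new BufferedImage(1, 1, BufferedImage.TYPE_INT_ARGB);
            Graphics2D g = scratch.createGraphics();
            try {
                return g.getDeviceConfiguration();
            } finally {
                g.dispose();
            }
        }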

    Read the article

  • How to query a CGI based webserver from an app written in MFC (MSVC 2008) and process the result?

    - by shan23
    Hi, I am exploring the option of querying a web page which has a CGI script running on its end with a search string (say in the form of http://sw.mycompany.com/~tools/cgi-bin/script.cgi?param1=value1&param2=value2&param3=value3), and displaying the result in my app (after due processing, of course). My app is written in MFC C++, and I must confess that I have never attempted anything related to network programming before. Is what I'm trying to do very infeasible? If not, could anyone point me at the resources I need to look at in order to go about this? Thanks!
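
    From the app's point of view the CGI URL is just an HTTP GET, and MFC's WinInet wrappers can fetch it in a few lines. A minimal sketch (error handling trimmed; the URL is whatever the script.cgi address above ends up being):

        #include <afxinet.h>

        CString FetchUrl(const CString& url)
        {
            CInternetSession session(_T("MyMfcClient"));
            CString result, line;
            try {
                // OpenURL returns a CStdioFile* that reads the HTTP response body
                CStdioFile* pFile = session.OpenURL(url);
                while (pFile->ReadString(line))
                    result += line + _T("\n");
                pFile->Close();
                delete pFile;
            } catch (CInternetException* e) {
                e->Delete();          // log / report the failure as appropriate
            }
            session.Close();
            return result;            // raw response text, ready for parsing
        }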

    Read the article

  • PowerShell: Read text, regex sort, write output to file and formatting

    - by Bill Hunter
    I am a PowerShell novice and have run into a challenge in reading, sorting, and outputting a CSV file. The input CSV has no headers; the data is as follows:

        05/25/2010,18:48:33,Stop,a1usak,10.128.212.212
        05/25/2010,18:48:36,Start,q2uhal,10.136.198.231
        05/25/2010,18:48:09,Stop,s0upxb,10.136.198.231

    I use the following piping construct to read the file, sort and output to a file:

        (Get-Content d:\vpnData\u62gvpn2.csv) |
          %{ ,[regex]::Split($_, ",") } |
          sort @{Expression={$_[3]}}, @{Expression={$_[1]}} |
          out-file d:\vpnData\u62gvpn3.csv

    The new file is written with the following format:

        05/25/2010 07:41:57 Stop a0uaar 10.128.196.160
        05/25/2010 12:24:24 Start a0uaar 10.136.199.51
        05/25/2010 20:00:56 Stop a0uaar 10.136.199.51

    What I would like to see in the output file is a format similar to the original input file, with comma delimiters:

        05/25/2010,07:41:57,Stop,a0uaar,10.128.196.160
        05/25/2010,12:24:24,Start,a0uaar,10.136.199.51
        05/25/2010,20:00:56,Stop,a0uaar,10.136.199.51

    But I can't quite seem to get there. I'm almost of the mind that I'll have to write another segment to read the newly produced file and reset its contents to the preferred format for further processing. Thoughts?
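
    No second pass should be needed: the space-separated lines come from Out-File rendering each sorted record (a string array) with its default separator. Re-joining each array with commas before writing keeps the CSV layout. A small sketch built on the same pipeline (the -join operator assumes PowerShell 2.0 or later):

        (Get-Content d:\vpnData\u62gvpn2.csv) |
          ForEach-Object { ,[regex]::Split($_, ",") } |
          Sort-Object @{Expression={$_[3]}}, @{Expression={$_[1]}} |
          ForEach-Object { $_ -join "," } |     # reassemble each sorted record with commas
          Out-File d:\vpnData\u62gvpn3.csv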

    Read the article

  • Server Emulator Design Pattern

    - by adisembiring
    I want to build a server socket emulator, and I want to apply some design patterns to it. I'll describe my case study, which I have simplified as follows: my server socket always listens for client connections. When a request message comes in from the client socket, the server emulator responds to the client through the socket. The response is a response code: '00' means the request was processed successfully, and any response code other than '00' means an error occurred while processing the request. The server also has a UI; the UI contains response parameters such as: response code, timeout interval. When the server responds to a client message, the response code is taken from the UI's response parameter, and the timeout interval from the UI is used to sleep the thread for that long before replying. I have implemented this, but I created it all in one class and I feel it sucks. Can you suggest what classes / interfaces I should create to refactor my code?
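
    One way to split this up (the question doesn't name a language, so Java and all names here are just an illustrative assumption): keep the socket handling in one class and pull the "what code do I answer with, and after what delay" decision behind a small interface that the UI configures. A sketch:

        // decides what to answer with; the UI only ever touches this object
        public interface ResponsePolicy {
            String responseCode();     // "00" for success, anything else for an error
            long   delayMillis();      // simulated processing time before replying
        }

        public final class UiConfiguredPolicy implements ResponsePolicy {
            private volatile String code = "00";
            private volatile long delay = 0;
            public void update(String code, long delay) {   // called by the UI controls
                this.code = code;
                this.delay = delay;
            }
            public String responseCode() { return code; }
            public long   delayMillis()  { return delay; }
        }

        // the socket side knows nothing about the UI, only about the policy
        public final class EmulatorHandler {
            private final ResponsePolicy policy;
            public EmulatorHandler(ResponsePolicy policy) { this.policy = policy; }

            public String handle(String request) throws InterruptedException {
                Thread.sleep(policy.delayMillis());   // honour the configured timeout interval
                return policy.responseCode();
            }
        }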

    Read the article

  • Automate download of BusinessObjects Web Intelligence reports

    - by Daren Thomas
    I'm tasked with automating the retrieval of a couple of BusinessObjects Web Intelligence reports and further processing thereof. I have no other means of access to this data (this was the first avenue I followed), so I will have to do some screen scraping. Alas, the interface seems user-only. Grr! Has anyone done this before? Like to share? Also, does anyone know of a good library for automating the web browser? I know there is a Python thingy out there that can be used for testing web applications - I need something in .NET though... What is your favorite? PS: I have also checked this thread (automate getting report from webpage), but am hoping for a Web Intelligence specific solution.

    Read the article

  • How can I get a Rails server to use the same database that cucumber uses during a test?

    - by James
    The cucumber test first makes an entry in the database and posts a form to a second server. This second server does some processing in background and then hits the first app (where the test is being run) with some data that the cucumber test needs to know about. I've tried running the main server via script/server and script/server -e test while the cucumber test is running, but I can't seem to force the server to use the same database that cucumber is using when it runs its step definitions. That is, when the second server pushes some data to a controller in the main server, the main server doesn't know about any entries that cucumber has made in the database. How can I get cucumber and the main server to use the same database?

    Read the article

  • Parallel.Foreach loop creating multiple db connections throws connection errors?

    - by shawn.mek
    Login failed. The login is from an untrusted domain and cannot be used with Windows authentication

    I wanted to get my code running in parallel, so I changed my foreach loop to a parallel foreach loop. It seemed simple enough. Each loop iteration connects to the database, looks up some stuff, performs some logic, adds some stuff, closes the connection. But I get the above error. I'm using my local SQL Server and Entity Framework (each loop uses its own context). Is there some problem with connecting multiple times using the same local login or something? How do I get around this? I have (before trying to convert to a Parallel.ForEach loop) split my list of objects that I am foreach looping through into four groups (separate CSV files) and run four concurrent instances of my program (which ran faster overall than just one, hence the idea for parallel). So it seems connecting to the db shouldn't be a problem? Any ideas?

    EDIT: Here's before:

        var gtgGenerator = new CustomGtgGenerator();
        var connectionString = ConfigurationManager.ConnectionStrings["BioEntities"].ConnectionString;
        var allAccessionsFromObs = _GetAccessionListFromDataFiles(collectionId);
        foreach (var cloneIdAndAccessions in allAccessionsFromObs)
            DoWork(gtgGenerator, taxonId, organismId, cloneIdAndAccessions, connectionString);

    and after:

        var gtgGenerator = new CustomGtgGenerator();
        var connectionString = ConfigurationManager.ConnectionStrings["BioEntities"].ConnectionString;
        var allAccessionsFromObs = _GetAccessionListFromDataFiles(collectionId);
        Parallel.ForEach(allAccessionsFromObs, cloneIdAndAccessions =>
            DoWork(gtgGenerator, taxonId, organismId, cloneIdAndAccessions, connectionString));

    Inside DoWork I use the BioEntities context:

        using (var bioEntities = new BioEntities(connectionString)) { ... }

    Read the article

  • Wait for tasks to complete in a thread pool

    - by Alien01
    Hello, I have created a thread pool in C++ which stores all tasks in a queue. The thread pool starts n threads, each of which takes a task from the queue, processes it and then removes it from the queue. Now I want to wait until all tasks are completed. Checking for an empty queue is not enough to detect completion: the queue can be empty while tasks that were already handed to threads are still being processed. I can't work out how to wait for the completion of all tasks. This is a design problem. Any suggestions?
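
    A common approach is to track the number of outstanding tasks (queued plus executing) under a mutex and signal a condition variable when it drops to zero; waiting on that condition is then independent of whether the queue happens to look empty. A sketch using C++11 primitives:

        #include <condition_variable>
        #include <cstddef>
        #include <mutex>

        class TaskTracker {
            std::mutex m_;
            std::condition_variable cv_;
            std::size_t pending_ = 0;      // tasks queued or currently executing
        public:
            void taskAdded() {             // call when a task is pushed onto the queue
                std::lock_guard<std::mutex> lock(m_);
                ++pending_;
            }
            void taskFinished() {          // call when a worker finishes a task
                std::lock_guard<std::mutex> lock(m_);
                if (--pending_ == 0)
                    cv_.notify_all();
            }
            void waitAll() {               // blocks until every added task has finished
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return pending_ == 0; });
            }
        };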

    Read the article

  • How to merge/crosslink Javadoc?

    - by Tom Wheeler
    If you have the standard Javadoc for a few different projects, how can you process them to create a single unified set of documentation in which everything is cross-linked? Ideally, the result would be similar to the documentation for the various modules in the NetBeans Platform: http://bits.netbeans.org/dev/javadoc/index.html but I've looked at their build scripts and they're predicated on you building everything from source. I'm looking for something which could also handle linking in Javadoc for third-party libraries, so I'd imagine it would need to be a post-processing operation. I can't be the first person to ever want this. Any ideas?
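
    If generating your own modules from source is acceptable, the javadoc tool itself can cross-link the result to externally hosted or third-party documentation: -link works when the other docs are reachable over HTTP at build time, and -linkoffline only needs a locally stored copy of the other set's package-list, so no third-party source is required. A command-line sketch; every path and URL here is a placeholder:

        javadoc -d build/apidocs \
                -sourcepath projectA/src:projectB/src \
                -subpackages com.example \
                -link https://docs.oracle.com/javase/8/docs/api/ \
                -linkoffline http://thirdparty.example.com/apidocs ./thirdparty-package-list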

    Read the article

  • glReadPixels and save to image

    - by Julius Petraška
    I have an app where the user drags and drops an image, and it is redrawn with OpenGL for some available processing. Everything works. When the user wants to save the image, it works like this: glReadPixels -> NSBitmapImageRep -> NSData -> write to file. This works too. Almost. With some images it does not work as it should: some .png and .jpg inputs come out corrupted after the open-and-save round trip (the example screenshots are omitted from this excerpt), while other images save correctly. Why is this happening?
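
    One frequent cause of "only some images save badly" with glReadPixels is row alignment: by default GL pads each pixel row it writes to client memory to a 4-byte boundary, so images whose row size isn't a multiple of 4 (for example odd widths read as GL_RGB) come out sheared or corrupted while others look fine. If that is the case here, a sketch of the fix:

        /* request tightly packed rows before reading the framebuffer */
        glPixelStorei(GL_PACK_ALIGNMENT, 1);
        glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels);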

    Read the article

  • MySQL and the LIMIT clause

    - by Lizard
    I was wondering if adding a LIMIT 1 to a query would speed up the processing? For example, I have a query that will most of the time return 1 result, but will occasionally return tens, hundreds or even thousands of records. But I will only ever want the first record. Would LIMIT 1 speed things up or make no difference? I know I could use GROUP BY to return 1 result, but that would just add more computation. Any thoughts gladly accepted! Thanks
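
    For a query without ORDER BY or grouping, LIMIT 1 generally does help: MySQL can stop scanning as soon as the first matching row is produced instead of materializing the whole result. A small sketch (the table and column names are made up):

        -- stops as soon as one matching row is found
        SELECT 1 FROM orders WHERE customer_id = 42 LIMIT 1;

        -- or ask for a 1/0 existence flag directly
        SELECT EXISTS(SELECT 1 FROM orders WHERE customer_id = 42);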

    Read the article

  • Algorithm to match natural text in mail

    - by snøreven
    I need to separate natural, coherent text/sentences in emails from lists, signatures, greetings and so on before further processing. Example:

        Hi tom,
        last monday we did bla bla, lore Lorem ipsum dolor sit amet, consectetur adipisici elit,
        sed eiusmod tempor incidunt ut labore et dolore magna aliqua.
        list item 2
        list item 3
        list item 3
        Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquid x ea
        commodi consequat. Quis aute iure reprehenderit in voluptate velit
        regards, K.
        ---line-of-funny-characters-#######
        example inc. 33 evil street, london
        mobile: 00 234534/234345

    Ideally the algorithm would match only the natural-text passages (the parts shown in bold in the original post). Is there any recommended approach - or are there even existing algorithms for that problem? Should I try approximate regular expressions or more statistical stuff based on the number of punctuation marks, length and so on?

    Read the article

  • The whole site is blocked while one page is waiting for blocking operation (PHP, ASP.NET). Why?

    - by artvolk
    Good day! I've found interesting behaviour on both the LAMP stack and ASP.NET. The scenario: there is a page performing a task that takes 2-3 minutes (making an HttpWebRequest for ASP.NET and a curl call for PHP). While this page is being processed, all other requests to this virtual host from the same browser are not processed (even if I use different browsers from one machine). I use two pages written in PHP and C#. I've tested with Apache+PHP in both mod_php and fast_cgi modes on Windows and Debian. For ASP.NET I use IIS6 (with a dedicated app pool for this site and with the default app pool) and IIS7 in integrated mode. I know that it is better to use async calls for such things, but I'm just curious why a single page blocks the entire site and not only the thread processing the request? Thanks in advance!
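
    A very common explanation for exactly this symptom, when the pages use sessions, is session locking: PHP's default file-based session handler and ASP.NET's session state both hold an exclusive lock on the session for the duration of a request, so every other request carrying the same session cookie queues behind the slow one (requests from other users are unaffected). If that is the cause, releasing the session before the long operation removes the blocking; a PHP sketch:

        <?php
        session_start();
        // read whatever this page needs from $_SESSION first
        session_write_close();    // releases the session lock; later requests with the
                                  // same session cookie are no longer serialized behind us
        // ... the 2-3 minute curl call goes here ...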

    Read the article

  • Virtual and Physical Memory / OutOfMemoryException

    - by user417518
    Hi, I am working on a 64-bit .NET Windows service application that essentially loads up a bunch of data for processing. While performing data volume testing, we were able to overwhelm the process and it threw an OutOfMemoryException (I do not have any performance statistics on the process when it failed). I have a hard time believing that the process requested a chunk of memory that would have exceeded the allowable address space for the process, since it's running on a 64-bit machine. I do know that the process is running on a machine that is consistently in the neighborhood of 80%-90% physical memory usage. My question is: can the CLR throw an OutOfMemoryException if the machine is critically low on available physical memory, even though the process wouldn't exceed its allowable amount of virtual memory? Thanks for your help!

    Read the article

  • Node & Redis: Crucial Design Issues in Production Mode

    - by Ali
    This question is a hybrid one, both technical and system-design related. I'm developing the backend of an application that will handle approx. 4K requests per second. We are using Node.js for speed, and in terms of our database structure we are using MongoDB, with Redis as a layer between Node and MongoDB handling volatile operations. I'm quite stressed because we are expecting concurrent requests that we need to handle carefully, and we are quite close to launch. However I do not believe I've applied the correct approach with Redis. I have a class Student, and students constantly change state (such as 'active', 'doing homework', 'in lesson', etc.), so I created a Redis DB for each state (1 for 'active', 2 for 'doing homework'). Below is the structure of the 'active' students table:

        xa5p - JSON stringified object #1
        pQrW - JSON stringified object #2
        active_student_table - {{studentId:'xa5p'}, {studentId:'pQrW'}}

    Since there is no 'select all keys' method in Redis, I've been advised to use a set, so that when I run the 'smembers' command I receive the keys and can later do a 'get' for each id in order to find specific users (say, those older than 15). I've also been told that I should never use KEYS in production. My question is: no matter how 'conceptual' it is, what specific things should I avoid doing in Node & Redis at the production stage? Are there any issues with my design? Students must be objects, and I could keep them in a list, but I haven't done that yet. Is that crucial at the production stage?
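
    One common way to model this, sketched as raw Redis commands (the key names are illustrative): keep one set per state holding student ids, move an id between sets on a state change, and store each student's fields in a hash rather than a stringified JSON blob, so individual attributes such as age can be read without parsing whole objects:

        SADD  students:active xa5p pQrW                  # ids currently in the 'active' state
        SMOVE students:active students:homework xa5p     # state change = move between sets
        HSET  student:xa5p age 15                        # one hash per student, one field per attribute
        SMEMBERS students:active                         # ids to fetch, instead of KEYS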

    Read the article

  • asp.net Configuration Error on host

    - by zey
    I uploaded my asp.net site to a hosting site, and the site browses correctly. But when I go to the login URL, it shows me this error:

        Configuration Error
        Description: An error occurred during the processing of a configuration file required to service
        this request. Please review the specific error details below and modify your configuration file
        appropriately.
        Parser Error Message: It is an error to use a section registered as
        allowDefinition='MachineToApplication' beyond application level. This error can be caused by a
        virtual directory not being configured as an application in IIS.

        Source Error:
        Line 23:   </assemblies>
        Line 24: </compilation>
        Line 25: <authentication mode="Forms">
        Line 26:   <forms loginUrl="~/Account/Login.aspx" timeout="2880" />
        Line 27: </authentication>

    How can I fix it?

    Read the article

  • JQuery make an array - how/what is best

    - by russp
    I have 4 serialized arrays that I want to pass to PHP for processing. What is the best way to combine them into a single array? Example:

        serial_1 = $('#col1').sortable('serialize');
        serial_2 = $('#col2').sortable('serialize');
        serial_3 = $('#col3').sortable('serialize');
        serial_4 = $('#col4').sortable('serialize');

    Each serialized array relates to a column/section of the page (col1, col2 etc.). What I need to do / would like to do is create a single array that puts the serialized arrays inside another array for a single post. Example:

        var new_array = serilaize(col_1(serial_1),col2(serial_2),col3,(serial_3),col4(serial_4))

    I KNOW THAT IS NOT RIGHT, as I have no idea how to write the correct syntax in jQuery. This new array is to be posted via ajax like this:

        $.ajax({
            url: "test.php",
            type: "post",
            data: new_array,
            error: function(){
                alert('SOME ERROR MESSAGE');
            }
        });

    Thanks in advance
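
    One straightforward option is to skip the wrapper array and post a plain object whose properties are the four serialized strings; jQuery turns the object into ordinary POST fields, so PHP receives each column's serialized string as $_POST['col1'] ... $_POST['col4'] (each can be unpacked with parse_str if needed). A sketch using the names from the question:

        var payload = {
            col1: $('#col1').sortable('serialize'),
            col2: $('#col2').sortable('serialize'),
            col3: $('#col3').sortable('serialize'),
            col4: $('#col4').sortable('serialize')
        };

        $.ajax({
            url: "test.php",
            type: "post",
            data: payload,                    // sent as col1..col4 form fields
            error: function () {
                alert('SOME ERROR MESSAGE');
            }
        });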

    Read the article

  • A good approach to db planning for a reporting service

    - by Itay Moav
    The scenario: big system (~200 tables), 60,000 users, and complex reports that will require me to run multiple queries per report - and even those will be complex queries with inner queries all over the place, plus some processing in PHP. I have seen an approach which I am not sure about: having one centralized, de-normalized table that registers any reportable activity in the system. This table would hold mostly foreign keys, so it should be fairly compact and fast. For example (my system is a virtual learning management system): a user enrolls in a course, and the table stores the user id, date, course id, organization id and activity type (enrollment). Of course I also store this data in a normalized DB, which the actual application uses. Pros I see: easy, maintainable queries and code to process the data, and fast retrieval. Cons: there is a danger of the de-normalized table getting out of sync with the real DB. Is this approach worth considering, or (preferably from experience) is it total $#%#%t?

    Read the article

  • Using delayed_job to process file uploads across multiple servers

    - by Steve Klabnik
    Does anyone have any good resources on how to do this? Basically, I'm working on a project (in Rails) where people can upload files. They might be big. I'd like to process them using delayed_job before sending them to S3. I'd also like to do this processing on a separate job queue server, rather than on the webserver itself. I'd rather not have to upload the files to the webserver, then transfer them to the job queue server, and then upload them to S3 if I don't have to. Thanks.

    Read the article

  • fetch only first row from oracle - is it faster?

    - by john
    My main goal with this question is optimization and faster run time. After doing a lot of processing in the stored proc, I finally return a count like below:

        OPEN cv_1 FOR
          SELECT COUNT(*) num_of_members
          FROM   HOUSEHOLD_MEMBER a, HOUSEHOLD b
          WHERE  RTRIM(LTRIM(a.mbr_last_name)) LIKE v_MBR_LAST_NAME || '%'
          AND    a.number = '01'
          AND    a.code = v_CODE
          AND    a.ssn_head = v_SSN_HEAD
          AND    TO_CHAR(a.mbr_dob, 'MM/DD/YYYY') = v_DOB;

    But the code that calls the SP does not need the actual count; it just cares whether the count is greater than 1. Question: how can I change this to return just 1 or 0 - 1 when the count is 0 and 0 when the count is 1? Will it be faster to do this rather than returning the whole count?
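
    Asking Oracle only whether a matching row exists lets it stop after the first hit instead of counting every match, which is usually faster when many rows qualify. A sketch that keeps the same predicates and returns a 1/0 flag (swap THEN/ELSE for whichever convention the caller expects):

        OPEN cv_1 FOR
          SELECT CASE WHEN EXISTS (
                   SELECT 1
                   FROM   HOUSEHOLD_MEMBER a, HOUSEHOLD b
                   WHERE  RTRIM(LTRIM(a.mbr_last_name)) LIKE v_MBR_LAST_NAME || '%'
                   AND    a.number = '01'
                   AND    a.code = v_CODE
                   AND    a.ssn_head = v_SSN_HEAD
                   AND    TO_CHAR(a.mbr_dob, 'MM/DD/YYYY') = v_DOB
                 ) THEN 1 ELSE 0 END AS has_members
          FROM   dual;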

    Read the article

  • Should XML be used server-side, and JSON client-side?

    - by Michel Carroll
    As a personal project, I'm making an AJAX chatroom application using XML as a server-side storage, and JSON for client-side processing. Here's how it works: AJAX Request gets sent to PHP using GET (chat messages/logins/logouts) PHP fetches/modifies the XML file on the server PHP encodes the XML into JSON, and sends back JSON response Javascript handles JSON information (chat messages/logins/logouts) I want to eventually make this a larger-scale chatroom application. Therefore, I want to make sure it's fast and efficient. Was this a bad design choice? In this case, is switching between XML and JSON ok, or is there a better way?

    Read the article
