Search Results

Search found 21063 results on 843 pages for 'stochastic process'.

Page 571/843 | < Previous Page | 567 568 569 570 571 572 573 574 575 576 577 578  | Next Page >

  • XMLHttpRequest cross-site request on same server but different port

    - by clamp
    Hello, using XMLHttpRequest it is not possible to open a connection to a document on a different domain than the one the page itself is hosted on. But what about different ports? For example, I have a webserver running on my machine listening on port 80, so the web address looks like this: http://localhost:80/mypage.html. I also have another webserver running on localhost which is meant to process the AJAX requests, but it listens on a different port. So the JavaScript in mypage.html would look like this:

        var xmlhttp = new XMLHttpRequest();
        xmlhttp.open("GET", "http://localhost:1234/?parameters", true);
        xmlhttp.send();

    Would this work? Or will it give a security exception as well?
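
    A browser treats a different port as a different origin, so by default this request is blocked just like a cross-domain one. A hedged sketch of one workaround, assuming the port-1234 service is something you control and happens to be an ASP.NET handler (the question doesn't say what actually serves that port), is to opt in via a CORS response header:

        // Hypothetical ASP.NET handler for the port-1234 service; registration in
        // web.config is omitted. Only an illustration of the CORS opt-in.
        using System.Web;

        public class AjaxHandler : IHttpHandler
        {
            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                // Without this header the browser refuses to hand the response
                // to script running on http://localhost:80 (port 80 is implicit).
                context.Response.AppendHeader("Access-Control-Allow-Origin", "http://localhost");
                context.Response.ContentType = "text/plain";
                context.Response.Write("result for: " + context.Request.Url.Query);
            }
        }

    Browsers that predate CORS won't honour the header; in that case proxying the AJAX endpoint through the port-80 site is the usual fallback.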

  • Keep an ASP.NET IIS website responsive when time between visits is long

    - by Abel
    After years of ASP.NET development I'm actually quite surprised that I can't seem to find a satisfying solution for this. Why does an IIS ASP.NET site always seem to fall asleep (for 2-6 seconds) after a certain period of inactivity (several hours), during which no HTTP response is sent from server to client? This happens on any type of site, one page or many, db or not, regardless of the settings. How can I fix this? During the wait time the server is not busy and there are no high peaks or (.NET) memory shortages. My guess is that it has to do with Windows moving the IIS process to the background and its memory to the page file, but I'm not sure. Does anybody have any idea? EDIT: one solution is to send some HTTP request once an hour or so, but I'm hoping for something more constructive. EDIT: what I meant is: after hours of inactivity, it pauses for several seconds on any new HTTP request.
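
    For what it's worth, the delay described here is typically the IIS application pool idle timeout (20 minutes by default) unloading the worker process, so the next request pays the full application start-up cost; raising or disabling that timeout in the application pool settings is the least hacky fix. If that isn't possible, a minimal self-ping sketch - essentially the "send a request every so often" idea from the EDIT - could look like the following; the URL and interval are assumptions:

        // Global.asax.cs -- hedged sketch of a keep-alive ping. As long as the
        // worker process is alive, its own pings count as traffic and reset the
        // idle timer; if the pool is recycled for other reasons the timer dies too.
        using System;
        using System.Net;
        using System.Threading;

        public class Global : System.Web.HttpApplication
        {
            private static Timer keepAlive;   // static so it isn't garbage-collected

            protected void Application_Start(object sender, EventArgs e)
            {
                keepAlive = new Timer(_ =>
                {
                    try
                    {
                        using (var client = new WebClient())
                        {
                            client.DownloadString("http://localhost/mypage.aspx"); // hypothetical URL
                        }
                    }
                    catch (Exception) { /* ignore -- this is only a keep-alive ping */ }
                }, null, TimeSpan.FromMinutes(10), TimeSpan.FromMinutes(10));
            }
        }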

  • [grails] attaching multiple files to a domain class

    - by Emyr
    I've seen various Grails plugins which allow easier handling of file uploads; however, these tend to support only a single file per form submit. I'd like a multi-attach form where, as soon as you pick one file, an extra field and button are added using JS (various sites do it like this). Do you know of any good plugins which provide elegant uploading of multiple files without excessive coding? A progress bar, either per file or for the whole process, would also be very nice. I don't know to what extent I can allow GORM to handle a java.io.File field (or in this case a Collection<File>).

  • Extracting Information from Images

    - by Khorkrak
    What are some fast and somewhat reliable ways to extract information about images? I've been tinkering with OpenCV, and so far it seems to be the best route, plus it has Python bindings. To be more specific, I'd like to determine what I can about what's in an image. For example, the Haar face-detection and full-body-detection classifiers are great - now I can tell that most likely there are faces and/or people in the image, as well as roughly how many. Okay, what else? How about whether there are any buildings and, if so, what they seem to be - huts, office buildings, etc.? Is there sky visible, grass, trees, and so forth? From what I've read about training classifiers to detect objects, it seems like a rather laborious process: 10,000 or so wrong images and 5,000 or so correct samples to train a classifier. I'm hoping there are some decent ones around already instead of having to do this all myself for a bunch of different objects - or is there some other way to go about this sort of thing?

  • HttpURLConnection getting locked

    - by Nayn
    Hi, I have a thread running under Tomcat which creates an HttpURLConnection and reads it through a BufferedInputStream. After fetching data for some URLs, it stalls. I got the jstack of the process, which says the HttpURLConnection is locked and the BufferedInputStream is also locked:

        "http-8080-1" daemon prio=10 tid=0x08683400 nid=0x79c9 runnable [0x8f618000]
           java.lang.Thread.State: RUNNABLE
            at java.net.SocketInputStream.socketRead0(Native Method)
            at java.net.SocketInputStream.read(SocketInputStream.java:129)
            at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
            at java.io.BufferedInputStream.read1(BufferedInputStream.java:258)
            at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
            - locked <0x956ef8c0> (a java.io.BufferedInputStream)
            at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:687)
            at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:632)
            at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1072)
            - locked <0x956ef910> (a sun.net.www.protocol.http.HttpURLConnection)

    Could somebody help here? Thanks

  • How well does Solr scale over a large number of facet values?

    - by Continuation
    I'm using Solr and I want to facet over a field "group". Since "group" is created by users, potentially there can be a huge number of values for "group". Would Solr be able to handle a use case like this? Or is Solr not really appropriate for facet fields with a large number of values? I understand that I can set facet.limit to restrict the number of values returned for a facet field. Would this help in my case? Say there are 100,000 matching values for "group" in a search and I set facet.limit to 50: would that speed up the query, or would the query still be slow because Solr still needs to process and sort through all the facet values to return the top 50? Any tips on how to tune Solr for a large number of facet values? Thanks.

  • Create SOAP message from WSDL using axiom

    - by code-gijoe
    Hi, I'm starting a project which consists of sending a request to a web service (which is already available) and parsing the response. I have the WSDL and endpoint URLs. Does anyone have a startup tutorial on how to build something from there? I would like to use Axis2 + Axiom to send the service request and receive and process the response. I'm using Eclipse as my dev environment. I've been searching for a tutorial on how to do this, but with no success. Any suggestion would be greatly appreciated!

  • Accessing HttpRequest from Global.asax via a page

    - by Polymorphix
    I'm trying to get a property (ImpersonatePersonId) from a page in Global.asax, but I get an HttpException saying 'Request is not available in this context'. I've been searching for documentation on where in the pipeline the request is accessible, but as far as I can see all Microsoft can produce is one-liners like "PostRequestHandlerExecute: Occurs when the ASP.NET event handler finishes execution", which really doesn't give me much... I've tried placing the call in both PreRequestHandlerExecute and PostRequestHandlerExecute, but with the same result. I wonder if anyone with experience in this would be so kind as to tell me where the Request object is available. My code from Global.asax is below:

        ICanImpersonate page = HttpContext.Current.Handler as ICanImpersonate;
        ImpersonatedUser impersonatePerson = page != null ? page.ImpersonatePersonId : null;
        Response.Filter = new TagRewriter(Response, new TagProcessor(Context, impersonatePerson).Process);

    What I want to do is rewrite some HTML based on some request parameters.
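
    A minimal sketch of one way to wire this up, keeping the code in PostRequestHandlerExecute but going through the HttpApplication instance that raised the event rather than the application's Request/Response properties (those properties throw exactly this HttpException whenever no request is active, for example in Application_Start). ICanImpersonate, ImpersonatedUser, TagRewriter and TagProcessor are the question's own types:

        // Global.asax.cs -- sketch only; assumes the page handler implements the
        // question's ICanImpersonate interface and that TagRewriter/TagProcessor
        // behave as in the original snippet.
        using System;
        using System.Web;

        public class Global : HttpApplication
        {
            protected void Application_PostRequestHandlerExecute(object sender, EventArgs e)
            {
                HttpApplication app = (HttpApplication)sender;
                HttpContext context = app.Context;   // request and response are valid here

                ICanImpersonate page = context.Handler as ICanImpersonate;
                ImpersonatedUser impersonatePerson =
                    page != null ? page.ImpersonatePersonId : null;

                context.Response.Filter = new TagRewriter(
                    context.Response,
                    new TagProcessor(context, impersonatePerson).Process);
            }
        }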

  • Git to SVN trouble

    - by Kevin
    My boss has a Perforce repository which he wants to make available as a read-only copy on SourceForge via Subversion. He had a Perl script which would do this, but it's no longer functioning (we don't want to try debugging it yet) and it really wasn't that great anyway. So an alternate solution is to pull the Perforce repo into git as a remote ref, which I have already done successfully (including all the proper commit details and authors). Now the trouble I'm having is pushing it out to a separate SVN repository. I can start the commit process with "git svn dcommit --add-author-from", but the problem is that even though the correct author appears at the end of the commit message, the "real" committing author is my machine's user. I want to preserve the real author with the commit, and I'd also like to preserve the original timestamps. Is anyone familiar with how I could accomplish this?

  • C++ Mock/Test boost::asio::io_service - based async handler

    - by rbellamy
    I've recently returned to C/C++ after years of C#. During those years I've found the value of mocking and unit testing. Finding resources for mocks and unit tests in C# is trivial; with respect to mocking in C++, not so much. I would like some guidance on what others do to mock and test async io_service handlers with Boost. For instance, in C# I would use a MemoryStream to mock an IO.Stream, and I'm assuming this is the path I should take here. Specifically I'm after:

        - C++ mock/test best practices
        - boost::asio::io_service mock/test best practices
        - C++ async handler mock/test best practices

    I've started the process with googlemock and googletest.

  • Database file is inexplicably locked during SQLite commit

    - by sweeney
    Hello, I'm performing a large number of INSERTs into a SQLite database, using just one thread. I batch the writes to improve performance and to have a bit of security in case of a crash. Basically I cache up a bunch of data in memory and then, when I deem appropriate, I loop over all of that data and perform the INSERTs. The code for this is shown below:

        public void Commit()
        {
            using (SQLiteConnection conn = new SQLiteConnection(this.connString))
            {
                conn.Open();
                using (SQLiteTransaction trans = conn.BeginTransaction())
                {
                    using (SQLiteCommand command = conn.CreateCommand())
                    {
                        command.CommandText = "INSERT OR IGNORE INTO [MY_TABLE] (col1, col2) VALUES (?,?)";
                        command.Parameters.Add(this.col1Param);
                        command.Parameters.Add(this.col2Param);

                        foreach (Data o in this.dataTemp)
                        {
                            this.col1Param.Value = o.Col1Prop;
                            this.col2Param.Value = o.Col2Prop;
                            command.ExecuteNonQuery();
                        }
                    }
                    this.TryHandleCommit(trans);
                }
                conn.Close();
            }
        }

    I now employ the following gimmick to get the thing to eventually work:

        private void TryHandleCommit(SQLiteTransaction trans)
        {
            try
            {
                trans.Commit();
            }
            catch (Exception e)
            {
                Console.WriteLine("Trying again...");
                this.TryHandleCommit(trans);
            }
        }

    I create my DB like so:

        public DataBase(String path)
        {
            // build connection string
            SQLiteConnectionStringBuilder connString = new SQLiteConnectionStringBuilder();
            connString.DataSource = path;
            connString.Version = 3;
            connString.DefaultTimeout = 5;
            connString.JournalMode = SQLiteJournalModeEnum.Persist;
            connString.UseUTF16Encoding = true;

            using (connection = new SQLiteConnection(connString.ToString()))
            {
                // check for existence of db
                FileInfo f = new FileInfo(path);
                if (!f.Exists) // build new blank db
                {
                    SQLiteConnection.CreateFile(path);
                    connection.Open();
                    using (SQLiteTransaction trans = connection.BeginTransaction())
                    {
                        using (SQLiteCommand command = connection.CreateCommand())
                        {
                            command.CommandText = DataBase.CREATE_MATCHES;
                            command.ExecuteNonQuery();
                            command.CommandText = DataBase.CREATE_STRING_DATA;
                            command.ExecuteNonQuery();
                            // TODO add logging
                        }
                        trans.Commit();
                    }
                    connection.Close();
                }
            }
        }

    I then export the connection string and use it to obtain new connections in different parts of the program. At seemingly random intervals, though at far too great a rate to ignore or otherwise work around, I get an unhandled SQLiteException: Database file is locked. This occurs when I attempt to commit the transaction. No errors seem to occur prior to then, and it does not always happen - sometimes the whole thing runs without a hitch. No reads are being performed on these files before the commits finish. I have the very latest SQLite binary. I'm compiling for .NET 2.0. I'm using VS 2008. The db is a local file. All of this activity is encapsulated within one thread / process. Virus protection is off (though I think that was only relevant if you were connecting over a network?). As per Scotsman's post I have implemented the following changes:

        - Journal mode set to Persist
        - DB files stored in C:\Docs + Settings\ApplicationData via System.Windows.Forms.Application.AppData windows call
        - No inner exception
        - Witnessed on two distinct machines (albeit very similar hardware and software)
        - Have been running Process Monitor - no extraneous processes are attaching themselves to the DB files - the problem is definitely in my code...

    Does anyone have any idea what's going on here? I know I just dropped a whole mess of code, but I've been trying to figure this out for way too long. My thanks to anyone who makes it to the end of this question!

    brian

    UPDATES: Thanks for the suggestions so far! I've implemented many of the suggested changes. I feel that we are getting closer to the answer... however... The code above technically works, but it is non-deterministic! It is not guaranteed to do anything aside from spin in neutral forever. In practice it seems to work somewhere between the 1st and 10th iteration. If I batch my commits at a reasonable interval the damage will be mitigated, but I really do not want to leave things in this state... More suggestions welcome!
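
    A side note on the retry "gimmick": the recursive TryHandleCommit never gives up, so a persistent lock turns into an infinite spin (and eventually a stack overflow). A hedged drop-in replacement that bounds the attempts and backs off between them is sketched below; it only softens the symptom - the lock itself usually points at some other connection, command or data reader on the same file that was never disposed. The attempt count and delay are arbitrary assumptions:

        // Drop-in replacement for TryHandleCommit in the DataBase class above.
        // Needs System.Data.SQLite (already referenced) and System.Threading.
        private void TryHandleCommit(SQLiteTransaction trans)
        {
            const int maxAttempts = 5;

            for (int attempt = 1; ; attempt++)
            {
                try
                {
                    trans.Commit();
                    return;
                }
                catch (SQLiteException)
                {
                    if (attempt >= maxAttempts)
                        throw;   // give up and surface the real error
                    Console.WriteLine("Database locked, retrying ({0}/{1})...", attempt, maxAttempts);
                    System.Threading.Thread.Sleep(200 * attempt);   // simple back-off
                }
            }
        }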

  • Setting application affinity in gdb

    - by Marcus Ahlberg
    Is there a simple way of setting the affinity of the application I'm debugging without locking gdb to the same core? The reason I'm asking is that the application runs with real-time priority and needs to run on a single core. At the moment I use this command line:

        taskset -c 3 gdbserver :1234 ./app.out

    but the application stops responding and freezes the gdb server, making debugging impossible. I suspect that the real-time priority of the application prevents gdb from executing. If I start the application and then start gdb without the affinity setting, I can attach and debug the application without gdb freezing. Is there a simple way to start gdb and the application with different affinities? Or preferably: is there a gdb command to set the affinity of the child process?

  • Umbrella websites with Microsites & Blogs - recommendations

    - by pingu
    Hi guys, I'm in the process of scoping a solution for an events organisation. Their main website features information about them and blog entries etc, but they hold major events which require microsites (domain.com/event). The microsites all have a different look-and-feel, different nav structure, and custom content-managed components. The one thing that's common across everything is the blog - it will have a category for each event under which users can post. How would you guys implement this? The solution needs to be PHP, and my initial thought was CodeIgniter but if possible I'd like to avoid building the blog functionality - so I guess I could integrate it with EE or Wordpress. If anyone has any other suggestions they would be most appreciated.

  • Dealing with a badly formatted CSV file

    - by Josh K
    I have an exceptionally bad CSV file. Although I "solved" the problem in the end by manually writing scripts to process and reprocess this specific file, I wanted to know if there were any other solutions out there. You have a CSV file that has all the fields terminated by | (pipe) characters. Running a quick check shows you that there are 53 fields in the file. The person who gave you the file claims there are only 28 fields. Not all of the fields have information in them. For example, there are five custom_field_{num} fields which may or may not have data. How would you get this into a database nicely? The ideal solution (and one I searched high and low for) would be to just throw it all into a table with no column names or specifications, then remove any columns that were completely blank and then give them titles and specifications.
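
    A minimal sketch of the "throw it all into a table, then drop the blank columns" idea, assuming a simple pipe-delimited file with no quoting or escaping; the generated col1..colN names and the file name are placeholders:

        // Sketch: read a pipe-delimited file into a DataTable with generated
        // column names, then drop columns that are blank in every row.
        using System;
        using System.Data;
        using System.IO;
        using System.Linq;

        class PipeCsvLoader
        {
            static DataTable Load(string path)
            {
                DataTable table = new DataTable();

                foreach (string line in File.ReadAllLines(path))
                {
                    string[] fields = line.Split('|');

                    // Grow the schema to the widest row seen so far (col1, col2, ...).
                    while (table.Columns.Count < fields.Length)
                        table.Columns.Add("col" + (table.Columns.Count + 1), typeof(string));

                    DataRow row = table.NewRow();
                    for (int i = 0; i < fields.Length; i++)
                        row[i] = fields[i];
                    table.Rows.Add(row);
                }

                // Remove columns that are empty (or DBNull) in every row.
                var emptyColumns = table.Columns.Cast<DataColumn>()
                    .Where(c => table.Rows.Cast<DataRow>()
                        .All(r => string.IsNullOrEmpty(r[c] as string)))
                    .ToList();
                foreach (DataColumn c in emptyColumns)
                    table.Columns.Remove(c);

                return table;
            }

            static void Main()
            {
                DataTable t = Load("input.csv");   // hypothetical file name
                Console.WriteLine("{0} columns survived", t.Columns.Count);
            }
        }

    The surviving columns can then be renamed and typed by hand before bulk-loading the DataTable into the real database table.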

  • Reading from a file not line-by-line

    - by MadH
    Assigning a QTextStream to a QFile and reading it line by line is easy and works fine, but I wonder if the performance can be increased by first storing the file in memory and then processing it line by line. Using FileMon from Sysinternals, I've found that the file is read in chunks of 16KB, and since the files I have to process are not that big (~2MB, but many!), loading them into memory would be a nice thing to try. Any ideas how I can do so? QFile is inherited from QIODevice, which allows me to readAll() it into a QByteArray, but how do I proceed then and divide it into lines?

  • When to override OnError?

    - by Ek0nomik
    I'm looking into reworking and simplifying our error handling in an application I support. We currently have all of our pages inheriting from a base class we created, which in turn obviously inherits from System.Web.UI.Page. Within this base class the OnError method is currently being overridden; it in turn calls MyBase.OnError and then one of our custom logging methods. I don't see any benefit to overriding the OnError method; I think it would be better to let the Application_Error method in the Global.asax take care of the unhandled exception (logging it) and then let the customErrors section in the config trigger a process to redirect the user. Looking online, it seems people override this method quite frequently, but I don't see a need to, and this article from MSDN makes me think the same.
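
    For reference, a hedged sketch of what the Global.asax-only approach could look like; MyLogger stands in for the custom logging method, and because Server.ClearError() is never called the <customErrors> redirect in web.config still applies after logging:

        // Global.asax.cs -- sketch of centralised error handling.
        using System;
        using System.Web;

        public class Global : HttpApplication
        {
            protected void Application_Error(object sender, EventArgs e)
            {
                Exception ex = Server.GetLastError();
                if (ex is HttpUnhandledException && ex.InnerException != null)
                    ex = ex.InnerException;   // unwrap the page-level wrapper

                MyLogger.Log(ex);             // hypothetical stand-in for the custom logger

                // No Server.ClearError() here, so the <customErrors> section
                // still handles the redirect after the exception is logged.
            }
        }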

  • Xcode 3.2 says the version of iPhone OS isn't supported when my iPhone OS is version 3.1.3

    - by fr0man
    I just went through the whole certificate/keychain/provisioning/appID/profile/DNA test process to get my app running on my iPhone. It turns out my iPhone OS was out of date (3.0.1, I think), so I updated it. Now Xcode says:

        The version of iPhone OS on “Stefanie's phone” does not match any of the versions of iPhone OS supported for development with this copy of Xcode. Please restore the device to a version of the OS listed below. If necessary, the latest version of Xcode is available here.

        OS Installed on Stefanie's phone: 3.1.3 (7E18)

        Xcode Supported iPhone OS Versions: 3.1.2 (7D11), 3.1.1 (7C146), 3.1.1 (7C145), 3.1 (7C144), 3.0.1 (7A400), 3.0, 2.2.1, 2.2, 2.1.1, 2.1, 2.0.2 (5C1), 2.0.1 (5B108), 2.0 (5A347)

    But I have Xcode 3.2.1, which is supposed to support iPhone OS 3.1.3. What am I doing wrong? I have a cruddy internet connection (HughesNet), so I can't upgrade the Xcode SDK without it taking literally days.

  • Passing a 'var' into another method

    - by Danny
    Hi, I am probably totally missing the point here, but... how can I pass a 'var' into another method? (I am using LINQ to load XML into an enumerable list of objects.) I have different object types (with different fields), but the final step of my process is the same regardless of which object is being used.

        XNamespace xmlns = ScehmaName;
        var result = from e in XElement.Load(XMLDocumentLocation).Elements(xmlns + ElementName)
                     select new Object1
                     {
                         Field1 = (string)e.Element(xmlns + "field1"),
                         Field2 = (string)e.Element(xmlns + "field2")
                     };
        var result2 = Enumerable.Distinct(result);

    This code will vary for the different types of XML files that will be processed. But I then want to iterate through the results to check for various issues:

        foreach (var w in result2)
        {
            if (w.CheckIT())
            {
                // do something
            }
        }

    What I would love is for the final step to be a method in the base class, so I can pass the 'var' variable into it from each child class.
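
    One way to express that "final step" without passing var around: var is only compiler-inferred typing, so the base class can accept the sequence through a generic method constrained to a shared interface. A minimal sketch, assuming CheckIT() can be pulled into an interface that Object1 and the other element types implement (ICheckable, ImporterBase and the Field1 rule are illustrative names, not from the question):

        // Sketch: the element types share an interface, so the common final step
        // can live in the base class as a generic method.
        using System;
        using System.Collections.Generic;

        public interface ICheckable
        {
            bool CheckIT();
        }

        public class Object1 : ICheckable
        {
            public string Field1 { get; set; }
            public string Field2 { get; set; }
            public bool CheckIT() { return !string.IsNullOrEmpty(Field1); } // example rule only
        }

        public abstract class ImporterBase
        {
            protected void ProcessResults<T>(IEnumerable<T> results) where T : ICheckable
            {
                foreach (T item in results)
                {
                    if (item.CheckIT())
                    {
                        // do something
                    }
                }
            }
        }

    A child class would then just call ProcessResults(result2); type inference supplies T, so the 'var' never needs to be named explicitly.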

  • Django Asynchronous Processing

    - by freyrs
    Hello all, I have a bunch of Django requests which execute some mathematical computations (written in C and executed via a Cython module) which may take an indeterminate amount of time (on the order of 1 second) to execute. Also, the requests don't need to access the database. Right now everything is synchronous (using Gunicorn with sync worker types), but I'd like to make this asynchronous and nonblocking. I am very new to asynchronous Django, so my question is: what is the best stack for doing this? Is this sort of process something a task queue is well suited for? Would anyone recommend Tornado + Celery + RabbitMQ, or perhaps something else? Thanks in advance!

  • Can someone describe some DI terms to me?

    - by SoBeNoFear
    I'm in the process of writing a DI framework for PHP 5, and I've been trying to find the 'official' definitions of some words in relation to dependency injection. Some of these words are 'context' and 'lifecycle'. And also, what would I call the object that gets created/injected? Finally, what is the difference between components and services, and which term (if either) should I call the objects that can be injected? I've read Martin Fowler's article and looked through other DI frameworks (Phemto, Spring, Google Guice, Xyster, etc.), but I want to know what you think. Thanks!

  • Make Sphinx generate rst for class documentation from pydoc

    - by Michal Cihar
    I'm currently in the process of migrating existing (incomplete) documentation to Sphinx. The final goal is to have all documentation in Sphinx. The problem I'm facing right now is that I have some documentation using Python docstrings (well, the module is actually written in C, but it probably does not matter), and I would like to generate class documentation from these docstrings in a form usable by Sphinx. I know there is sphinx.ext.autodoc, but that automatically puts the current docstrings into the document. I would rather generate a source (rst) file based on the current docstrings, which I could then edit and improve manually. So is there some way to turn existing docstrings into the rst form which Sphinx consumes?

  • How can I debug this web service http handler?

    - by baron
    Hello everyone, I am building an HttpHandler following the instructions here. It handles HTTP POST and HTTP GET, and I have a client with two buttons, one to POST and one to GET. After I tested it and was happy everything was working, I moved it from localhost to IIS. Now when I do this I get an exception in the POST handler code. How on earth can I debug this code line by line? I managed to do this a while ago - I thought it was by attaching to a process, but I can't work it out. I can emulate a GET just by typing the address into the browser; a POST I'm not sure about. I've tried telnetting and sending it from there, but haven't had any luck.
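
    Since the browser's address bar only issues GETs, a small console client is one way to emulate the POST repeatably while a debugger is attached to the IIS worker process (w3wp.exe on IIS 6/7, aspnet_wp.exe on older setups) via Debug > Attach to Process. A hedged sketch - the URL and form field names are made up:

        // Console sketch: fire a POST at the handler so a debugger attached to
        // the IIS worker process can break inside the POST code path.
        using System;
        using System.Collections.Specialized;
        using System.Net;
        using System.Text;

        class PostTester
        {
            static void Main()
            {
                using (WebClient client = new WebClient())
                {
                    NameValueCollection form = new NameValueCollection();
                    form["name"] = "test";      // hypothetical field
                    form["value"] = "123";      // hypothetical field

                    byte[] response = client.UploadValues(
                        "http://myserver/myhandler.ashx", "POST", form);  // hypothetical URL

                    Console.WriteLine(Encoding.UTF8.GetString(response));
                }
            }
        }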

  • Creating a standalone console (shell) for domain-specific operations

    - by mr.b
    Say that I have a system service, and I want to offer low-level maintenance access to it. For that purpose, I'd like to create a standalone console application that somehow connects to the server process and lets the user type in commands, allows auto-completion and auto-suggestion on single/double TAB press (just like the Linux bash shell, the mysql CLI, cmd.exe, and countless others), allows command-line editing capabilities (history, cursor keys to move around the text...), etc. Now, it's not that much of a problem to create something like that by rolling my own from scratch - handling user input, scanning pressed keys, and doing the correct actions. But why reinvent the wheel? Is there some library/framework that helps with this kind of problem, just like the readline library that offers improved command-line editing capabilities under Linux? Of course, this new "shell" would respond only to valid, domain-specific commands, and would suggest valid arguments, options, switches... Any ideas? Thanks!
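
    There is no readline equivalent built into the .NET base class library, so if the client were written in C# (an assumption - the question doesn't name a language) the TAB handling has to be done by reading keys directly. A minimal sketch of a prompt loop with prefix completion over a fixed command list; the commands and prompt are placeholders, and history and cursor editing are left out:

        // Minimal sketch: prompt loop with Tab completion over a fixed command
        // list. A real tool would add history, cursor movement, and would send
        // the command to the service instead of echoing it.
        using System;
        using System.Linq;

        class MiniShell
        {
            static readonly string[] Commands = { "status", "start", "stop", "stats", "quit" }; // hypothetical

            static void Main()
            {
                while (true)
                {
                    Console.Write("svc> ");
                    string line = ReadLineWithCompletion();
                    if (line == "quit") break;
                    Console.WriteLine("would send to service: " + line);   // placeholder
                }
            }

            static string ReadLineWithCompletion()
            {
                string buffer = "";
                while (true)
                {
                    ConsoleKeyInfo key = Console.ReadKey(true);
                    if (key.Key == ConsoleKey.Enter)
                    {
                        Console.WriteLine();
                        return buffer;
                    }
                    else if (key.Key == ConsoleKey.Backspace && buffer.Length > 0)
                    {
                        buffer = buffer.Substring(0, buffer.Length - 1);
                        Console.Write("\b \b");
                    }
                    else if (key.Key == ConsoleKey.Tab)
                    {
                        string[] matches = Commands.Where(c => c.StartsWith(buffer)).ToArray();
                        if (matches.Length == 1)
                        {
                            // Unique match: complete it in place.
                            Console.Write(matches[0].Substring(buffer.Length));
                            buffer = matches[0];
                        }
                        else if (matches.Length > 1)
                        {
                            // Ambiguous: list candidates and re-print the prompt.
                            Console.WriteLine();
                            Console.WriteLine(string.Join(" ", matches));
                            Console.Write("svc> " + buffer);
                        }
                    }
                    else if (!char.IsControl(key.KeyChar))
                    {
                        buffer += key.KeyChar;
                        Console.Write(key.KeyChar);
                    }
                }
            }
        }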

  • Dang Error #1009!

    - by boz
    I'm building a simple Flash site for a friend who has a spa. I keep getting this error:

        Error #1009: Cannot access a property or method of a null object reference.
        at spa7_fla::MainTimeline/frame1()

    Through the process of commenting out code, I've narrowed it down to my link section:

        vox_link.addEventListener(MouseEvent.CLICK, gotoVox);
        function gotoVox(evtObj:Event):void
        {
            var voxSite:URLRequest = new URLRequest("http://www.voxmundiproject.com");
            navigateToURL(voxSite, "_blank");
        }

    With this section commented out, I don't get the 1009 error. When the code is active, I get the error. My code syntax is correct, so I'm stumped. Does someone have an idea what may be wrong? Thanks!

  • Beginner's resources/introductions to classification algorithms.

    - by Dirk
    Hi, everybody. I am entirely new to the topic of classification algorithms and need a few good pointers about where to start some "serious reading". I am right now in the process of finding out whether machine learning and automated classification algorithms could be a worthwhile thing to add to some application of mine. I have already scanned through "How to Solve It: Modern Heuristics" by Z. Michalewicz and D. Fogel (in particular, the chapters about linear classifiers using neural networks), and on the practical side I am currently looking through the WEKA toolkit source code. My next (planned) step would be to dive into the realm of Bayesian classification algorithms. Unfortunately, I am lacking a serious theoretical foundation in this area (let alone having used it in any way as of yet), so any hints at where to look next would be appreciated; in particular, a good introduction to available classification algorithms would be helpful. Being more a craftsman and less a theoretician, the more practical, the better... Hints, anyone?
