Search Results

Search found 7077 results on 284 pages for 'concurrent processing'.


  • Estimate serialization size of objects?

    - by Stefan K.
    In my thesis, I would like to enhance messaging in a cluster. It's important to log runtime information about how big a message is (should I prefer processing it locally or remotely?). I could only find frameworks that estimate an object's memory size based on Java instrumentation. I've tested classmexer, which didn't come close to the serialization size, and SourceForge's SizeOf. In a small test case, SizeOf was around 10% off and 10x faster than serialization. (Transient fields still break the estimation completely, and since e.g. ArrayList marks its backing storage transient but serializes it as an array, it's not easy to patch SizeOf. But I could live with that.) On the other hand, 10x faster with a 10% error doesn't seem very good. Any ideas how I could do better?

    Read the article

  • How should I handle incomplete packet buffers?

    - by Benjamin Manns
    I am writing a client for a server that typically sends data as strings of 500 or fewer bytes. Occasionally, however, the data will exceed that; a single set of data could contain 200,000 bytes for all the client knows (on initialization or significant events). I would rather not have each client running with a 50 MB socket buffer (if that's even possible). Each set of data is delimited by a null (\0) character. What kind of structure should I look at for storing partially received data sets? For example, the server may send ABCDEFGHIJKLMNOPQRSTUV\0WXYZ\0123!\0; I would want to process ABCDEFGHIJKLMNOPQRSTUV, WXYZ, and 123! independently. The server could also send ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890LOL123HAHATHISISREALLYLONG without the terminating character; I would want that data set stored somewhere for later appending and processing. Also, I'm using the asynchronous socket methods (BeginSend, EndSend, BeginReceive, EndReceive), if that matters.
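
    One workable structure is a per-connection accumulator that splits complete messages off at the delimiter and keeps the unterminated tail for later appending. A minimal sketch of the pattern in Python (class and method names are illustrative; the same shape ports to a per-socket MemoryStream or StringBuilder in .NET):

        class DelimitedBuffer:
            """Accumulates raw bytes and yields complete NUL-delimited
            messages; the unterminated tail is kept until more data arrives."""

            def __init__(self):
                self._tail = b""

            def feed(self, data):
                # Split off every complete message; the last piece is either
                # empty (data ended on a delimiter) or a partial message.
                *complete, self._tail = (self._tail + data).split(b"\0")
                return complete

        buf = DelimitedBuffer()
        buf.feed(b"ABCDEF\0WX")     # -> [b"ABCDEF"]
        buf.feed(b"YZ\0123!\0")     # -> [b"WXYZ", b"123!"]

    Since messages can reach hundreds of kilobytes, collecting chunks in a list (or a MemoryStream) and joining on flush avoids the quadratic cost of repeated concatenation; the sketch keeps it simple.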

    Read the article

  • Any tips of how to handle hierarchical trees in relational model?

    - by George
    Hello all. I have a tree structure that can be n levels deep, without restriction; that means each node can have another n nodes. What is the best way to retrieve a tree like that without issuing thousands of queries to the database? I have looked at a few other models, like the flat table model, the preorder tree traversal algorithm, and so on. Do you have any tips or suggestions for implementing an efficient tree model? My objective in the end is to have one or two queries that return the whole tree. With enough processing I can then display the tree in .NET, but that happens on the client machine, so it is not much of a big deal. Thanks for your attention.
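
    For the plain adjacency-list model (each row stores its parent's id), one query really is enough, with the tree assembled in memory in a single pass. A rough sketch in Python, assuming a hypothetical nodes table with id and parent_id columns (parent_id NULL for roots):

        def build_tree(rows):
            # rows: list of (id, parent_id) pairs from
            # "SELECT id, parent_id FROM nodes"
            nodes = {nid: {"id": nid, "children": []} for nid, _ in rows}
            roots = []
            for nid, parent in rows:
                if parent is None:
                    roots.append(nodes[nid])
                else:
                    nodes[parent]["children"].append(nodes[nid])
            return roots

        build_tree([(1, None), (2, 1), (3, 1), (4, 2)])

    Nested sets (the preorder tree traversal model mentioned above) make reading a single subtree cheap at the price of costlier inserts; for "give me the whole tree" the adjacency list plus in-memory assembly is usually the simplest efficient choice.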

    Read the article

  • Do we affect multiple users in ASP.NET when we set the Thread CurrentCulture/CurrentUICulture?

    - by Nikolay
    When we set the CurrentCulture and/or CurrentUICulture, we do it on the current thread, like this:

        Thread.CurrentThread.CurrentCulture = new CultureInfo("en-GB");
        Thread.CurrentThread.CurrentUICulture = new CultureInfo("en-GB");

    Does this mean we could affect the culture settings of multiple users of our web application, since their requests may reuse threads from the pool? I am working on an ASP.NET MVC application where each user may have their own culture setting specified in their account data. When the user logs in, the culture setting is retrieved from the database and has to be set as the current culture. My worry is that setting the current culture on the current thread may affect another user's request reusing this thread. I am even more concerned having read this: ASP.NET not only uses a thread pool, but may switch threads during request processing.

    Read the article

  • MySQL: First and last record of a grouped record (aggregate functions)

    - by Jimmy
    I am trying to fetch the first and the last record of a 'grouped' record. More precisely, I am doing a query like this:

        SELECT MIN(low_price), MAX(high_price), open, close
        FROM symbols
        WHERE date BETWEEN (.. ..)
        GROUP BY YEARWEEK(date)

    but I'd like to also get the first and the last record of each group. It could be done with tons of queries, but I have a quite large table. Is there a way (with low processing time, if possible) to do this with MySQL?

    Read the article

  • render HTML (convert to bitmap)

    - by MK
    Can somebody recommend the best (and preferably portable) way to render HTML documents onto a bitmap? As far as I understand, my main two options are WebKit and Gecko, but I wasn't able to find a good starting point. When I last tried this five years ago, I ended up using Gecko to send the document to a printer, which is not really what I need: I need rendering to an in-memory bitmap. To clarify: server side, no Java, no .NET, batch processing, performance matters, not interactive, no JavaScript.
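
    One commonly taken server-side route, rather than embedding Gecko or WebKit in-process, is to shell out to a WebKit-based command-line renderer such as wkhtmltoimage (from the wkhtmltopdf project). A hedged sketch in Python; whether it meets the batch-performance and no-JavaScript constraints above would need testing:

        import subprocess

        def html_to_png(html_path, png_path):
            # wkhtmltoimage renders the document with WebKit and writes
            # a raster image; both arguments are ordinary file paths.
            subprocess.check_call(["wkhtmltoimage", html_path, png_path])

    The cost of spawning a process per document is real in batch workloads, but it keeps the renderer's crashes and memory use out of the main server process.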

    Read the article

  • Sending html data via $post fails

    - by Neil
    I am using the code below, which is fine, but when I use the second snippet in an attempt to send an HTML fragment to a processing page to save it as a file, I get nothing. I have tried using ajax with processData set to false and dataTypes of html, text and xml, but nothing works. I can't find anything on this, so I guess I must be missing something fairly trivial, but I've been at it for three hours now. This works:

        $.post("SaveFile.aspx", {f: "test4.htm", c: "This is a test"},
            function(data){ alert(data); }, "text");

    This fails:

        $.post("SaveFile.aspx", {f: "test4.htm", c: "<h1>This is a test</h1>"},
            function(data){ alert(data); }, "text");

    Read the article

  • Retail knowledge inference

    - by blueomega
    I am doing research on how to infer knowledge from reports (not in a specific format); after pre-processing I should have some kind of formatted data. A fairly basic inference would be: "Retailer has X stock" and "X is sellable" -> "Retailer sells X". The knowledge I focus on is retail-domain oriented, and if possible the system should improve its efficiency with each iteration. Is this sci-fi (some of my friends think it is)? The related material I find online covers "expert systems" that find anomalies, fuzzy inference systems, and some rants about "easy knowledge". Can you give me some points to focus on, or orient me toward some research directions? blueomega
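
    This is not sci-fi: the stock example above is ordinary rule-based, forward-chaining inference, which is exactly what the "expert systems" literature covers. A toy sketch in Python, with facts represented as plain strings purely for illustration:

        def forward_chain(facts, rules):
            """Apply (premises, conclusion) rules until no new facts appear."""
            facts = set(facts)
            changed = True
            while changed:
                changed = False
                for premises, conclusion in rules:
                    if conclusion not in facts and premises <= facts:
                        facts.add(conclusion)
                        changed = True
            return facts

        rules = [({"retailer has X stock", "X is sellable"}, "retailer sells X")]
        forward_chain({"retailer has X stock", "X is sellable"}, rules)
        # -> {..., "retailer sells X"}

    Useful search terms: production rule systems (and the Rete algorithm) for the inference step, and information extraction for turning the unformatted reports into facts in the first place.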

    Read the article

  • Using multithreading for a loop

    - by annelie
    Hello, I'm new to threading and want to do something similar to this question: http://stackoverflow.com/questions/100291/speed-up-loop-using-multithreading-in-c-question. However, I'm not sure that solution is the best one for me, as I want the threads to keep running and never finish. (I'm also using .NET 3.5 rather than 2.0, as in that question.) I want to do something like this:

        foreach (Agent agent in AgentList)
        {
            // I want to start a new thread for each of these
            agent.DoProcessLoop();
        }

        public void DoProcessLoop()
        {
            while (true)
            {
                // do the processing, e.g. check folder for new files,
                // update database if new files found
            }
        }

    Would a ThreadPool be the best solution, or is there something that suits this better? Thanks, Annelie
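
    Because each loop never finishes, a dedicated long-lived thread per agent is usually a better fit than a thread pool, whose threads are meant for short work items. The shape of it, sketched in Python for brevity (in .NET the analogue is constructing a Thread per agent rather than queueing to the ThreadPool):

        import threading

        def start_agents(agents):
            # One long-lived worker per agent; daemon threads exit with
            # the process instead of blocking shutdown.
            for agent in agents:
                threading.Thread(target=agent.do_process_loop,
                                 daemon=True).start()

    Here do_process_loop is the agent's own while-True worker (a hypothetical name mirroring DoProcessLoop above).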

    Read the article

  • ASP.NET MVC Session usage

    - by Ben
    Currently I am using ViewData or TempData for object persistence in my ASP.NET MVC application. However, in a few cases where I store objects into ViewData through my base controller class, I am hitting the database on every request (whenever ViewData["whatever"] == null). It would be good to persist these into something with a longer lifespan, namely session. Similarly, in an order-processing pipeline, I don't want things like Order to be saved to the database on creation. I would rather populate the object in memory and then, when the order reaches a certain state, save it. So it would seem that session is the best place for this? Or would you recommend, in the case of the order, retrieving it from the database on each request rather than using session? Thoughts and suggestions appreciated. Thanks, Ben

    Read the article

  • Thread-safe queue in Javascript or jQuery

    - by at
    I have many asynchronous AJAX calls whose results will get processed. It doesn't matter what order the processing occurs in, but the results need to be processed one at a time. So I'd like to simply make my AJAX calls and have them all put their results in a single queue. That queue should then get processed on a single thread. This way the results get processed one at a time, as soon as possible. What's the best way to do this? I'm using jQuery, so I'm happy to take advantage of any facilities it provides for this.

    Read the article

  • How do I watch a file for changes using Python?

    - by Jon Cage
    I have a log file being written by another process, which I want to watch for changes. Each time a change occurs, I'd like to read the new data in and do some processing on it. What's the best way to do this? I was hoping there'd be some sort of hook in the PyWin32 library. I've found the win32file.FindNextChangeNotification function but have no idea how to ask it to watch a specific file. If anyone's done anything like this I'd be really grateful to hear how... [Edit] I should have mentioned that I'm after a solution that doesn't require polling. [Edit] Curses! It seems this doesn't work over a mapped network drive. I'm guessing Windows doesn't 'hear' updates to the file the way it does on a local disk.
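
    win32file's change notifications are directory-level, so the usual pywin32 recipe watches the log file's directory and then checks whether the file itself grew. A minimal sketch of that pattern (error handling omitted; handle_new_data is a hypothetical callback):

        import os
        import win32con
        import win32event
        import win32file

        def watch(path):
            dir_name = os.path.dirname(os.path.abspath(path))
            change = win32file.FindFirstChangeNotification(
                dir_name, 0, win32con.FILE_NOTIFY_CHANGE_LAST_WRITE)
            offset = os.path.getsize(path)
            try:
                while True:
                    # Blocks until something in the directory changes.
                    rc = win32event.WaitForSingleObject(change, win32event.INFINITE)
                    if rc != win32con.WAIT_OBJECT_0:
                        break
                    size = os.path.getsize(path)
                    if size > offset:
                        with open(path, "rb") as f:
                            f.seek(offset)
                            handle_new_data(f.read())
                        offset = size
                    win32file.FindNextChangeNotification(change)
            finally:
                win32file.FindCloseChangeNotification(change)

    As the second edit notes, these notifications are unreliable over mapped network drives; there, polling the file size on a timer is the pragmatic fallback.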

    Read the article

  • How to get the contents of the wav file into array so as to cut the required segment and convert it

    - by kaushik
    How do I get the contents of a wav file into an array, so I can cut out the required segment and convert it back to wav format, using Python? My problem is similar to ROMAN's, which I have seen in an earlier post on this site. Basically, I want to combine parts of different wav files into one wav file. Is there any approach other than reading the contents into an array, cutting out a part, combining, and converting back? Please suggest... Edit: I'd prefer unpacking the contents of the wav file into an array and editing it by cutting the required segment of sound out of the file, since I am working on speech processing and I guess this way it would be easy to enhance the quality of the sound later... Can anyone suggest a way to do this? Please help... Thanks in advance.
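
    The standard-library wave module covers this without manual byte twiddling: read frames, slice by frame index, and write them back out. A small sketch, assuming all inputs share the same sample rate, channel count, and sample width:

        import wave

        def wav_slice(src, dst, start_s, end_s):
            # Copy the [start_s, end_s) segment of src into dst.
            r = wave.open(src, "rb")
            rate = r.getframerate()
            r.setpos(int(start_s * rate))              # seek by frame index
            frames = r.readframes(int((end_s - start_s) * rate))
            w = wave.open(dst, "wb")
            w.setparams(r.getparams())                 # same format as source
            w.writeframes(frames)
            w.close()
            r.close()

        def wav_concat(sources, dst):
            # Append the full contents of each source file to dst.
            w = None
            for src in sources:
                r = wave.open(src, "rb")
                if w is None:
                    w = wave.open(dst, "wb")
                    w.setparams(r.getparams())
                w.writeframes(r.readframes(r.getnframes()))
                r.close()
            w.close()

    For the later per-sample editing (gain, filtering, quality enhancement), numpy.frombuffer over the frame bytes gives the array view the question asks about.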

    Read the article

  • MySQL Insert Statement Queue

    - by Justin
    We are building an ajax application in which a user's input is submitted for processing to a PHP script. We currently write every request to a log file for tracking. I would like to move this tracking into a database table, but I do not want to run an insert statement after every request. What I would like to do is set up a 'queue' of transactions (inserts and updates) that need to be processed on the MySQL database. I would then set up a cron job or process to check and process the transactions in the queue. Is there something out there that we could build upon, or do we have to just write to plain ol' text log files and process them?
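
    The usual shape is exactly what is described: the request handler appends a row to a lightweight queue table, and a cron-driven worker drains it in batches inside one transaction. A sketch of the worker in Python, using sqlite3 as a stand-in for MySQL and a hypothetical pending_tracking table:

        import sqlite3

        def drain_queue(db_path):
            con = sqlite3.connect(db_path)
            rows = con.execute(
                "SELECT id, payload FROM pending_tracking ORDER BY id").fetchall()
            for row_id, payload in rows:
                apply_transaction(payload)   # the real insert/update (hypothetical)
                con.execute("DELETE FROM pending_tracking WHERE id = ?", (row_id,))
            con.commit()
            con.close()

    If the per-request writes are the only worry, batching several tracking rows into one multi-row INSERT on the PHP side may already be enough, with no queue at all.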

    Read the article

  • Kill a Perl system call after a timeout

    - by Fergal
    I've got a Perl script I'm using to run a file-processing tool, which is started using backticks. The problem is that occasionally the tool hangs, and it needs to be killed in order for the rest of the files to be processed. What's the best way to apply a timeout after which the parent script will kill the hung process? At the moment I'm using:

        foreach $file (@FILES) {
            $runResult = `mytool $file >> $file.log`;
        }

    But when mytool hangs after n seconds, I'd like to be able to kill it and continue to the next file.

    Read the article

  • Advice on a DB that can be uploaded to a website by a smart client for collecting survey feedback

    - by absfabs
    Hello, I'm hoping you can help. I'm looking for a zero-config multi-user database that my WinForms application can easily upload to a webserver folder (together with one or two classic ASP pages), and am looking for suggestions/recommendations. The idea is that the database will be used to collect feedback entered by people filling in the ASP pages. The pages will write to the database using JavaScript. The database will subsequently be downloaded again for processing once the responses are in. In summary:

    - It will mostly run in MS Windows environments.
    - I have a modest budget for this and do not mind paying for such a database.
    - No runtime licensing costs.
    - It should be xcopy deployable: once uploaded to a website folder it should be operational.
    - It should not have a .NET CLR dependency.
    - It should support a reasonable level of concurrent access. Average respondent count would be around 20-30, but one never knows.
    - It should be a reasonable size, so that uploads/downloads to and from the site will be reasonably fast.

    Would appreciate your suggestions/comments. Many thanks, Abz. To clarify: this is a desktop commercial application for feedback management in a vertical market. It uses SQL Server as the backing store. The application currently provides feedback management for email and paper feedback; I now want to add web feedback capability. Getting users to make their SQL Servers accessible to a website is not an option at this time, as I want to make getting up and running as painless as possible. I intend to release a web-based implementation of the software in the near future, but for now I am looking at the above as a pragmatic way to provide web-based feedback collection.

    Read the article

  • does the order a composite key is defined in matter?

    I have a table with (col1, col2) as a composite primary key:

        create table twokeytable (
            col1 int,
            col2 int,
            constraint twokeytable_pk primary key (col1, col2)
        );

    and another table with columns col3 and col4 and a composite foreign key (col3, col4) which references the (col1, col2) primary key. For some processing I need to drop the foreign key and primary key constraints. While restoring the constraints, does the order of the keys matter? Are these the same?

        create table fktwokeytable (
            col3 int,
            col4 int,
            constraint fkaddfaa_fk foreign key (col4, col3)
                references twokeytable (col1, col2)
        );

    and

        create table fktwokeytable (
            col3 int,
            col4 int,
            constraint fkaddfaa_fk foreign key (col3, col4)
                references twokeytable (col1, col2)
        );

    Read the article

  • Can you decode a mutable Bitmap from an InputStream?

    - by Daniel Lew
    Right now I've got an Android application that:

    - Downloads an image.
    - Does some pre-processing to that image.
    - Displays the image.

    The dilemma is that I would like this process to use less memory, so that I can afford to download a higher-resolution image. However, when I download the image now, I use BitmapFactory.decodeStream(), which has the unfortunate side effect of returning an immutable Bitmap. As a result, I'm having to create a copy of the Bitmap before I can start operating on it, which means I have to have 2x the size of the Bitmap's memory allocated (at least for a brief period of time; once the copy is complete I can recycle the original). Is there a way to decode an InputStream into a mutable Bitmap?

    Read the article

  • How to implement buffering with timeout in RX

    - by Gaspar Nagy
    I need to implement event processing that runs only after no new events have arrived for a certain period. (I have to queue up a parsing task when the text buffer changes, but I don't want to start parsing while the user is still typing.) I'm new to Rx, but as far as I can see, I need a combination of the BufferWithTime and Timeout methods. I imagine it working like this: it buffers the events as long as they keep arriving within a specified timespan of each other; if there is a gap in the event flow longer than the timespan, it should propagate the events buffered so far. Having looked at how Buffer and Timeout are implemented, I could probably implement my own BufferWithTimeout method (if anyone has one, please share it with me), but I wonder if this can be achieved just by combining the existing methods. Any ideas?
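
    The behaviour described (flush the buffer once the stream has been quiet for a while) is mechanically a debounce that collects items instead of dropping them. A sketch of those mechanics in Python, written out only to make the timer juggling explicit; in Rx terms, each new event cancels and restarts the flush timer:

        import threading

        class QuietPeriodBuffer:
            """Collects items; fires callback(items) only after no new
            item has arrived for `timeout` seconds."""

            def __init__(self, timeout, callback):
                self._timeout = timeout
                self._callback = callback
                self._items = []
                self._timer = None
                self._lock = threading.Lock()

            def add(self, item):
                with self._lock:
                    self._items.append(item)
                    if self._timer is not None:
                        self._timer.cancel()      # a new event restarts the clock
                    self._timer = threading.Timer(self._timeout, self._flush)
                    self._timer.start()

            def _flush(self):
                with self._lock:
                    items, self._items = self._items, []
                if items:
                    self._callback(items)

    A steady stream of keystrokes keeps postponing the flush, and the callback fires once with everything collected since the last gap, which matches the "don't parse while the user is typing" requirement.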

    Read the article

  • paperclip error

    - by ZX12R
    I am trying Paperclip for the first time, and followed this tutorial. All is well until I use styles. This is the code:

        has_attached_file :photo,
            :url => "/uploads/products/:id/:style/:basename.:extension",
            :path => ":rails_root/public/uploads/products/:id/:style/:basename.:extension",
            :styles => { :thumb => "100x100#" }

    The error I see on the console is:

        [paperclip] An error was received while processing: #<Paperclip::NotIdentifiedByImageMagickError:
        C:/DOCUME~1/LOCALS~1/Temp/stream,2956,1.jpg is not recognized by the 'identify' command.>

    What does this mean? Should I install ImageMagick? I tried installing it as a plugin per this page, but that also returns a "plugin not found" error. What am I missing here?

    Read the article

  • Purpose of Trigraph sequences in C++?

    - by Kirill V. Lyadvinsky
    According to the C++'03 Standard 2.3/1:

        Before any other processing takes place, each occurrence of one of the
        following sequences of three characters ("trigraph sequences") is
        replaced by the single character indicated in Table 1.

        ----------------------------------------------------------------------------
        | trigraph | replacement | trigraph | replacement | trigraph | replacement |
        ----------------------------------------------------------------------------
        |   ??=    |      #      |   ??(    |      [      |   ??<    |      {      |
        |   ??/    |      \      |   ??)    |      ]      |   ??>    |      }      |
        |   ??'    |      ^      |   ??!    |      |      |   ??-    |      ~      |
        ----------------------------------------------------------------------------

    In real life this means that the code

        printf( "What??!\n" );

    will print What|, because ??! is a trigraph sequence that is replaced with the | character. My question is: what is the purpose of trigraphs? Is there any practical advantage to using them? UPD: Answers mention that some European keyboards don't have all the punctuation characters, so non-US programmers have to use trigraphs in everyday life? UPD2: Visual Studio 2010 has trigraph support turned off by default.

    Read the article

  • Monolog conversations in SQL Service Broker 2008

    - by hemil
    Hi, I have a scenario in which I need to process (in SQL Server) messages being delivered as .xml files into a folder in real time. I started investigating SQL Service Broker for my queuing needs. Basically, I want Service Broker to pick up my .xml files and place them in a queue as they arrive in the folder. But SQL Service Broker does not support "monolog" conversations, at least not in the current version; it supports only a dialog between an initiator and a target service. I could use MSMQ, but then I would have two things to maintain: the .NET code for file processing in MSMQ and the SQL Server T-SQL stored procs. What options do I have left? Thanks.

    Read the article

  • Newbie Question: Read and Process a List of Text Files

    - by johnv
    I'm completely new to .NET and am trying, as a first step, to write a text-processing program. The task is simple: I have a list of 10,000 text files stored in one folder, and I'm trying to read each one, store it as a string variable, run it through a series of functions, then save the final output to another folder. So far I can only manage to manually input the file path, like this (in VB.NET):

        Dim tRead As System.IO.StreamReader

        Public Function ReadFile() As String
            Dim EntireFile As String
            tRead = File.OpenText("c:\textexample\00001.txt")
            EntireFile = tRead.ReadToEnd
            Return EntireFile
        End Function

        Public Function Step1()
            .....
        End Function

        Public Function Step2()
            .....
        End Function

    I'm wondering, therefore, if there's a way to automate this process. Perhaps, for example, store all the input file paths in a text file, read one entry at a time, then save the final output to the output path, again listed in a text file. Any help is greatly appreciated.
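
    There is no need to maintain a list of paths by hand: the IO library can enumerate the folder directly (in .NET, Directory.GetFiles paired with File.ReadAllText and File.WriteAllText). The overall loop, sketched in Python to keep it short, with step1/step2 standing in for the real processing functions:

        import os

        def process_folder(src_dir, dst_dir):
            for name in os.listdir(src_dir):
                if not name.endswith(".txt"):
                    continue
                with open(os.path.join(src_dir, name)) as f:
                    text = f.read()
                result = step2(step1(text))   # hypothetical pipeline stages
                with open(os.path.join(dst_dir, name), "w") as f:
                    f.write(result)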

    Read the article

  • (My)SQL performance: updating one field vs many unnecessary fields

    - by changokun
    I'm processing a form that has a lot of fields, for a user who is editing an existing record. The user may have changed only one field, but I would typically run an update query that sets the values of all the fields, even though most of them haven't changed. I could do some sort of tracking to see which fields have actually changed, and only update those few. Is there a performance difference between updating all fields in a record vs. only the one that changed? Are there other reasons to go with either method? The shotgun method is pretty easy...
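
    Worth knowing first: MySQL compares each assigned value with the stored one and skips columns whose value is unchanged, so the shotgun UPDATE is cheaper than it looks. If you still want to send only the changed fields, diffing the old and new form values takes a few lines; a sketch in Python with hypothetical table and column handling (column names must come from a trusted whitelist, since they are interpolated into the SQL):

        def build_update(table, old, new, key_value):
            """Return (sql, params) updating only changed fields, or None."""
            changed = {k: v for k, v in new.items() if old.get(k) != v}
            if not changed:
                return None
            assignments = ", ".join("%s = %%s" % col for col in changed)
            sql = "UPDATE %s SET %s WHERE id = %%s" % (table, assignments)
            return sql, list(changed.values()) + [key_value]

        # build_update("users", {"name": "a", "age": 30},
        #              {"name": "b", "age": 30}, 7)
        # -> ("UPDATE users SET name = %s WHERE id = %s", ["b", 7])

    The main non-performance reason to diff is auditability: you can log exactly which fields each edit touched.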

    Read the article

  • How do I set default values on new properties for existing entities after light weight core data migration?

    - by Moritz
    I've successfully completed lightweight migration on my Core Data model. My custom entity Vehicle received a new property 'tirePressure', which is an optional property of type double with the default value 0.00. When 'old' Vehicles are fetched from the store (Vehicles that were created before the migration took place), the value of their 'tirePressure' property is nil. (Is that expected behavior?) So I thought: "No problem, I'll just do this in the Vehicle class:"

        - (void)awakeFromFetch
        {
            [super awakeFromFetch];
            if (nil == self.tirePressure) {
                [self willChangeValueForKey:@"tirePressure"];
                self.tirePressure = [NSNumber numberWithDouble:0.0];
                [self didChangeValueForKey:@"tirePressure"];
            }
        }

    Since "change processing is explicitly disabled around" awakeFromFetch, I thought the calls to willChangeValueForKey and didChangeValueForKey would mark 'tirePressure' as dirty. But they don't. Every time these Vehicles are fetched from the store, 'tirePressure' continues to be nil, despite having saved the context.

    Read the article
