Search Results

Search found 22569 results on 903 pages for 'win32 process'.

  • Visual C++ function suddenly 170x slower

    - by Mikael
    For the past few months I've been working on a Visual C++ project that takes images from cameras and processes them. Until today each update took about 65 ms, but now that has suddenly increased significantly. What happens is: I launch my program, and for the first 30 or so iterations it performs as expected; then the loop time jumps from 65 ms to 250 ms. The odd thing is, after timing each function I found that the part of the code causing the slowdown is fairly basic and has not been modified in over a month. The data going into it is unchanged and identical every iteration, but its execution time, initially less than 1 ms, suddenly increases to 170 ms while the rest of the code still performs as expected (time-wise). Basically, I am calling the same function over and over; for the first 30 calls it performs as it should, and after that it slows down for no apparent reason. It might also be worth noting that it is a sudden change in execution time, not a gradual increase. What could be causing this? The code is leaking some memory (~50 kB/s), but not nearly enough to warrant a sudden 4x slowdown. If anyone has any ideas I'd love to hear them!

  • POST to a webpage from C# app

    - by markiyanm
    I've been looking/asking around and can't seem to figure this one out. I have a C# application and need to be able to gather some data in the app, pop open a web browser and POST some data to it. I can POST to the site from within the app fine and I can obviously pop open IE to a certain link but I can't do both. I can't POST to that link directly. Any ideas on how to accomplish this?

        private void btnSubmit_Click(object sender, EventArgs e)
        {
            ASCIIEncoding encoding = new ASCIIEncoding();
            string postData = "Fullname=Test";
            byte[] data = encoding.GetBytes(postData);

            // Prepare web request...
            HttpWebRequest myRequest = (HttpWebRequest)WebRequest.Create("http://www.url.com/Default.aspx");
            myRequest.Method = "POST";
            myRequest.ContentType = "application/x-www-form-urlencoded";
            myRequest.ContentLength = data.Length;

            Stream newStream = myRequest.GetRequestStream();
            // Send the data.
            newStream.Write(data, 0, data.Length);
            System.Diagnostics.Process.Start(myRequest.Address.ToString()); // open browser
            newStream.Close();
        }

    Any insight would be greatly appreciated. Thanks

  • Calling member functions dynamically

    - by user652511
    I'm pretty sure it's possible to call a class and its member function dynamically in Delphi, but I can't quite seem to make it work. What am I missing?

        // Here's a list of classes (some code removed for clarity)
        moClassList : TList;
        moClassList.Add( TClassA );
        moClassList.Add( TClassB );

        // Here is where I want to call an object's member function if the
        // object's class is in the list:
        for i := 0 to moClassList.Count - 1 do
          if oObject is TClass(moClassList[i]) then
            with oObject as TClass(moClassList[i]) do
              Foo();

    I get an undeclared identifier error for Foo() at compile time. Clarification/additional information: what I'm trying to accomplish is a change-notification system between business classes. Class A registers to be notified of changes in Class B, and the system stores a mapping of Class A to Class B. Then, when a Class B object changes, the system calls A.Foo() to process the change. I'd like the notification system not to require any hard-coded classes if possible. There will always be a Foo() for any class that registers for notification. Maybe this can't be done, or there's a completely different and better approach to my problem. By the way, this is not exactly an "Observer" design pattern because it's not dealing with objects in memory. Managing changes between related persistent data seems like a standard problem to be solved, but I've not found very much discussion about it. Again, any assistance would be greatly appreciated. Jeff

  • What's the best way to link formal specs to JIRA enhancement requests?

    - by Adam
    What's the best way to link formal specs to JIRA enhancement requests? I want to track changes to specifications using JIRA. Ideally, I'd like to refer to a functional ID reference in a JIRA ticket (e.g. MYAPP > APPROVAL LOGIC > MAIN SCREEN), so that program managers can retrospectively categorise defects. The reason for this is so that QA scripts and documentation tickets can be searched/categorised meaningfully in the tracking system. There seem to be a million possible ways to do this, e.g.:

    - should I write a custom component to select functional IDs from a tree?
    - should I write the specs in Confluence, or another CMS with a TrackBack facility?
    - should I include a link to the documentation URL?
    - should I use some other 3rd-party plugin application?
    - should I use some Atlassian application that I'm unaware of?
    - am I using the wrong tracking tool/process to measure spec growth?

    What's the best way, in your experience?

  • syntax for MySQL INSERT with an array of columns

    - by Mike_Laird
    I'm new to PHP and MySQL query construction. I have a processor for a large form. A few fields are required; most fields are user-optional. In my case, the HTML ids and the MySQL column names are identical. I've found tutorials about using arrays to convert $_POST into the fields and values for INSERT INTO, but I can't get them working, after many hours. I've stepped back to make a very simple INSERT using arrays and variables, but I'm still stumped. The following line works and INSERTs 5 items into a database with over 100 columns. The first 4 items are strings; the 5th item, monthlyRental, is an integer.

        $query = "INSERT INTO `$table` (country, stateProvince, city3, city3Geocode, monthlyRental)
                  VALUES ('$country', '$stateProvince', '$city3', '$city3Geocode', '$monthlyRental')";

    When I make an array for the fields and use it, as follows:

        $colsx = array('country,', 'stateProvince,', 'city3,', 'city3Geocode,', 'monthlyRental');
        $query = "INSERT INTO `$table` ('$colsx')
                  VALUES ('$country', '$stateProvince', '$city3', '$city3Geocode', '$monthlyRental')";

    I get a MySQL error: check the manual that corresponds to your MySQL server version for the right syntax to use near ''Array') VALUES ( 'US', 'New York', 'Fairport, Monroe County, New York', '(43.09)' at line 1. I get this error whether the array items have commas inside the single quotes or not. I've done a lot of reading and tried many combinations and I can't get it. I want to see the proper syntax on a small scale before I go back to foreach expressions to process $_POST, where both the fields and values are arrays. And yes, I know I should use mysql_real_escape_string, but that is an easy later step in the foreach. Lastly, some clues about the syntax for an array of values would be helpful, particularly if it is different from the fields array. I know I need to add a null as the first array item to trigger the MySQL autoincrement id. What else? I'm pretty new, so please be specific.
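
    For reference, interpolating a PHP array directly into a string yields the literal word Array, which is exactly what the quoted error shows. A minimal sketch of one way to build the statement with implode(); the column and variable names are taken from the question, but the surrounding code is an assumption, not the asker's actual form processor:

        // Parallel arrays of column names and values. Column names must
        // never come from user input; only values are escaped.
        $cols = array('country', 'stateProvince', 'city3', 'city3Geocode', 'monthlyRental');
        $vals = array($country, $stateProvince, $city3, $city3Geocode, $monthlyRental);

        // Escape each value and wrap it in quotes, then join with commas.
        $escaped = array_map(function ($v) {
            return "'" . mysql_real_escape_string($v) . "'";
        }, $vals);

        $query = "INSERT INTO `$table` (" . implode(', ', $cols) . ")
                  VALUES (" . implode(', ', $escaped) . ")";

    One side note: when the column list is named explicitly like this, there is no need for a leading NULL; the autoincrement id column can simply be left out of both lists and MySQL will assign it.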

  • Excel: Automating the Selection of an Unknown Number of Cells

    - by user1905080
    I'm trying to automate the formatting of an Excel file with a macro and am seeking a solution. I have two columns titled Last Name and First Name, which I would like to concatenate into a separate column titled Last Name, First Name. This is simple enough when done by hand: create one cell which does this, then drag that cell to include all cells within the range. The problem appears when trying to automate this. Because I can't know the number of names that need to be concatenated ahead of time, I can't automate the selection of cells by dragging. Can you help me automate this? I've tried a process of copying the initial concatenated cell, highlighting the column, and then pasting. I've also tried to use a formula which returns the concatenation only if there is text in the "Last Name" and "First Name" columns. However, in both cases I end up with some 100,000 rows, putting a serious cramp on my ability to manipulate the worksheet. The best solution I can think of is to create concatenations within a fixed range of cells. Although this would create useless cells, at least there wouldn't be 99,900 of them.

  • Why does my binding break down on Silverlight ProgressBars?

    - by Bill Jeeves
    I asked a similar question about charts, but I have given up on that and am using progress bars instead. Essentially, I have ten progress bars in a Silverlight control. Each shows a different value and updates every couple of seconds (it's a process monitor). Each progress bar has the same minimum and maximum value so the bars can be compared. Trying to follow the M-V-VM model, I have bound the value of each bar to a property in my ViewModel. All of the maximum values for the bars are bound to a single property. When the model updates, the values and the maximum can all update. This allows the bars to re-scale as the sizes grow. I'm finding that the binding will sometimes stop working on one or more bars. I suspect it is because a bar's value occasionally becomes higher than the maximum: if I update the maximums first and they are going down, the values will be too high; if I update the values first when the maximum needs increasing, the values are too high again. Is there a way to stop this behaviour? Some way, perhaps, to tell the progress bars that it's OK to temporarily go too high? Or some way to tell the bindings that they shouldn't be disabled when this happens? Or maybe I've got this completely wrong and there's some other issue with ProgressBar binding I don't know about?

  • Where to start with the development of a first database-driven web app? (long question)

    - by Ryan
    Hi all, I've decided to develop a database-driven web app, but I'm not sure where to start. The end goal of the project is three-fold:

    1) to learn new technologies and practices,
    2) to deliver an unsolicited demo to management that shows how information the company stores as office documents, spread across a cumbersome network folder structure, can be consolidated and made easier to access and maintain, and
    3) to show my co-workers how Test-Driven Development and prototyping via class diagrams can be very useful and reduce future maintenance headaches.

    I think this ends up being a basic CMS, for which I have generated a set of features, see below.

    1) Create a database to store the site structure (organized as a tree with a 'project group'-project structure).
    2) Pull the site structure from the database and display it as a tree using basic front-end technologies.
    3) Add administrator privileges/tools for modifying the site structure.
    4) Auto-create required sub-pages when an admin adds a new project. (There will be several sub-pages under each project, and the content for each sub-page is different.)
    5) Add user privileges for assigning read and write privileges to sub-pages.

    What I would like to do is use Test-Driven Development and class diagramming as part of my process for developing this project. My problem: I'm not sure where to start. I have read about unit testing and UML, but have never used them in practice. Also, having never worked with databases before, how do I incorporate these items into the models and test units? Thank you all in advance for your expertise.

  • How do you send email from an IMAP account with PHP?

    - by arthurakay
    I'm having an issue sending email via PHP/IMAP, and I don't know if it's because: I don't correctly understand IMAP, or there's an issue with my server. My application opens an IMAP connection to an email account to read messages in the inbox. It does this successfully. The problem is that I want to send messages from this account and have them show up in the outbox/sent folder. As far as I can tell, the PHP imap_mail() function doesn't in any way hook into the IMAP stream I currently have open. My code executes without throwing an error; however, the email never arrives at the recipient and never appears in my sent folder.

        private function createHeaders()
        {
            return "MIME-Version: 1.0" . "\r\n" .
                   "Content-type: text/html; charset=iso-8859-1" . "\r\n" .
                   "From: " . $this->accountEmail . "\r\n";
        }

        private function notifyAdminForCompleteSet($urlToCompleteSet)
        {
            $message = "
                <p>
                In order to process the latest records, you must visit
                <a href='$urlToCompleteSet'>the website</a> and manually export the set.
                </p>
            ";
            try {
                imap_mail(
                    $this->adminEmail,
                    "Alert: Manual Export of Records Required",
                    wordwrap($message, 70),
                    $this->createHeaders()
                );
                echo(" ---> Admin notified via email!\n");
            } catch (Exception $e) {
                throw new Exception("Error in notifyAdminForCompleteSet()");
            }
        }

    I'm guessing I need to copy the message into the IMAP account manually... or is there a different solution to this problem? Also, does it matter if the domain in the "from" address is different from that of the server on which this script is running? I can't explain why the message is never sent.
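
    Worth noting for readers: IMAP only reads and stores mailboxes; sending happens over SMTP, and a copy normally has to be appended to the Sent folder explicitly. A minimal sketch using imap_append(), assuming it runs inside the same class; $imapStream (the handle from imap_open()) and the mailbox path are assumptions, and Sent folder names vary by server:

        // Build the raw RFC 2822 message: headers, blank line, body.
        // Assumes $imapStream is the connection already opened for reading.
        $raw = "From: " . $this->accountEmail . "\r\n" .
               "To: " . $this->adminEmail . "\r\n" .
               "Subject: Alert: Manual Export of Records Required\r\n" .
               "MIME-Version: 1.0\r\n" .
               "Content-type: text/html; charset=iso-8859-1\r\n" .
               "\r\n" .
               $message;

        // Copy the message into the Sent mailbox and mark it read.
        // "INBOX.Sent" is a common name, but it may be "Sent" or
        // "[Gmail]/Sent Mail" depending on the server.
        imap_append($imapStream, "{imap.example.com:993/imap/ssl}INBOX.Sent", $raw, "\\Seen");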

  • Speed up CSV export from a MySQL database query using PHP

    - by John
    OK, so I've got a web system (built on CodeIgniter and running on MySQL) that allows people to query a database of postal address data by making selections in a series of forms until they arrive at the selection they want; pretty standard stuff. They can then buy that information and download it via the system. The queries run very fast, but when it comes to applying that query to the database and exporting it to CSV, once the datasets get to around the 30,000-record mark (each row has around 40 columns, of which about 20 are all populated with on average 20 chars of data per cell) it can take 5 or so minutes to export to CSV. So, my question is: what is the main cause of the slowness? Is the resultset of data from the query so large that it is running into memory issues, and should I therefore allow much more memory to the process? Or is there a much more efficient way of exporting to CSV from a MySQL query that I'm not doing? Should I save the contents of the query to a temp table and simply export the temp table to CSV? Or am I going about this all wrong? Also, is the fact that I'm using CodeIgniter's Active Record for this prohibitive, due to the way that it stores the resultset? Any advice is welcome! Thank you for reading!
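
    One likely culprit is buffering the full resultset in memory before writing anything, which is what Active Record's result methods do. A minimal sketch of a streaming export with an unbuffered mysqli query and fputcsv(); the credentials, table name and WHERE clause are placeholders, not the asker's schema:

        // MYSQLI_USE_RESULT streams rows from the server instead of
        // buffering all 30,000 of them in PHP memory at once.
        $mysqli = new mysqli('localhost', 'user', 'pass', 'addresses'); // placeholder credentials
        $result = $mysqli->query("SELECT * FROM postal_data WHERE country = 'US'", MYSQLI_USE_RESULT);

        $fh = fopen('export.csv', 'w');
        while ($row = $result->fetch_assoc()) {
            fputcsv($fh, $row);   // handles quoting and escaping per field
        }
        fclose($fh);
        $result->free();

    If the output file can live on the database server itself, SELECT ... INTO OUTFILE is usually faster still, since the data never passes through PHP at all.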

  • Sorting page flow for has_many in Rails

    - by Gareth
    I have a page flow allowing the user to choose an object ("Player") to add to a has_many :players association in another model.

        1 => List existing players for object [Enter player name]
        2 => List of matching players [Select player]
        3 => Confirmation page [Press 'Add']
        4 => Done

    I want users to be able to choose "New Player" instead of selecting a player at step 2, in which case the user will go through the standard New Player process elsewhere on the site. However, after that's done, the user should return to step 3 with the new player in place. I don't know what the best way is to implement this. I don't want to duplicate the player creation code, but I don't want to dirty up the player creation code too much just for this case. I also don't want to start sticking IDs in the session if I can help it. It's fine in simple cases but if the user ever has two windows/tabs then things start behaving badly. What do you think?

  • jQuery not working as expected in Internet Explorer over HTTPS

    - by jat
    When I try to run my code on any web browser apart from Internet Explorer it works fine. But when I run the code in Internet Explorer I do get an alert box saying HERE along with an OK button; the problem is that when I click on that OK button I do not get anything, and ideally I should be getting another alert box.

        <!DOCTYPE html>
        <html>
        <head>
        <script src="js/jquery-1.4.2.min.js"></script>
        <script type="text/javascript">
        $(document).ready(function(){
            $("#submit").click(function(event) {
                alert("here");
                $.post('process.php', {name:'test1', email:'test.com'}, function(data) {
                    $('#results').html(data);
                    alert(data);
                });
            });
        });
        </script>
        </head>
        <body>
        <form name="myform" id="myform" action="" method="POST">
            <label for="name" id="name_label">Name</label>
            <input type="text" name="name" id="name" size="30" value=""/>
            <br>
            <label for="email" id="email_label">Email</label>
            <input type="text" name="email" id="email" size="30" value=""/>
            <br>
            <input type="button" name="submit" id="submit" value="Submit">
        </form>
        <div id="results"></div>
        </body>
        </html>

    Any help on this is very much appreciated. Edit: I found out that it works perfectly fine in Internet Explorer over HTTP, but not in Internet Explorer over HTTPS.

  • Vector usage in MPI (C++)

    - by lsk1985
    I am new to MPI programming and still learning; I was successful up to creating derived datatypes by defining structures. Now I want to include a vector in my structure and send the data across the processes. For example:

        struct Structure {
            // Constructor
            Structure() : X(nodes), mass(nodes), ac(nodes)
            {
                // code to calculate the mass and accelerations
            }

            // Destructor
            ~Structure() {}

            // Variables
            double radius;
            double volume;
            vector<double> mass;
            vector<double> area;
            // and some other variables

            // Methods to calculate some physical properties
        };

    Now, using MPI, I want to send the data in the structure across the processes. Is it possible for me to create the MPI_Type_struct with the vectors included and send the data? I tried reading through forums, but I am not able to get a clear picture from the responses given there. I hope I can get a clear idea or approach to send the data. PS: I can send the data individually, but it's an overhead to send the data using many MPI_Send/Receive calls if we consider the domain to be very large (say 10000*10000).

  • detection of 'flush tables with read lock' in php

    - by theduke0
    I would like to know from my application whether a MyISAM table can accept writes (i.e. is not locked). If an exception is thrown, everything is fine, as I can catch it and log the failed statement to a file. However, if a FLUSH TABLES WITH READ LOCK command has been issued (possibly for backup), the query I send will pretty much hang forever. If one table is locked at a time, INSERT DELAYED works well, but when this global lock is applied my query just waits. The query I run is an INSERT statement; if it fails or hangs, user experience is degraded. I need a way to send the query to the server and forget about it (pretty much). Does anyone have any suggestions on how to deal with this?

    - set a query timeout?
    - run an asynchronous request and allow the lock to expire while the application continues?
    - fork my PHP process?

    Please let me know if I can provide any clarification or details.
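
    On the asynchronous idea: mysqli can fire a non-blocking query and poll for the result, so the page can move on if the lock hasn't cleared. A rough sketch, with the connection details and fallback logging assumed, not taken from the asker's code:

        // Fire the INSERT without blocking; MYSQLI_ASYNC returns immediately.
        // Requires the mysqlnd driver.
        $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');   // placeholder credentials
        $mysqli->query("INSERT INTO mytable (col) VALUES ('val')", MYSQLI_ASYNC);

        // Poll for up to 2 seconds. If the global read lock is held, the
        // query simply isn't ready yet, and we log it for later replay.
        $links = $errors = $reject = array($mysqli);
        if (mysqli_poll($links, $errors, $reject, 2) > 0) {
            $mysqli->reap_async_query();   // collect the result (or the error)
        } else {
            error_log("INSERT still blocked after 2s; queued to file instead");
        }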

  • lightbox dynamic image retrieval

    - by GSTAR
    I am constructing a lightbox gallery, currently experimenting with FancyBox (http://fancybox.net) and ColorBox (http://colorpowered.com/colorbox). By default you have to include a link to the large version of the image so that the lightbox can display it. However, I want the image link URLs to point at a script rather than directly at the image file. So for example, instead of:

        <a href="mysite/images/myimage.jpg">

    I want to do:

        <a href="mysite/photos/view/abc123">

    The above URL points to a function:

        public function actionPhotos($view)
        {
            $photo = Photo::model()->find('name=:name', array(':name'=>$view));
            if (!empty($photo)) {
                $user = $photo->user;
                $this->renderPartial('_photo', array('user'=>$user, 'photo'=>$photo, true));
            }
        }

    At some point in the future the function will also update the view count of the image. Now this approach works to an extent: most images load up, but some do not (the lightbox is displayed in a malformed state). I think this is because it is not processing the function quickly enough. For example, when I click the "next" button it needs to go to the URL, process the function and retrieve the output. Does anybody know how I can get this working properly?
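
    One detail that often breaks lightboxes fed by a script: the response must look like an image (or HTML the lightbox expects), with the right Content-Type. If the goal is to serve the image bytes themselves from the script URL, a minimal framework-agnostic sketch; the route parameter, storage path and the view-counter hook are assumptions:

        <?php
        // photos/view/abc123 -> stream the image with the right MIME type.
        $name = basename($_GET['view']);                  // assumed route parameter
        $path = "/var/www/mysite/images/{$name}.jpg";     // assumed storage layout

        if (is_file($path)) {
            // A hypothetical view-counter update would go here, before output.
            header('Content-Type: image/jpeg');
            header('Content-Length: ' . filesize($path));
            readfile($path);                              // stream the file to the browser
        } else {
            http_response_code(404);
        }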

  • VS2010 + IE8 Debugging woes - Element not found

    - by Chin
    I am having great difficulty trying to debug with VS2010 and IE8, though I think the problem is more IE8-specific. When starting a debug session, 9 times out of 10 I will have the following problem: the IE tab says "Connecting...", and then after a 5-second wait I get an error in VS saying "element not found". Even when I click OK to dismiss the error, the IE window still shows "Connecting...". I then have to kill the IE process to be able to close IE and try again. Sometimes, however, I am lucky and it starts. But the whole thing is so random I have no clue where to start. One thing I have noticed is that I always have 2 IE processes started even though there is only one window open; one has a small footprint of 100k, so I presume it is some kind of helper. I am using a static port with the built-in WebDev server. If anyone has had similar problems please let me know how you resolved them. It's driving me nuts! Thanks

  • problem after uploading file to web host

    - by Alexander
    I have a .xml file in my App_Data folder. I can access it fine on my localhost; however, after uploading it to my web host I got the following:

        ASP.NET is not authorized to access the requested resource. Consider granting access rights to
        the resource to the ASP.NET request identity. ASP.NET has a base process identity (typically
        {MACHINE}\ASPNET on IIS 5 or Network Service on IIS 6) that is used if the application is not
        impersonating. If the application is impersonating via <identity impersonate="true"/>, the
        identity will be the anonymous user (typically IUSR_MACHINENAME) or the authenticated request
        user. To grant ASP.NET access to a file, right-click the file in Explorer, choose "Properties"
        and select the Security tab. Click "Add" to add the appropriate user or group. Highlight the
        ASP.NET account, and check the boxes for the desired access.

    So which account should I grant read access to? There are 3 users on my web host and all seem to have read access set: NETWORK_SERVICE, IUSR_MACHINENAME and my user. So what is wrong?

  • deployd authentication using jQuery ajax

    - by user2507987
    I have installed deployd on my Debian 7.0.0 64-bit server and have also successfully installed MongoDB on it. I have created some collections and a users collection in the deployd dashboard. Then, following the user guide on how to connect and query the tables in deployd, I chose jQuery ajax to log in to deployd from my localhost site; after login succeeds I try to get/post some data, but somehow deployd returns access denied. I have created a collection named people, and then on the GET, POST and PUT events I have written this code:

        cancelUnless(me, "You are not logged in", 401);

    Then, using this ajax code, I try to log in and POST new people data:

        $(document).ready(function(){
            /* Create query for username and password for login */
            var request = new Object;
            request.username = 'myusername';
            request.password = 'mypassword';
            submitaddress = "http://myipaddress:myport/users/login";
            $.ajax({
                type: "POST",
                url: submitaddress,
                data: request,
                cache: false,
                success: function(data){
                    var returndata = eval(data);
                    /* After login success, try to post people data */
                    if (returndata){
                        var request2 = new Object;
                        request2.name = 'People Name';
                        submitaddress2 = "http://myipaddress:myport/people";
                        $.ajax({
                            type: "POST",
                            url: submitaddress2,
                            data: request2,
                            cache: false,
                            success: function(){
                            }
                        });
                    }
                }
            });
        });

    The login process succeeds; it returns a session id and my user id. But after login succeeds and I try to POST the people data, it returns "You are not logged in". Can anyone help me? What is the correct way to log in to deployd using jQuery from another website (cross-domain)?

  • jQuery/AJAX: Loading external DIVs using dynamic content

    - by ticallian
    I need to create a page that will load divs from an external page using jQuery and AJAX. I have come across a few good tutorials, but they are all based on static content; my links and content are generated by PHP. The main tutorial I am basing my code on is from: http://yensdesign.com/2008/12/how-to-load-content-via-ajax-in-jquery/ The exact function I need is as follows:

    - The main page contains a permanent div listing some links containing a parameter.
    - Upon click, the link passes its parameter to the external page.
    - The external page filters a recordset against the parameter and populates a div with the results.
    - The new div contains a new set of links with new parameters.
    - The external div is loaded underneath the main page's first div.
    - The process can then be repeated, creating a chain of divs under each other.
    - The last div in the chain will then direct to a new page collating all the previously used querystrings.

    I can handle all of the PHP work of populating the divs on the main and external pages. It's the jQuery and AJAX part I'm struggling with.

        $(document).ready(function(){
            var sections = $('a[id^=link_]');      // links that pass a parameter to the external page
            var content = $('div[id^=content_]');  // where the external div is loaded
            sections.click(function(){
                // load selected section
                switch(this.id){
                    case "div01":
                        content.load("external.php?param=1 #section_div01");
                        break;
                    case "div02":
                        content.load("external.php?param=2 #section_div02");
                        break;
                }
            });
        });

    The problem I am having is getting jQuery to pass the dynamically generated parameters to the external page and then retrieve the new div. I can currently only do this with static links (as above).

  • WINDOWS: Your computer hangs. You can use Windows + R (run dialog) but performance is so halted that taskMGR is unusable

    - by John Sullivan
    The question is: what processes are available to try to recover from total system instability, before pulling the plug, when we can do nothing but run programs or batches from the run dialog (Windows + R key), and performance is so dead that taskMGR / procEXP / other programs with visual GUIs are not usable? I am not a Windows expert, but ideally someone out there has written a program that does more or less this: immediately set its priority to extremely high (or perhaps I can set that from the run prompt), then evaluate performance bottlenecks. E.g. is CPU at 100%? If so, identify the offending program(s) or problems. Attempt and log fixes, then provide crude feedback asking the user if performance has stabilized enough to abort; wait a few seconds; if no feedback, continue; etc. Eventually try to do any "system cleanup" if the program decides it cannot recover, and perhaps finally provide a series of beeps to the user, or what have you, to say "OK, I give up, time to pull the plug". Ideally, create a log when able. These kinds of horrible hangs are a situation where surely trying something, anything, is better than nothing, as long as that something is intelligent, when the alternative is ripping out the power cord. Again, I am not a Windows expert, so perhaps there is a much more elegant "hands on" approach I am not aware of.

  • Apache's AuthDigestDomain and Rails Distributed Asset Hosts

    - by Jared
    I've got a server I'm in the process of setting up and I'm running into an Apache configuration problem that I can not get around. I've got Apache 2.2 and Passenger serving a Rails app with distributed asset hosting. This is the feature of Rails that lets you serve your static assets from assets0.example.com, assets1, assets2, and so on. The site needs to be passworded until launch. I've set up HTTP authentication on the site using Apache's mod_auth_digest. In my configuration I'm attempting to use the AuthDigestDomain directive to allow access to each of the asset URLs. The problem is, it doesn't seem to be working. I get the initial prompt for the password when I load the page, but then the first time it loads an asset from one of the asset URLs, I get prompted a 2nd, 3rd, or 4th time. In some browsers, I get prompted for every single resource on the page. I'm hoping that this is only a problem of how I'm specifying my directives and not a limitation of authorization in Apache itself. See the edited auth section below:

        <Location />
            AuthType Digest
            AuthName "Restricted Site"
            AuthUserFile /etc/httpd/passwd/passwords
            AuthGroupFile /dev/null
            AuthDigestDomain / http://assets0.example.com/ http://assets1.example.com/ http://assets2.example.com/ http://assets3.example.com/
            require valid-user
            order deny,allow
            allow from all
        </Location>

  • AJAX Uploading - Not waiting for response before continuing

    - by waxical
    I'm using Blueimp's jQuery uploader (very good it is too, btw) and an S3 handler to upload files and then transfer them to S3 via the S3 API (from the PHP SDK). It works. The problem is, on large files (1GB) it can take anything up to a few minutes to transfer (via create_object) onto S3. The PHP file that does this is hung up until the process is complete. The problem is, the uploader (which utilises the jQuery ajax method) seems to give up waiting and start again every time. I thought this was related to 'max_input_time' in php.ini or such, as it seemed to wait around 60 seconds, though this now appears to vary. I have upped max_input_time and related settings in php.ini, but got no further. I've also considered (the more likely option) that JS, either in the script or the jQuery method, has a timeout. The developer (blueimp) has said there's no such timeout in the front-end script, nor have I seen any; and though 'timeout' is referenced in the jQuery ajax method options, it seems to affect the entire upload time rather than the wait for a response, so that's not much use. Any help or guidance gratefully received.
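
    One way to rule out client-side timeouts entirely is to answer the upload request as soon as the file lands on the server, and only then push it to S3, so the browser never waits on the transfer. A rough sketch, assuming PHP-FPM (for fastcgi_finish_request()) and the SDK v1 create_object() call mentioned above; $s3, $bucket, $key and $tmpFile are assumed to be set up by the upload handler:

        // Acknowledge the upload immediately so the jQuery uploader gets
        // its response; fastcgi_finish_request() requires PHP-FPM.
        header('Content-Type: application/json');
        echo json_encode(array('status' => 'received'));
        fastcgi_finish_request();   // the browser stops waiting here

        // Now do the slow part; the client connection is already closed.
        ignore_user_abort(true);
        set_time_limit(0);
        $s3->create_object($bucket, $key, array(
            'fileUpload' => $tmpFile,   // path of the uploaded file (assumed)
        ));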

  • How can I quickly parse large (>10GB) files?

    - by Andrew
    Hi, I have to process text files 10-20GB in size with the format:

        field1 field2 field3 field4 field5

    I would like to parse the data from field2 of each line into one of several files; which file it gets pushed into is determined line-by-line by the value in field4. There are 25 different possible values in field4 and hence 25 different files the data can get parsed into. I have tried using Perl (slow) and awk (faster but still slow); does anyone have any suggestions or pointers toward alternative approaches? FYI, here is the awk code I was trying to use; note I had to revert to going through the large file 25 times because I wasn't able to keep 25 output files open at once in awk:

        chromosomes=(1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25)
        for chr in ${chromosomes[@]}
        do
            awk < my_in_file_here -v pat="$chr" '{if ($4 == pat) for (i = $2; i <= $2+52; i++) print i}' >> my_out_file_"$chr".query
        done
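
    Since the 25 passes over the file are what hurt, a single pass that keeps one output handle per field4 value may help; most scripting runtimes allow far more than 25 open files. A sketch of that idea in PHP, as one alternative; the file names mirror the awk script, and "field4 takes the values 1-25" is assumed from the loop above:

        <?php
        // Single-pass split: one output handle per field4 value, so the
        // 10-20GB input is read only once.
        $in = fopen('my_in_file_here', 'r');
        $out = array();

        while (($line = fgets($in)) !== false) {
            $f = preg_split('/\s+/', trim($line));
            if (count($f) < 4) continue;          // skip malformed lines
            $key = $f[3];                          // field4 selects the output file
            if (!isset($out[$key])) {
                $out[$key] = fopen("my_out_file_{$key}.query", 'a');
            }
            // Same expansion as the awk script: print field2 .. field2+52.
            for ($i = (int)$f[1]; $i <= (int)$f[1] + 52; $i++) {
                fwrite($out[$key], $i . "\n");
            }
        }
        fclose($in);
        foreach ($out as $h) fclose($h);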

  • Using LINQ, need help splitting a byte array on data received from Silverlight sockets

    - by gcadmes
    The message packets received contain multiple messages delineated by a header (0xFD) and a footer (0xFE).

        // sample message packet with three
        // different size messages
        List<byte> receiveBuffer = new List<byte>();
        receiveBuffer.AddRange(new byte[] { 0xFD, 1, 2, 0xFE,
                                            0xFD, 1, 2, 3, 4, 5, 6, 7, 8, 0xFE,
                                            0xFD, 33, 65, 25, 44, 0xFE });

        // note: this sample code is without synchronization statements,
        // error handling... etc.
        while (receiveBuffer.Count > 0)
        {
            var bytesInRange = receiveBuffer.TakeWhile(n => n != 0xFE);
            foreach (var n in bytesInRange)
                Console.WriteLine(n);

            // process message..
            // 1) remove bytes read from receive buffer
            // 2) construct message object...
            // 3) etc...
            receiveBuffer.RemoveRange(0, bytesInRange.Count());
        }

    As you can see, (including header/footer) the first message in this packet contains 4 bytes, the 2nd message contains 10 bytes, and the 3rd message contains 6 bytes. In the while loop, I was expecting the TakeWhile to add the bytes that did not equal the footer part of the message. Note: since I am removing the bytes after reading them, the header can always be expected to be at position 0. I searched examples for splitting byte arrays, but none demonstrated splitting on arrays of unknown and fluctuating sizes. Any help will be greatly appreciated. Thanks much!

  • VB.Net Memory Issue

    - by Skulmuk
    We have an application that has some interesting memory usage issues. When it first opens, the program uses around 50-60MB of memory. This stays consistent on 32-bit machines. On 64-bit machines, however, re-activating the form in any way (clicking, dragging, alt-tabbing, etc.) adds around another 50MB to its memory usage. It repeats this process several times before resetting back to around 45MB, at which point the cycle begins again. I've done some research, and a lot of people have said that VB in general has pretty poor garbage collection, which could be affecting the software in some way. However, I've yet to find a solution. There are no events fired when the application is activated (as shown by the 32-bit usage); the application is merely sitting awaiting the user's actions. At load, the system pulls some data into a tree view, but that's the only external connection, and it only re-fires the routine when the user makes a change to something and saves the change. Has anyone else experienced anything this strange, and if so, does anyone know what might fix it? It seems strange that it only occurs under x64 systems. Thanks
