Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • How do I gather data from the same column in multiple worksheets in a single workbook?

    - by infiniteloop91
    Okay, so here is what I want to accomplish. For this example I have a single workbook composed of 4 data sheets plus a totals sheet. Each of the 4 data sheets has a similar name following the same pattern, where the only difference is the date (e.g. 9854978_1009_US.txt, where 1009 is the date that changes while the rest of the sheet name stays the same). In each of those sheets, column F contains a series of numbers that I would like to sum, but I will have no idea how many cells in F actually contain numbers. (However, there will never be additional information below the numbers, so I could in theory just add the entire F column together.) I will also add new sheets to the workbook over time and do not want to have to rewrite the code with which I gather my data from column F. Essentially, I would like the 'totals' sheet to take every column F from the sheets in the workbook whose names match '9854978_????_US.txt', where the question marks change with the date. How would I go about doing this in pure Excel code?
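
    One possible pure-Excel approach, sketched under the assumption that the data sheets can be kept next to each other in the tab order (the sheet names below are only illustrative): a 3-D reference sums column F across every sheet between two bookends, selecting sheets by position rather than by name pattern.

        =SUM('9854978_1009_US.txt:9854978_1031_US.txt'!F:F)

    Any new day's sheet inserted between the first and last bookend sheet becomes part of the total without editing the formula; a sheet placed outside that range would be missed.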

    Read the article

  • How best to organize project folders for unit tests in .NET?

    - by Dan Bailiff
    So I'm trying to introduce unit testing to my group. I've successfully upgraded a VS'05 web site project to a VS'08 web application, and now have a solution with the web app project and a unit test project. The issue now is how to fit this back into the source repository such that we don't break the build system and the unit test projects are persisted as well. Right now we have something like this:

        c:\root
        c:\root\projectA
        c:\root\projectB
        c:\root\projectC

    where projectA contains the sln file and all other related files/folders for the project. Now I have this new solution that looks like this:

        c:\root\projectA                 (parent folder)
        c:\root\projectA\projectA        (the production code project)
        c:\root\projectA\projectA_Test   (the unit test project)
        c:\root\projectA\TestResults
        c:\root\projecta\projectA.sln

    How do I integrate this new structure back into the code repository? I'd really prefer to keep the production code folder where it was in the source repository for the sake of the build, but is this necessary? If I keep the production code project in its usual place, then where do I keep my unit test projects, and how do I connect them with a sln file? Is it better to use this new structure and adjust the build process? I'd love to hear how other people are dealing with this issue of upgrading legacy projects to unit testing.

    Read the article

  • Usage of setInfoClass() on DirectoryIterator vs on RecursiveDirectoryIterator

    - by Gordon
    I've run into some inconsistent behavior when using setInfoClass() to set a custom SplFileInfo class on a DirectoryIterator versus on a RecursiveDirectoryIterator. The method description states: "Use this method to set a custom class which will be used when getFileInfo and getPathInfo are called. The class name passed to this method must be derived from SplFileInfo." Consider this custom SplFileInfo:

        class A extends SplFileInfo
        {
            public function test()
            {
                printf("I am of class %s\n", __CLASS__);
            }
        }

    and my iterators

        $iterator = new DirectoryIterator('.');

    and

        $iterator = new RecursiveDirectoryIterator('.');

    Now I'd expect those two to behave the same when I do

        $iterator->setInfoClass('A');
        foreach ($iterator as $file) {
            $file->test();
        }

    and output "I am of class A" for each $file encountered, and in fact the RecursiveDirectoryIterator will do that. But the DirectoryIterator will raise

        Fatal error: Call to undefined method DirectoryIterator::test()

    so apparently the info class does not get applied when iterating over the files. At least not directly, because when I change the code in the foreach loop to

        $file->getPathInfo()->test();

    it will work for the DirectoryIterator. But then the RecursiveDirectoryIterator will raise

        Fatal error: Call to undefined method SplFileInfo::test()

    Like I said, I'd expect those two to behave the same, but apparently getFileInfo and getPathInfo don't get called in the DirectoryIterator, which I consider a bug. So if there are any iterator experts out there, please help me understand this. Thanks.
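
    A hedged workaround sketch (not an explanation of the underlying difference): construct the info class yourself from each entry's pathname, which behaves identically for both iterators and sidesteps setInfoClass() entirely.

        $iterator = new DirectoryIterator('.');    // or new RecursiveDirectoryIterator('.')
        foreach ($iterator as $entry) {
            if ($entry->isDot()) {
                continue;
            }
            $file = new A($entry->getPathname());  // class A from the question
            $file->test();                         // "I am of class A" in both cases
        }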

    Read the article

  • TPL - Using static method vs struct method

    - by Sunit
    I have about 1500 files on a share for which I need to collect the FileVersionInfo string. So I created a static method in my gateway like this:

        private static string GetVersionInfo(string filepath)
        {
            FileVersionInfo verInfo = FileVersionInfo.GetVersionInfo(filepath);
            return string.Format("{0}.{1}.{2}.{3}", verInfo.ProductMajorPart, verInfo.ProductMinorPart,
                verInfo.ProductBuildPart, verInfo.ProductPrivatePart).Trim();
        }

    and then used the FilenameAndVersion struct in a PLINQ call with a degree of parallelism of 20, as this is I/O related:

        resultList = dllFilesRows.AsParallel().WithDegreeOfParallelism(20)
            .Select(r =>
            {
                var symbolPath = r.Filename;
                return new FilenameAndVersion { Filename = symbolPath, Version = GetVersionInfo(symbolPath) };
            })
            .ToArray();

    Later I modified the struct FilenameAndVersion as:

        private struct FilenameAndVersion
        {
            private string _version, _filename;

            public string Version { get { return _version; } }
            public string Filename { get { return _filename; } }

            public void SetVersion()
            {
                FileVersionInfo verInfo = FileVersionInfo.GetVersionInfo(this.Filename);
                this._version = string.Format("{0}.{1}.{2}.{3}", verInfo.ProductMajorPart, verInfo.ProductMinorPart,
                    verInfo.ProductBuildPart, verInfo.ProductPrivatePart).Trim();
            }

            public FilenameAndVersion(string filename, string version)
            {
                this._filename = filename;
                this._version = string.Empty;
                SetVersion();
            }
        }

    and used it:

        resultList = dllFilesRows.AsParallel().WithDegreeOfParallelism(20)
            .Select(r =>
            {
                var symbolPath = r.Filename;
                return new FilenameAndVersion(symbolPath, String.Empty);
            })
            .ToArray();

    The question is: is this going to help me in any way, and is it a good pattern to use?

    Sunit

    Read the article

  • Most Efficient way to deal with multiple CCSprites?

    - by nardsurfer
    I have four different types of objects within my (Box2D) environment, each with multiple instances, and I would like to find the most efficient way to deal with adding and manipulating all the CCSprites. The sprites all come from different files, so would it be best to create each sprite and add it to a data structure (NSMutableArray), or should I use a CCSpriteBatchNode even though the CCSprite file is different for each type of object? Thanks.

        @interface LevelScene : CCLayer
        {
            b2World *world;
            GLESDebugDraw *m_debugDraw;
            CCSpriteBatchNode *ballBatch;
            CCSpriteBatchNode *blockBatch;
            CCSpriteBatchNode *springBatch;
            CCSprite *goal;
        }

        +(id) scene;

        // adds a new sprite at a given coordinate
        -(void) addNewBallWithCoords:(CGPoint)p;

        // loads the objects (blocks, springs, and the goal), returns the Level object
        -(Level) loadLevel:(int)level;

        @end

    or using NSMutableArray objects within the Level object:

        @interface zLevel : zThing
        {
            NSMutableArray *springs;
            NSMutableArray *blocks;
            NSMutableArray *balls;
            zGoal *goal;
            int levelNumber;
        }

        @property(nonatomic,retain) NSMutableArray *springs;
        @property(nonatomic,retain) NSMutableArray *blocks;
        @property(nonatomic,retain) NSMutableArray *balls;
        @property(nonatomic,retain) zGoal *goal;
        @property(nonatomic,assign) int levelNumber;

        -(void)initWithLevel:(int)level;
        -(void)loadLevelThings;

        @end

    Read the article

  • PHP include taking too long

    - by wxiiir
    I have a PHP file of around 100 MB which is full of arrays (only arrays). I've made a script that includes this file (for processing); first it exhausted the default XAMPP 128 MB memory limit, and after I raised it to 1024 MB it just takes forever and doesn't do anything. I'm sure the problem is caused by the sheer size of the file, because I've tried removing all other lines of code and leaving just the include plus an echo so I know when it finishes executing, and it does the same thing (takes forever); I've also tried to run the 100 MB file on its own, same thing again. A 10 MB file takes forever as well, but a similar 1 MB file is read and executed almost instantly, so the problem must be more than just the file size. I was avoiding using C++ for a simple project like this and would rather not, as PHP is easier for me and the task doesn't need the added speed it would gain from C++, but if I have no luck solving this problem I guess I'll have to.

    EDIT - Reasons for not using a database:

    1 - Whoever made it didn't use a database, and it will be pretty hard to store this in an organized database if I'm not able to do something with it first, like just reading it, copying parts from it, or putting it in memory.

    2 - I don't have experience working with databases, as pretty much all the PHP work I've ever done didn't need large amounts of stored data, 50 KB at best. If I were planning a big project or huge chunks of data like this one, I definitely would use one, but I didn't make this mess to start with and now I have to undo it.

    3 - The logic of having to store a small portion of data like 10 MB on the hard drive, when every computer now has enough RAM to fit the whole OS in it, is pretty much incomprehensible to me unless someone gives a good explanation for it. If I had to access a lot of such files simultaneously I would understand, but like I said, this is a simple project: this is the only file that will be accessed at a given time, it isn't even for a website, it's to run a few times and be done with it.
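
    One workaround worth trying, sketched here with hypothetical file and variable names: convert the array file to JSON once, then load the JSON instead of include()-ing 100 MB of PHP source, since json_decode() skips the PHP tokenizer/compiler entirely. The conversion step still pays the slow include, but only once.

        <?php
        // one-off conversion, run a single time
        include 'bigdata.php';                                   // the huge file defining $data
        file_put_contents('bigdata.json', json_encode($data));

        // from then on, load the data like this instead of include()
        $data = json_decode(file_get_contents('bigdata.json'), true);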

    Read the article

  • File uploads simply do not work - what could be wrong server side?

    - by vanneto
    This has been grinding my gears for at least a week now. I have a site which has a crucial function - the upload component. Without it, the site is completely useless. Now, lots of users have problems uploading files. This is why I implemented a log system that keeps track of what happens once the file is on the server. Problem is, only a minority of the problems happen when the file is on the server; most happen client side. For example, when I tried SWFUpload the errors were 'I/O Error'. When I changed to Uploadify I get 'HTTP Error'. I am trying to get a more detailed error description as I type this. I am starting to think it's not the client or the upload software, but the server. What could be wrong? The following PHP directives are set:

        upload_max_filesize  200M
        post_max_size        200M
        memory_limit         256M
        max_execution_time   4200
        max_input_time       4200

    I simply have no clue why file uploads are failing. They should not fail. I would really appreciate any answers as to why the uploads could be failing. Thank you.

    Read the article

  • PC to Macbook Pro Transition - Getting (re)started?

    - by Torus Linvald
    I'm in my second computer science course right now. I've enjoyed programming so far, but really have just scraped my way by. I've not done much programming outside of required class work. For similar reasons, I never really invested in downloading/learning software to help me program (IDEs, editors, compilers, etc.). I know it sounds tedious, but my current setup is: Notepad++ for coding; FileZilla to transfer .cpp and .h files to the school's aludra/Unix machine for compiling; Unix tells me where my bugs are and I go back to Notepad++ to debug; repeat until done. This isn't fun - and I know it could be easier. But I put it off knowing that I was soon going to switch to a Mac. And, tomorrow, I'm switching. So... How should I set up my MacBook for the best programming experience? What IDEs, editors, debuggers and so on should I download? How will Mac programming differ from PC? I'm open to all ideas and comments, even the most basic. (Background - I'm learning/programming in C++ right now. Next semester, my classes switch to Java. I'm also going to take a class in web development, with HTML/CSS/JavaScript/PHP. My new laptop will be a late 2009 MacBook Pro with Leopard, or maybe Snow Leopard. Free would be preferable for all programs.) Thank you all.

    Read the article

  • php - using file() incrementally?

    - by NeedBeerStat
    I'm not sure if this is possible; I've been googling for a solution. Essentially, I have a very large file whose lines I want to store in an array. So I'm using file(), but is there a way to do that in batches - so that after every, say, 100 lines it creates, it "pauses"? I think there's likely something I can do with a foreach loop, but I'm not sure I'm thinking about it the right way. Something like:

        $i = 0;
        $j = 0;
        $throttle = 100;
        foreach ($files as $k => $v) {
            if ($i < $j + $throttle && $i > $j) {
                $lines[] = file($v);
                // Do some other stuff, like importing into a db
            }
            $i++;
            $j++;
        }

    But I think that won't really work, because $i and $j will always be equal. Anyway, I'm feeling muddled... Can someone help me think a little clearer?
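
    A minimal sketch of one way to process the file in fixed-size batches without file() (file name and batch handling are illustrative): read line by line with fgets() and act on every 100 lines, so only one batch is ever held in memory.

        <?php
        $handle = fopen('bigfile.txt', 'r');          // hypothetical file name
        $batch = array();
        while (($line = fgets($handle)) !== false) {
            $batch[] = rtrim($line, "\r\n");
            if (count($batch) >= 100) {
                // do some other stuff, like importing the batch into a db
                $batch = array();                     // start the next batch
            }
        }
        if ($batch) {
            // handle the final, partial batch here
        }
        fclose($handle);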

    Read the article

  • Developers: How does BitLocker affect performance?

    - by Chris
    I'm an ASP.NET / C# developer. I use VS2010 all the time. I am thinking of enabling BitLocker on my laptop to protect the contents, but I am concerned about performance degradation. Developers who use IDEs like Visual Studio are working on lots and lots of files at once. More than the usual office worker, I would think. So I was curious if there are other developers out there who develop with BitLocker enabled. How has the performance been? Is it noticeable? If so, is it bad? My laptop is a 2.53GHz Core 2 Duo with 4GB RAM and an Intel X25-M G2 SSD. It's pretty snappy but I want it to stay that way. If I hear some bad stories about BitLocker, I'll keep doing what I am doing now, which is keeping stuff RAR'ed with a password when I am not actively working on it, and then SDeleting it when I am done (but it's such a pain).

    Read the article

  • Why isn't this json/jquery function returning 10 results and why does it return 0 sometimes?

    - by Scarface
    Hey guys, quick question. I have a JSON function returning results from a PHP query, but it is not working out because it seems to keep executing the query until all results are returned, and sometimes it returns 0 results (if I keep refreshing the page, it returns 0 results sometimes). I want to limit the results returned to 10, so I was wondering if anyone could point out the error in my code. Would really appreciate it. P.S. I am positive the query on its own returns the correct results, and only 10 of them.

        SELECT time, user, comment FROM comments
        WHERE topic_id='$topic_id'
        ORDER BY time DESC LIMIT 10

        var count = 0;

        function prepare(response) {
            count++;
            var string = '<li class="list" id="list-' + count + '">'
                + '</li>';
            return string;
        }

        $.getJSON(files + "comments.php?action=view&load=initial&topic_id=" + topic_id + "&t=" + (new Date()), function(json) {
            if (json.length) {
                for (i = 0; i < json.length; i++) {
                    $('#shoutbox-list').prepend(prepare(json[i]));
                    $('#list-' + count).fadeIn(1500);
                }
                var j = i - 1;
                lastTime = json[j].time;
            }
        });

    Read the article

  • How to use git to manage one codebase but have different environments

    - by emostar
    I'm using git for a personal project at the moment and have run into the problem of having one codebase for two different environments; I was wondering what the cleanest way to use git would be.

    Main Desktop - I use this machine for most of my development. I have a git repository here that I cloned off of an empty repository on my internal server. I do most of my work here and push back to the internal server, so I can use that as the master of truth and to ease making backups.

    Laptop - I sometimes want to code on the road, so I did a clone from the internal server and created a new branch called "laptop-branch". Unfortunately, some directories and the MSVC++ version are different from the Main Desktop environment. I just modified the files in the "laptop-branch" and committed them there.

    Now I did a lot of changes while on vacation with my laptop and want to push them to origin, but I don't want the changes related to directories and compiler versions to be pushed back to origin. What would be the best way to get this done?
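
    One possible approach, sketched with placeholder commit ids: keep the environment-only tweaks as their own commits on laptop-branch, then move just the real work over to the branch you push.

        # on the laptop, after committing the vacation work separately from the
        # directory/compiler tweaks:
        git checkout master
        git cherry-pick <sha-of-first-real-change>   # repeat for each commit that isn't env-specific
        git cherry-pick <sha-of-next-real-change>
        git push origin master

    Longer term, it may be worth keeping machine-specific settings (paths, compiler versions) out of tracked files altogether, for example in a small untracked local config file that each machine supplies.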

    Read the article

  • Speed of Synchronization vs Normal

    - by Swaranga Sarma
    I have a class written for a single thread, with no methods being synchronized:

        class MyClass implements MyInterface {
            // interface implementation methods, not synchronized
        }

    But we also needed a synchronized version of the class. So we made a wrapper class that implements the same interface but has a constructor that takes an instance of MyClass. Any call to the methods of the synchronized class is delegated to the instance of MyClass. Here is my synchronized class:

        class SynchronizedMyClass implements MyInterface {
            // the constructor
            public SynchronizedMyClass(MyInterface i /* this is actually an instance of MyClass */)
            // interface implementation methods; all synchronized; all delegated to the MyInterface instance
        }

    After all this I ran numerous test runs with both classes. The tests involve reading log files and counting URLs in each line. The problem is that the synchronized version of the class consistently takes less time for the parsing. I am using only one thread for the tests, so there is no chance of deadlocks, race conditions, etc. Each log file contains more than 5 million lines, which means calling the methods more than 5 million times. Can anyone explain why the synchronized version of the class might be taking less time than the normal one?
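
    One thing worth ruling out before looking for a deeper cause: measurement effects such as JIT warm-up, class loading, and the OS file cache favour whichever variant runs second or in a later run. A rough sketch of a more even-handed comparison, using a hypothetical parseLog() helper that wraps the existing log-counting call:

        MyInterface plain = new MyClass();
        MyInterface sync  = new SynchronizedMyClass(new MyClass());

        parseLog(plain);   // warm-up: JIT-compiles both paths and pulls the file into the OS cache
        parseLog(sync);

        for (int round = 0; round < 5; round++) {
            long t0 = System.nanoTime();
            parseLog(plain);
            long t1 = System.nanoTime();
            parseLog(sync);
            long t2 = System.nanoTime();
            System.out.printf("round %d: plain=%d ms, synchronized=%d ms%n",
                    round, (t1 - t0) / 1000000, (t2 - t1) / 1000000);
        }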

    Read the article

  • How to display or view encrypted data in encrypted form?

    - by Brian Gianforcaro
    In the Wikipedia article on block cipher modes there is a neat little diagram of an unencrypted image, the same image encrypted using ECB mode, and another version of the same image encrypted using another mode. At university I have developed my own implementation of DES (you can find it here) and we must demo our implementation in a presentation. I would like to display a similar example as shown above using our implementation. However, most image files have header blocks associated with them which, when the file is encrypted with our implementation, also get encrypted, so when you go to open them in an image viewer they are assumed to be corrupted and can't be viewed. I was wondering if anybody knew of a simple header-less image format we could use to display these, or if anyone had any ideas as to how the original creator of the images above achieved that result? Any help would be appreciated, thanks.

    Note: I realise rolling your own cryptography library is stupid, DES is considered broken, and ECB mode is very flawed for any useful cryptography; this was purely an academic exercise for school. So please, no lectures, I know the drill.
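
    For reference, one common way this kind of picture is produced (a sketch assuming ImageMagick and OpenSSL are installed, with made-up file names, and a plain P6 PPM whose header is the usual three lines with no comments): set the tiny text header aside, encrypt only the raw pixel data, then put the header back so viewers still accept the file.

        convert tux.png tux.ppm                 # P6 PPM: 3-line text header followed by raw pixels
        head -n 3 tux.ppm  > header.txt         # keep the header aside
        tail -n +4 tux.ppm > body.bin           # raw pixel data only
        openssl enc -des-ecb -nosalt -k secret -in body.bin -out body.enc
        cat header.txt body.enc > tux_ecb.ppm   # reattach the header; viewers ignore the few padding bytes

    The same trick should work with your own DES implementation by encrypting body.bin with it instead of the openssl line.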

    Read the article

  • How should I be using IoC in this winform piece of code?

    - by Pure.Krome
    Hi folks, I've got a WinForms app in Visual Studio 2010. My app does the following: it gets a list of files whose data I need to read and insert into a database; then, for each file, it reads the data and inserts it into the DB. So this is the code I have:

        var list = _repository.GetFileList();
        if (list != null)
        {
            int i = 0;
            foreach (var file in list)
            {
                i++;
                var service = new MyService(i, _repository);
                service.ParseAndSave();
            }
        }

    I was hoping to have a new repository for each 'service' I create. Firstly, I'm not sure if I should be using IoC in this case. I believe I should be, because then I don't need to tightly couple this WinForm to a repository. Secondly, I've tried using a singleton repository, which I don't want, and can confirm that it kills this code (crashes with an exception). Some other notes (which shouldn't impact this question): using Entity Framework for ASP.NET 4; using StructureMap for IoC. Can someone help, please?

    Read the article

  • Please suggest other ways of communicating between server & client.

    - by Geo
    I'm writing a TCP chat server (the programming language does not matter). It's a school project for my nephew, so it won't be released, and all the questions I'm asking are just for my knowledge :). Some of the things it will support: chatting between users (doh), being multithreaded, and sending files to each other. I know I could easily get away with all the stuff above if I go with serialization and just send objects from client to server and back. But if I do that, it will be limited to a specific programming language (meaning clients written in other programming languages may not be able to deserialize the objects). What would be the way to go so that clients written in other languages could be supported? One way to go, off the top of my head, would be this: the server and the client communicate by sending messages and chunks (for lack of better names). Here's what I mean: every time the client/server wants to send something (a text message or a file), it will first send a simple text message (newline terminated) with the number of chunks it will send. Example:

        command 4,20,30,40,50

    where command would be something like instant-message or file, 4 would be the number of chunks to be sent, 20 would be the size in bytes of the first chunk, 30 of the second, and so forth. After that message is sent, the client/server will start sending chunks of the sizes mentioned in the message. What do you think about implementing the client/server communication this way? What better options are there?
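
    A minimal sketch of the proposed framing (names and the chunk size are made up), just to make the header-then-chunks idea concrete; any language that can write bytes to a socket could produce or parse the same format.

        import os

        def send_file(sock, path, chunk_size=4096):
            size = os.path.getsize(path)
            sizes = [chunk_size] * (size // chunk_size)
            if size % chunk_size:
                sizes.append(size % chunk_size)
            # newline-terminated text header: command, then chunk count and each chunk size
            header = "file " + ",".join(str(n) for n in [len(sizes)] + sizes) + "\n"
            sock.sendall(header.encode("ascii"))
            with open(path, "rb") as f:
                for s in sizes:
                    sock.sendall(f.read(s))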

    Read the article

  • c++ simple conditional logging

    - by Sunny
    Disclaimer: I'm not a C++ developer, I can only do basic things. (I understand pointers, it's just that my knowledge is very rusty; I haven't touched C/C++ for about 20 years.)

    The setup: I have an Outlook add-in, written in C#/.NET 1.1. It uses a C++ shim to load. Usually this works pretty well, and in my C# code I use NLog for logging purposes. But sometimes the add-in fails to load, i.e. it does not hit the managed code at all, so there is no way for me to investigate the problem from the log files. So I need to hook some basic logging into the C++ shim - just writing to a file. I need to make it as simple as possible for our users to enable; actually, I would prefer not to ship it by default. I was thinking of something which checks whether a specific DLL (the logging DLL) is present and, if so, uses it; otherwise it just doesn't log anything. That way, when I have a user with such a problem, I can send him only the logging DLL, the user saves it in the runtime directory, and I'll have the file. I guess this has to be done with some form of factory solution, which returns either a dummy logger or, if the DLL is found, a real one. Another option would be to make a simple logger and rebuild the shim with or without it, based on compile-time directives. This is not the desirable approach, because the shim needs to be signed, and I would have to instruct the user to make a backup copy of the "real" one and restore it when done, etc., instead of just saving and deleting a DLL. I'd appreciate any good suggestions on how to approach this, together with links or sample code showing how to go about it. Cheers
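
    A minimal sketch of the "use the DLL only if it is there" idea (the DLL and export names here are hypothetical): resolve the logging function once with LoadLibrary/GetProcAddress, and fall back to doing nothing when the DLL is absent.

        #include <windows.h>

        typedef void (__cdecl *LogFn)(const char *);

        static LogFn ResolveLogger()
        {
            static LogFn fn = NULL;
            static bool tried = false;
            if (!tried) {
                tried = true;
                HMODULE mod = LoadLibraryA("ShimLog.dll");          // optional DLL, shipped only when needed
                if (mod)
                    fn = (LogFn)GetProcAddress(mod, "LogMessage");  // exported logging function
            }
            return fn;
        }

        static void ShimLog(const char *msg)
        {
            LogFn fn = ResolveLogger();
            if (fn)
                fn(msg);   // real logger found next to the shim
            // otherwise: the "dummy" logger, i.e. do nothing
        }

    Since the lookup happens once and the fallback is a no-op, the shim can call ShimLog() unconditionally without shipping the DLL by default.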

    Read the article

  • How can I programmatically get the connection status of OSX network services?

    - by BigBrainz
    In the OS X System Preferences, when I click on 'Network' I see a green dot by 'Ethernet', and red dots by 'AirPort' and 'FireWire'. This is because I turned off AirPort and FireWire, as I access networks and the Internet via Ethernet. I need to programmatically determine which of these network services displayed in System Preferences have green dots and which have red dots. For Ethernet and FireWire the displayed status is 'Connected' or 'Not Connected', and for AirPort the displayed status is 'On' or 'Off'. Perhaps other network services have other status labels. I have picked through all the plist files in '/Library/Preferences/SystemConfiguration', particularly 'preferences.plist' and 'NetworkInterfaces.plist'. I can get all sorts of information there, such as the Location set, network service order, proxy information (which is also important to my task), but I cannot find how to determine whether a given network service is on or off--the equivalent of having the green dot displayed. I have also tried using System Configuration framework, specifically the SCNetworkConnectionGetStatus function, but all I get are invalid connection statuses. Does anyone know how to actually retrieve this connection status information? Thanks.
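
    Not a framework-level answer, but a quick way to see the same state from the shell (the interface and key names here are examples; the dynamic store is what ultimately backs those dots):

        ifconfig en0 | grep status                               # "status: active" roughly corresponds to a green dot
        echo "show State:/Network/Interface/en0/IPv4" | scutil   # key exists only while the interface has an address
        echo "list State:/Network/Service/[^/]+/IPv4" | scutil   # services that currently have IPv4 configured

    If those State:/Network/... keys behave as expected in scutil, reading the same keys through the SCDynamicStore API should give the per-service status programmatically.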

    Read the article

  • MATLAB, time match filter

    - by Paul
    OK, I am still getting the hang of MATLAB. I have two files in different formats. One is an Excel file, data1.xls, of size 86400 x 62. It looks like:

        Date/Time           par1 par2 par3 par4 par5   par6 par6 par7 par8 par9
        08/02/09 00:06:45   0    3    27   9.9  -133.2 0    0    0    1    0

    The other file, data2.csv, is of size 144 x 27 (if nothing is missing). It looks like:

        date       time  P01 P02 P03 P04 P05 P06 P07 P08 P09 P10 P11
        8/16/2009  0:00  51  45  46  54  53  52  524 5   399 89  78

    Now I am using

        Data10minAvg = mean(reshape(Data,300,144,62));

    to get the 10-minute average of the first Excel file. Now I need to match up the file I am making above with the .csv file. The problem is that many timestamps are missing in the .csv file. How do I make data2.csv into a file of size 144 x 27, replacing the missing timestamps with rows of zeros? It will really help me to then compare the data1.xls file with the new data2.csv.

    Read the article

  • Can Delphi 5 generate a .PDB file that VS can use?

    - by Vilx-
    We've got this large application written in Delphi 5, and development is ongoing to this day. There is research going on into migrating to newer versions, but so far without success, as some third-party components have not been updated in ages and do not work on later versions. In the meantime, however, people need to continue working on it. Now, the Delphi 5 IDE is no real treat. It's pretty bug-ridden and lacks a lot of features of contemporary IDEs, which makes it difficult to use - especially when it comes to debugging. So I was wondering: would it be possible to use Visual Studio in the process? As far as I know the .PDB file format is pretty old and well documented. Would it be possible to make the Delphi compiler somehow generate .PDB files for its compiled results? Then the program could be debugged with Visual Studio, possibly to a much greater extent than in the original IDE. The absolute Holy Grail would be to move all development to VS, keeping only the compiler from Delphi, but I imagine that would be pretty much impossible.

    Read the article

  • Windows Unique Identifier?

    - by user775013
    So there is this software. When installed, it somehow (probably by reading a file or registry entry) recognizes my Windows installation. It's supposed to do some tasks only once per unique computer. If I uninstall the program and reinstall it, the software remembers that it has been installed and therefore does not do the task. If I use System Restore, the software also does not do the tasks. If I load an image of the system taken before the install, the software also doesn't do the tasks. Only if I reinstall a fresh copy of Windows does the software do the task. Even the IP does not matter: everything is the same, except that it is a brand new copy of the Windows operating system. So I guess the software reads some kind of unique operating system identifier, then connects to a server to create a user profile. So the question is: what could be the files or registry values the software uses as a system identifier? So far I have found that there are entries under the registry keys WindowsNT/CurrentVersion and Windows/Cryptography, but the software does not rely on them. Where else should I search? Is there any software which could help me find out?
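
    For what it's worth, one machine-wide value often used for this (under the Cryptography key already mentioned) is MachineGuid, which is generated when Windows is installed and therefore survives program reinstalls, System Restore and disk images, but not a fresh Windows install. It can be inspected like this:

        reg query "HKLM\SOFTWARE\Microsoft\Cryptography" /v MachineGuid

    Watching the program's registry and file reads with a tool such as Sysinternals Process Monitor would show exactly which values it checks.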

    Read the article

  • Java Equivalent of C++ .dll?

    - by Matt D
    So, I've been programming for a while now, but since I haven't worked on many larger, modular projects, I haven't come across this issue before. I know what a .dll is in C++, and how they are used. But every time I've seen similar things in Java, they've always been packaged with source code. For instance, what would I do if I wanted to give a Java library to someone else, but not expose the source code? Instead of the source, I would just give a library as well as a Javadoc, or something along those lines, with the public methods/functions, to another programmer who could then implement them in their own Java code. For instance, if I wanted to create a SAX parser that could be "borrowed" by another programmer, but (for some reason--can't think of one in this specific example lol) I don't want to expose my source. Maybe there's a login involved that I don't want exploited--I don't know. But what would be the Java way of doing this? With C++, .dll files make it much easier, but I have never run into a Java equivalent so far. (I'm pretty new to Java, and a pretty new "real-world" programmer, in general as well)
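
    The usual Java counterpart is a JAR that contains only compiled .class files, handed out together with generated Javadoc for the public API. A rough sketch with a made-up package name:

        mkdir classes                                             # output directory for compiled classes
        javac -d classes src/com/example/parser/*.java            # compile; only .class files land in classes/
        jar cf parser.jar -C classes .                             # package the compiled classes, no sources
        javadoc -d apidocs -sourcepath src com.example.parser      # API docs to ship alongside the jar

    Keep in mind that .class files can be decompiled fairly easily, so a JAR hides the source only casually; if that matters (the login example), an obfuscator or keeping the sensitive part on a server is the usual next step.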

    Read the article

  • __declspec(dllimport) causes compiler crash on MSVC 2010

    - by Zero
    In a *.cpp file, trying to use a third-party lib:

        #define DLL_IMPORT
        #include <thirdParty.h>
        // The third-party header has code like:
        // #ifdef DLL_IMPORT
        // #define DLL_DECL __declspec(dllimport)

        // fatal error C1001: An internal error has occurred in the compiler.

    Alternative:

        #define NO_DLL
        #include <thirdParty.h>
        // The third-party header has code like:
        // #elif defined(NO_DLL)
        // #define DLL_DECL

        // Compiles fine, but linker errors as it can't find the DLL functions

    I can reproduce the results by removing the macros and #defines altogether and manually editing the third-party files to have __declspec(dllimport) or not. Has anyone come across anything similar, or can anyone hint at the cause? The build is generated using CMake. The above is the actual example of the two-line *.cpp that crashes, so it's narrowed down to something in the #include. The following also work fine:

        - Compiling the examples provided by the third party (they provide a *.sln) that use dllimport/export, so it doesn't appear to be the fault of the library
        - Compiling the third-party lib as part of the production project (so dllexport works fine)

    I've trawled the project settings pages of the two projects to try and spot differences, but have come up blank. Of course, it's possible I'm missing something, as those settings pages are not the easiest to navigate. I'll get access to VS2008 in a day or so, so I can compare with that. The third-party library is MySQL++.

    Read the article

  • Sheet and thread memory problem

    - by Xident
    Hi guys, recently I started a project which exports some precalculated graphics/audio to files for later processing. All I did was put a new window (with a progress indicator and an Abort button) in my main xib and open it using the following code:

        [NSApp beginSheet:REC_Sheet
           modalForWindow:MOTHER_WINDOW
            modalDelegate:self
           didEndSelector:nil
              contextInfo:nil];

        NSModalSession session = [NSApp beginModalSessionForWindow:REC_Sheet];
        RECISNOTDONE = YES;
        while (RECISNOTDONE) {
            if ([NSApp runModalSession:session] != NSRunContinuesResponse)
                break;
            usleep(100);
        }
        [NSApp endModalSession:session];

    A background thread (pthread) was started earlier to actually perform the work and save all the targa/wave files. This worked great, but after a while it turned out that the main thread was not responding anymore and my memory footprint grew without stopping. I tried to debug it with Instruments and saw a lot of CFHash and similar objects growing to infinity. By accident I clicked below the sheet, and temporarily it helped: the main thread (AppKit?) released its objects, but only for a little while. I can't explain it. At first I thought it was my thread's access to the progress bar to update the progress (at 0.5 s intervals), so I cut that out. But even if I'm not updating anything and do nothing with the progress bar, my application eats up all the memory because its "main event" (or whatever) objects are not released. Is there any way to "drain" this main-thread memory (a run loop / NSApp call?), and why on earth doesn't the main thread respond anymore after this simple task? I don't have a clue anymore, please help! Thanks in advance!

    P.S. How do you implement this kind of "threaded long task" and update your GUI?
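
    One thing worth trying (a sketch, not a confirmed diagnosis): objects autoreleased while the modal session spins have nowhere to go until an enclosing pool drains, so creating and draining a pool on every pass of the loop keeps them from piling up.

        RECISNOTDONE = YES;
        while (RECISNOTDONE) {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            NSInteger result = [NSApp runModalSession:session];
            usleep(100);
            [pool drain];                       // release whatever AppKit autoreleased this pass
            if (result != NSRunContinuesResponse)
                break;
        }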

    Read the article

  • Error "initializer element is not constant" when trying to initialize variable with const

    - by tomlogic
    I get an error on line 6 (initializing my_foo to foo_init) of the following program, and I'm not sure I understand why.

        typedef struct foo_t {
            int a, b, c;
        } foo_t;

        const foo_t foo_init = { 1, 2, 3 };
        foo_t my_foo = foo_init;

        int main()
        {
            return 0;
        }

    Keep in mind this is a simplified version of a larger, multi-file project I'm working on. The goal was to have a single constant in the object file that multiple files could use to initialize a state structure. Since it's an embedded target with limited resources and the struct isn't that small, I don't want multiple copies of the source. I'd prefer not to use:

        #define foo_init { 1, 2, 3 }

    I'm also trying to write portable code, so I need a solution that's valid C89 or C99. Does this have to do with the ORGs in an object file - that initialized variables go into one ORG and are initialized by copying the contents of a second ORG? Maybe I'll just need to change tactics and have an initializing function do all of the copies at startup. Unless there are other ideas out there?
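
    For reference, a small C89-valid sketch of the usual pattern: keep the single const copy in one translation unit and copy it at run time, since only a file-scope initializer has to be a constant expression - plain assignment is fine once execution has started.

        /* foo.h */
        typedef struct foo_t { int a, b, c; } foo_t;
        extern const foo_t foo_init;

        /* foo.c - the only copy of the data */
        const foo_t foo_init = { 1, 2, 3 };

        /* any other .c file */
        #include "foo.h"
        #include <string.h>

        static foo_t my_foo;

        void foo_reset(void)
        {
            my_foo = foo_init;                        /* struct assignment, done at run time */
            /* or: memcpy(&my_foo, &foo_init, sizeof my_foo); */
        }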

    Read the article
