What's the most efficient way to load data from a file to a collection on-demand?

Posted by Dan on Stack Overflow, published 2010-03-12

I'm working on a Java project that will allow users to parse multiple files, each with potentially thousands of lines. The parsed information will be stored in objects, which will then be added to a collection.

Since the GUI won't require loading ALL of these objects at once and keeping them in memory, I'm looking for an efficient way to load/unload data from the files, so that data is only loaded into the collection when a user requests it.
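To illustrate what I mean by loading on demand, something like this is what I have in mind. It's only a sketch: Record and parseLine are made-up placeholders for whatever the real parser produces.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class LazyFileLoader {

    // Placeholder for whatever the real parser produces.
    public static class Record {
        final String id;
        final String payload;
        Record(String id, String payload) { this.id = id; this.payload = payload; }
    }

    // Placeholder parse logic; assumes tab-delimited lines starting with an ID.
    static Record parseLine(String line) {
        String[] parts = line.split("\t", 2);
        return new Record(parts[0], parts.length > 1 ? parts[1] : "");
    }

    // Parse only lines [offset, offset + limit) so the rest of the file
    // never has to be materialized in memory.
    public static List<Record> loadRange(Path file, long offset, long limit) throws IOException {
        try (Stream<String> lines = Files.lines(file)) {
            return lines.skip(offset)
                        .limit(limit)
                        .map(LazyFileLoader::parseLine)
                        .collect(Collectors.toList());
        }
    }
}
```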

I'm just evaluating options right now. I've also thought about the case where, after loading a subset of the data into the collection and presenting it in the GUI, I need to reload previously viewed data. What's the best way to do that: re-run the parser, repopulate the collection, and repopulate the GUI? Or find a way to keep the collection in memory, or serialize/deserialize the collection itself?
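For the serialize/deserialize idea, I'm picturing something as simple as this sketch, assuming the parsed objects implement Serializable and the subset lives in a serializable list such as ArrayList:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;

public class SubsetCache {

    // Write an already-parsed subset to disk so it can be restored later
    // without re-running the parser. Elements must implement Serializable.
    public static <T extends Serializable> void save(ArrayList<T> subset, File cacheFile)
            throws IOException {
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(cacheFile))) {
            out.writeObject(subset);
        }
    }

    @SuppressWarnings("unchecked")
    public static <T extends Serializable> ArrayList<T> load(File cacheFile)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(cacheFile))) {
            return (ArrayList<T>) in.readObject();
        }
    }
}
```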

I know that loading/unloading subsets of data can get tricky if some sort of data filtering is performed. Let's say I filter on ID, so my new subset contains data from two previously analyzed subsets. This would be no problem if I kept a master copy of the whole data set in memory.
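One way around that might be an index from ID to file offset, built in a single pass, so a filtered subset can be re-read directly from disk instead of from a master copy. A sketch of that idea (it assumes each line starts with the ID and that RandomAccessFile.readLine's single-byte encoding is acceptable):

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.HashMap;
import java.util.Map;

public class IdIndex {

    private final Map<String, Long> offsetsById = new HashMap<>();
    private final String fileName;

    // One pass over the file records each ID's byte offset, so a later filter
    // on ID can seek straight to the matching lines.
    public IdIndex(String fileName) throws IOException {
        this.fileName = fileName;
        try (RandomAccessFile raf = new RandomAccessFile(fileName, "r")) {
            long offset = raf.getFilePointer();
            String line;
            while ((line = raf.readLine()) != null) {
                String id = line.split("\t", 2)[0];   // assumes tab-delimited lines starting with the ID
                offsetsById.put(id, offset);
                offset = raf.getFilePointer();
            }
        }
    }

    // Re-read a single raw record by ID; returns null if the ID was never indexed.
    public String lookup(String id) throws IOException {
        Long offset = offsetsById.get(id);
        if (offset == null) {
            return null;
        }
        try (RandomAccessFile raf = new RandomAccessFile(fileName, "r")) {
            raf.seek(offset);
            return raf.readLine();
        }
    }
}
```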

I've read that google-collections is good and efficient at handling large amounts of data, and offers methods that simplify a lot of things, so it might offer a way to keep the collection in memory. This is just general talk; the question of which collection to use is a separate and complex matter.
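As I understand it, google-collections has since been folded into Guava, so assuming I can depend on a current Guava release, its CacheBuilder would let me cap how many parsed records stay in memory and transparently re-parse on a miss. In this sketch, readRecordFromFile is a placeholder I'd still have to wire up to the real parser:

```java
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

public class RecordCache {

    // Size-bounded cache: at most 10k parsed records stay in memory; evicted
    // entries are re-loaded from disk the next time they are requested.
    private final LoadingCache<String, String> cache = CacheBuilder.newBuilder()
            .maximumSize(10_000)
            .build(new CacheLoader<String, String>() {
                @Override
                public String load(String id) {
                    return readRecordFromFile(id);   // invoked only on a cache miss
                }
            });

    public String get(String id) {
        return cache.getUnchecked(id);
    }

    private String readRecordFromFile(String id) {
        // Placeholder: seek/scan the source file for the record with this ID.
        throw new UnsupportedOperationException("wire up to the real parser");
    }
}
```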

What's the general recommendation for this type of task? I'd like to hear what you've done in similar scenarios.

I can provide more specifics if needed.
