Python: Huge file reading using linecache vs. normal file access with open()

Posted by user335223 on Stack Overflow. Published on 2010-05-07T08:45:36Z.

Hi, I am in a situation where multiple threads read the same huge file, each with its own file pointer into that file. The file has at least one million lines, and each line is between 500 and 1,500 characters long. There are no write operations on the file. Each thread starts reading the file from a different line. What is the most efficient approach: Python's linecache module, plain readline(), or some other method?
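A minimal sketch of the two approaches under discussion (the file name and helper below are illustrative, not from the original post). Note that `linecache` caches the entire file in memory on first access, which can be costly for a million-line file; an alternative is to build a byte-offset index once, after which any thread can open its own file object and `seek()` straight to its starting line:

```python
import linecache
import os
import tempfile

# Create a small sample file as a stand-in for the huge file.
path = os.path.join(tempfile.mkdtemp(), "huge.txt")
with open(path, "w") as f:
    for i in range(1000):
        f.write(f"line {i}\n")

# Option 1: linecache -- convenient random access by line number,
# but it reads and caches the whole file in memory on first use.
first_hit = linecache.getline(path, 501)  # 1-based line numbers

# Option 2: scan the file once to record each line's byte offset,
# then seek directly. Threads can share the (read-only) offset list
# while each keeping a private file object.
offsets = [0]
with open(path, "rb") as f:
    for line in f:
        offsets.append(offsets[-1] + len(line))

def read_line(path, lineno, offsets):
    """Return line `lineno` (0-based) by seeking to its byte offset."""
    with open(path, "rb") as f:
        f.seek(offsets[lineno])
        return f.readline().decode()

assert first_hit == read_line(path, 500, offsets)
```

The offset-index approach costs one sequential pass up front but keeps memory bounded by the index (one integer per line) rather than the file's full contents, which matters when lines average around 1 KB each.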

