Re: help reading from large file needed
by BrowserUk (Pope) on Oct 12, 2010 at 21:29 UTC
Are there other techniques I should use to traverse a large file like this and which might offer methods to move forward, back, go to beginning, etc.?
See seek. It works best if the records are fixed length. If they are not, creating an index that maps record number to file position is very simple, and it makes for quite fast access.
I have a file that is 3.6GB and contains 40e6 records. I index it like this:
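The original code was not preserved in this copy of the post, but a minimal sketch of such an index builder might look like this (the filenames, the `build_index` helper, and the packed 64-bit `'Q'` offset format are my illustrative assumptions, not from the post):

```perl
use strict;
use warnings;

# Sketch: write an index file holding one packed 64-bit byte
# offset ('Q') per record, so that record N can later be found
# by reading 8 bytes at position N*8 in the index file.
# ('Q' needs a 64-bit perl; filenames here are illustrative.)
sub build_index {
    my( $datafile, $indexfile ) = @_;
    open my $in,  '<',     $datafile  or die "open $datafile: $!";
    open my $idx, '>:raw', $indexfile or die "open $indexfile: $!";
    my $pos = 0;
    while( <$in> ) {
        print $idx pack 'Q', $pos;   # offset of the record just read
        $pos = tell $in;
    }
    close $in;
    close $idx;
}

# Demo on a small temporary file (5 records of 9 bytes each).
open my $fh, '>', 'demo.dat' or die $!;
print $fh "record $_\n" for 1 .. 5;
close $fh;
build_index( 'demo.dat', 'demo.dat.idx' );
```

One pass over the data file is all it takes; the index for 40e6 records at 8 bytes per entry is only ~320MB.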
Which takes just a couple of minutes to run. I can then randomly access the records in that file using:
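The lookup code was also lost from this copy, but given the index format sketched above, random access would look something like this (again an illustrative sketch; `get_record` and the demo data are my assumptions):

```perl
use strict;
use warnings;

# Sketch: fetch record $n by seeking to $n*8 in the index,
# unpacking the stored 64-bit offset, then seeking the data
# file there and reading one line.
sub get_record {
    my( $dataFH, $idxFH, $n ) = @_;
    seek $idxFH, $n * 8, 0 or die "seek index: $!";
    read( $idxFH, my $packed, 8 ) == 8 or die "read index: $!";
    seek $dataFH, unpack( 'Q', $packed ), 0 or die "seek data: $!";
    return scalar <$dataFH>;
}

# Demo: build a tiny data file and matching index inline.
open my $out, '>', 'demo.dat' or die $!;
print $out "record $_\n" for 1 .. 5;
close $out;
open my $in,  '<',     'demo.dat'     or die $!;
open my $idx, '>:raw', 'demo.dat.idx' or die $!;
my $pos = 0;
while( <$in> ) { print $idx pack( 'Q', $pos ); $pos = tell $in }
close $idx;
open $idx, '<:raw', 'demo.dat.idx' or die $!;

print get_record( $in, $idx, 3 );   # fourth record (0-based)
```

Two seeks and two small reads per lookup is what keeps the per-record cost down in the sub-millisecond range.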
At 0.2 milliseconds per record, it is fast enough for most purposes.
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.