Re: Displaying/buffering huge text files
by BrowserUk (Pope) on Feb 23, 2005 at 08:26 UTC
This builds an index to a million line file in around 20 seconds, and accesses and prints 10,000 lines at random in under 1 second.
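A minimal sketch of that approach (the filename is a placeholder, and this is an illustration rather than the exact original code): record the byte offset of each line start, packed as a double ('d'), then fetch any line later with seek().

```perl
use strict;
use warnings;

my $file = 'huge.txt';    # hypothetical input file
open my $fh, '<', $file or die "Can't open '$file': $!";

# Build the index: one packed byte offset per line.
my $index = pack 'd', 0;                   # line 1 starts at offset 0
$index .= pack 'd', tell $fh while <$fh>;

# Fetch a line by (1-based) number via its stored offset.
sub fetch_line {
    my( $fh, $index, $line ) = @_;
    seek $fh, unpack( 'd', substr $index, ( $line - 1 ) * 8, 8 ), 0;
    return scalar <$fh>;
}

# Access and print 10,000 lines at random.
my $lines = length( $index ) / 8 - 1;
print fetch_line( $fh, $index, 1 + int rand $lines ) for 1 .. 10_000;
```

Keeping the offsets in one packed string, rather than a Perl array, is what keeps the index so compact.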
If 20 seconds is too long for startup, you could read just enough of the file to populate your listbox with the first 1,000 lines or so, then push the rest of the index building into a background thread to finish.
The indexing could be sped up by a more intelligent indexer that reads larger chunks and searches them for newlines, rather than reading one line at a time.
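Such a chunked indexer might look like this (a sketch, assuming a 1 MB chunk size and a placeholder filename): read big blocks and scan each one with index() for newlines, recording the offset of the character after each one.

```perl
use strict;
use warnings;

my $file = 'huge.txt';    # hypothetical input file
open my $fh, '<:raw', $file or die "Can't open '$file': $!";

my $index = pack 'd', 0;   # first line starts at offset 0
my $base  = 0;             # file offset of the current chunk
my $chunk;
while( my $read = read $fh, $chunk, 1024 * 1024 ) {
    my $pos = -1;
    while( ( $pos = index $chunk, "\n", $pos + 1 ) >= 0 ) {
        $index .= pack 'd', $base + $pos + 1;   # start of the next line
    }
    $base += $read;
}
```

Opening with ':raw' keeps the recorded offsets byte-accurate, which matters because seek() works in bytes.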
The index requires just 8 MB of RAM. That could be halved by using 'N' or 'V' as your pack format, rather than 'd', if your files will never go over 4GB. By using 'd' you are good for files up to around 8,500 terabytes, which should see you through the next couple of machine changes or so :)
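The size difference is easy to check: 'd' stores a native double (8 bytes per offset), while 'N' and 'V' store a 32-bit unsigned integer (4 bytes, big- and little-endian respectively), hence the 4GB ceiling.

```perl
use strict;
use warnings;

print length( pack 'd', 0 ), "\n";   # 8 bytes per offset
print length( pack 'N', 0 ), "\n";   # 4 bytes per offset
```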
Examine what is said, not who speaks.
Silence betokens consent.
Love the truth but pardon error.