Re: File reading efficiency and other surly remarks
by lhoward (Vicar) on Aug 26, 2000 at 05:10 UTC
Reading a file all at once will always be faster than reading it one line at a time. The drawback of the all-at-once approach is that if your file is large, loading the whole thing into memory at once consumes a large amount of memory. You can get the efficiency of the all-at-once method without the memory problem by using the read/sysread functions to read the file a block at a time. The only catch is that line-break detection isn't handled automatically for you. The code below is taken from an earlier perlmonks discussion about reading files a block at a time; it isn't my code, so I can't take credit (or blame) for it.
This example uses a read-block size of 4096 bytes. The optimal value will depend on your OS and filesystem's blocksize (among other things).
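The original code from that thread isn't reproduced in this excerpt. As a rough sketch of the same technique, block-at-a-time reading with manual line-break handling can look like the following. The sub name `read_by_block`, its callback interface, and the default block size are illustrative assumptions, not taken from the original thread:

```perl
use strict;
use warnings;

# Hypothetical helper (illustrative, not the original thread's code):
# read $path in fixed-size blocks with sysread, invoking $callback once
# per complete line. A partial line at the end of a block is carried
# over and joined with the next block.
sub read_by_block {
    my ($path, $callback, $blksize) = @_;
    $blksize ||= 4096;    # assumed default; tune for your OS/filesystem
    open my $fh, '<', $path or die "Can't open $path: $!";
    my $leftover = '';
    my $buf;
    while (my $n = sysread($fh, $buf, $blksize)) {
        $buf = $leftover . $buf;
        # LIMIT of -1 keeps trailing empty fields, so we can tell
        # whether the block ended exactly on a newline or mid-line.
        my @lines = split /\n/, $buf, -1;
        $leftover = pop @lines;    # partial line (or '') carried forward
        $callback->($_) for @lines;
    }
    # The file may not end with a newline; flush any remaining text.
    $callback->($leftover) if length $leftover;
    close $fh;
}
```

A caller might use it like `read_by_block('data.txt', sub { process($_[0]) })`; the callback receives each line with its newline already stripped by the split.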