PerlMonks
Re^3: Noob could use advice on simplification & optimization
by temporal (Pilgrim) on May 04, 2012 at 14:25 UTC ( [id://968927] )
When a recursive directory search hits some large binary file tucked away in a hidden subdirectory, slurping will load quite a bit into memory. Where it really gets bad is when you hit a file larger than your system's memory (or, more accurately, than the memory available to the Perl process). The easiest way to avoid this is to read files line by line. You could also write a smart read routine that buffers reads into a limited-length array, giving you something of the best of both worlds.

The one advantage of slurping is that you avoid splitting the file on newlines into an array. You might also want to add some filename filtering so the user can include or exclude certain file types.

Strange things are afoot at the Circle-K.
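A minimal sketch of the two bounded-memory approaches described above. `grep_file` and `scan_chunks` are hypothetical names for illustration, not anything from the original post: the first reads line by line (memory stays bounded by the longest line), the second does fixed-size `read` calls, which is safer for binary files with no newlines.

```perl
use strict;
use warnings;

# Read line by line: memory use is bounded by the longest line,
# not the file size. Returns an arrayref of matching line numbers.
sub grep_file {
    my ($path, $pattern) = @_;
    open my $fh, '<', $path or die "Can't open $path: $!";
    my @hits;
    while (my $line = <$fh>) {
        push @hits, $. if $line =~ /$pattern/;
    }
    close $fh;
    return \@hits;
}

# Fixed-size buffered reads: binary-safe, flat memory even for a
# giant file with no newlines at all. Here it just totals the bytes;
# real code would scan each $buf instead.
sub scan_chunks {
    my ($path, $chunk_size) = @_;
    $chunk_size //= 64 * 1024;    # 64 KiB per read (assumed default)
    open my $fh, '<:raw', $path or die "Can't open $path: $!";
    my $total = 0;
    while (my $n = read $fh, my $buf, $chunk_size) {
        $total += $n;
    }
    close $fh;
    return $total;
}
```

Either approach keeps the process footprint flat no matter how large the file the directory walk stumbles onto.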
In Section: Seekers of Perl Wisdom