Re: Splitting big file or merging specific files?
by GrandFather (Saint) on Jun 30, 2017 at 04:05 UTC [id://1193887]
As a general rule, only deal with the data you need to deal with immediately. In this case, for phase one that means: open the input file, then while there is more data, read a couple of lines and write them to the next output file. For phase two: while there is another file, read it and write its contents to the output. Note there isn't a "for" in there anywhere; it's all "while something". Let's see how that could look:
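A minimal sketch of phase one in that "while something" style. The file names ("input.txt" for the input, "part_1.txt", "part_2.txt", ... for the outputs) and the two-line chunk size are assumptions for illustration, not the original poster's actual names; the demo input is generated so the sketch runs stand-alone:

```perl
use strict;
use warnings;

# Create a small demo input so the sketch is self-contained.
# (Hypothetical file name "input.txt".)
open my $demo, '>', 'input.txt' or die "Can't create input.txt: $!";
print $demo "line $_\n" for 1 .. 5;
close $demo;

# Phase one: while there is more data, read a couple of lines and
# write them to the next output file.
open my $in, '<', 'input.txt' or die "Can't open input.txt: $!";
my $fileNum = 0;

while (!eof $in) {
    my $fileLines = '';
    for (1 .. 2) {
        my $line = <$in>;
        $fileLines .= $line if defined $line;
    }
    ++$fileNum;
    open my $out, '>', "part_$fileNum.txt"
        or die "Can't create part_$fileNum.txt: $!";
    print $out $fileLines;
    close $out;
}
close $in;
```

With a five-line input this produces three part files: two with two lines each and a final one holding the odd line out.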
The "slurp" bits set a Perl special variable ($/, the input record separator) to undef so we can read an entire file in one hit. On modern systems with plenty of memory that works fine for files of hundreds of megabytes, so it should be fine for our toy example. The for 1 .. 2 fetches two lines from the input file. If there is an odd number of lines in the input it doesn't matter: we end up concatenating undef to $fileLines, which amounts to a no-op (aside from an "uninitialized value" warning if warnings are enabled), so no harm done.
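Phase two can be sketched the same way: while there is another file, slurp it and append its contents to the merged output. The part-file names ("part_1.txt", ...) and the output name "merged.txt" are assumptions for illustration; two demo part files are created so the sketch runs stand-alone:

```perl
use strict;
use warnings;

# Create demo part files so the sketch runs stand-alone.
# (Hypothetical names "part_1.txt", "part_2.txt".)
for my $n (1 .. 2) {
    open my $p, '>', "part_$n.txt" or die "Can't create part_$n.txt: $!";
    print $p "chunk $n\n";
    close $p;
}

# Phase two: while there is another file, read it whole and write it out.
open my $out, '>', 'merged.txt' or die "Can't create merged.txt: $!";
my $fileNum = 1;

while (-e "part_$fileNum.txt") {
    open my $in, '<', "part_$fileNum.txt"
        or die "Can't open part_$fileNum.txt: $!";
    # Slurp: locally undef $/ so <$in> reads the entire file in one hit.
    my $contents = do { local $/; <$in> };
    close $in;
    print $out $contents;
    ++$fileNum;
}
close $out;
```

The local $/ inside the do block restores the record separator automatically, so ordinary line-by-line reads elsewhere are unaffected.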
Premature optimization is the root of all job security
In Section: Seekers of Perl Wisdom