tachyon is correct that printing once to each file is much more efficient than printing hundreds of thousands of times. The question is, do you have enough memory to do this? I'd personally store the output in a hash, with a filehandle and an output buffer for each file, and print to the filehandle whenever a buffer reached (allowable memory / number of files) bytes of data. Anything still in the buffers at the end of the run would also be printed to the files.

Allowing 10 MB of memory and 100 files, for instance, you could print whenever the buffer for an individual file reached 100K or so.
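
Something along these lines (untested sketch; $MAX_BUFFER follows the 10 MB / 100 files example above, and file_for() is just an illustrative stand-in for however you route records to files):

    use strict;
    use warnings;

    my $MAX_BUFFER = 100 * 1024;   # per-file allowance: 10 MB / 100 files
    my (%fh, %buf);

    while (my $line = <STDIN>) {
        my $file = file_for($line);
        $buf{$file} .= $line;
        flush_buffer($file) if length $buf{$file} >= $MAX_BUFFER;
    }

    # Anything still in the buffers at the end of the run
    flush_buffer($_) for keys %buf;

    sub flush_buffer {
        my $file = shift;
        # Open each output file once, lazily, and keep the handle around
        $fh{$file} ||= do {
            open my $h, '>>', $file or die "Can't open $file: $!";
            $h;
        };
        print { $fh{$file} } $buf{$file};
        $buf{$file} = '';
    }

    sub file_for {
        # Illustrative routing only: first tab-separated field names the file
        my ($key) = split /\t/, shift;
        return "$key.out";
    }

The win is that each file is opened once and printed to a handful of times, rather than opened, printed to, and closed once per record.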


In reply to Re: Performance Trap - Opening/Closing Files Inside a Loop by TedPride
in thread Performance Trap - Opening/Closing Files Inside a Loop by Limbic~Region
