Performance Trap - Opening/Closing Files Inside a Loop
by Limbic~Region (Chancellor) on Dec 09, 2004 at 23:57 UTC
Limbic~Region has asked for the wisdom of the Perl Monks concerning the following question:
There is a long story behind this that involves a Java programmer asking for some help with Perl. I won't get into the particulars other than to say the question asked was:
What's the easiest way to loop through a comma delimited file and append the line minus 1 column into a new file that is the same name as the excluded column?
The file in question was about 20 lines long. I gave my disclaimer about normally using a module to handle CSV, but the following code should work:
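The code itself did not survive in this copy of the post, but the description is clear enough for a minimal sketch. This is my reconstruction, not the original: the sub name `split_csv_slow` is invented, and I am assuming the first column is the one being excluded and used as the output filename. Note that the output file is opened and closed once per input line.

```perl
use strict;
use warnings;

# Hypothetical reconstruction of the naive version: for each line,
# open the file named after column 1 in append mode, write the rest
# of the line, and close it again -- one open/close per input line.
sub split_csv_slow {
    my ($input) = @_;
    open my $in, '<', $input or die "Can't read $input: $!";
    while ( my $line = <$in> ) {
        chomp $line;
        my ($key, @rest) = split /,/, $line;
        open my $out, '>>', $key or die "Can't append to $key: $!";
        print {$out} join(',', @rest), "\n";
        close $out or die "Can't close $key: $!";
    }
    close $in;
}
```

For a 20-line file this is perfectly fine; the per-line open/close overhead only matters once the input grows large.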
I asked the Java programmer the next day how it worked and was informed that it was too slow and that a Java program was being written instead. Scratching my head, I asked if the same file I was shown before was the one actually being used. It wasn't - multiple files, each millions of lines long. Opening and closing file(s) that many times is bound to be slow. I offered the following modification* of the code, provided the column being excluded was fairly repetitive in the file:
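Again the actual code is missing from this copy; the sketch below is my reconstruction of the described modification (the name `split_csv_cached` and the column assumption are mine). The trick is a hash mapping each distinct key to an already-open filehandle, so each output file is opened exactly once no matter how many lines map to it.

```perl
use strict;
use warnings;

# Hypothetical reconstruction of the cached-filehandle version: keep one
# open append handle per distinct key, reusing it for every later line
# with the same key instead of reopening the file.
sub split_csv_cached {
    my ($input) = @_;
    my %fh_for;    # key column value => open filehandle
    open my $in, '<', $input or die "Can't read $input: $!";
    while ( my $line = <$in> ) {
        chomp $line;
        my ($key, @rest) = split /,/, $line;
        unless ( exists $fh_for{$key} ) {
            open my $out, '>>', $key or die "Can't append to $key: $!";
            $fh_for{$key} = $out;
        }
        print { $fh_for{$key} } join(',', @rest), "\n";
    }
    close $in;
    close $_ for values %fh_for;    # flush and release all handles at the end
}
```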
I explained that the reason for the disclaimer was that the hash only bought performance if a file had more than 1 line getting appended to it. Additionally, if there are too many unique files, memory and/or open file descriptors may cause a problem. I was then told that the Java code was nearly done but thanks anyway. *shrug* - exit stage right.
I think I am missing how Java is going to be that much faster. I assume Java is still going to open and close the file each time through the loop unless there is a similar trick. Given that I don't really know Java I could be out in left field here.
Leaving Java aside, is there a more run-time-efficient way in Perl than my second suggestion? I haven't given it a lot of thought because the Java developer is just being silly. It is a run-once-and-done script, so it would already be finished if the first version (wrapped in a tiny shell script) had been allowed to run. On the other hand, this is the sort of thing that I like to be aware of in the future. (Prior Planning Prevents Poor Performance)**
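One answer to both the question above and the file-descriptor caveat (this is my own suggestion, not from the original thread): sort the input on the key column first, e.g. with an external `sort -t, -k1,1`, then stream through it holding only one output handle at a time. Each file is still opened exactly once, but there is no hash of handles to exhaust memory or descriptors. The sub name `split_csv_sorted` is invented for the sketch.

```perl
use strict;
use warnings;

# Sketch of a sorted-input approach: because all lines with the same key
# are adjacent, we only need to reopen when the key changes, and we hold
# at most ONE output filehandle at any moment.
sub split_csv_sorted {
    my (@lines) = @_;    # lines pre-sorted on the key (first) column
    my ($current, $out) = ('', undef);
    for my $line (@lines) {
        chomp $line;
        my ($key, @rest) = split /,/, $line;
        if ( $key ne $current ) {
            close $out if $out;
            open $out, '>>', $key or die "Can't append to $key: $!";
            $current = $key;
        }
        print {$out} join(',', @rest), "\n";
    }
    close $out if $out;
}
```

The trade-off is the up-front cost of the sort, but external sorts handle multi-million-line files comfortably, and the write pass becomes strictly sequential.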
* The actual code used ARGV
Cheers - L~R
** I learned this in the military, but there were a couple extra expletive Ps