http://www.perlmonks.org?node_id=206026


in reply to Re: Re^4: Cheap idioms
in thread Cheap idioms

Well, you have a point about the 100,000 files - at first glance. But if you are processing 100,000 files, is the slurp idiom really the best place to optimize? Don't you think you would get much more than a 1% per-file-open speed-up by optimizing some other, more heavily used part of the script? And if you really have optimized everything else so well that the file slurp makes any difference, maybe you should be using C instead of Perl.
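For reference, the slurp idiom under discussion is presumably along these lines (the exact form in the parent thread may differ slightly); it reads a whole file into a scalar in one expression:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Slurp a file by localizing @ARGV and the input record separator.
    # The list assignment puts $file into @ARGV and leaves $/ undef,
    # so a single <> reads the entire file in one gulp.
    sub slurp {
        my ($file) = @_;
        return do { local ( @ARGV, $/ ) = $file; <> };
    }

    my $text = slurp('some_file.txt');   # hypothetical file name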

Remember - premature optimization is the root of all evil. Don't optimize before you need to. The corollary of this is that you shouldn't micro-optimize parts of the script in anticipation of possible performance bottlenecks. When you do hit one, profile your script and work on the most processing-intensive part.
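And measuring whether the file read even matters is cheap. A minimal sketch with the core Benchmark module (the file name and timing budget here are made up), comparing the slurp against a line-by-line read:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    my $file = 'some_file.txt';    # hypothetical test file

    # Run each variant for at least 5 CPU seconds and compare rates.
    cmpthese( -5, {
        slurp => sub {
            my $text = do { local ( @ARGV, $/ ) = $file; <> };
        },
        lines => sub {
            open my $fh, '<', $file or die "open $file: $!";
            my $text = '';
            $text .= $_ while <$fh>;
            close $fh;
        },
    } );

If the difference doesn't show up against the rest of the script's runtime, there is nothing to optimize here.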

Remember: programs are letters from one programmer to another. The fact that computers can execute them is only incidental.

Makeshifts last the longest.