in reply to
Re: Perl Best Practices book: is this one a best practice or a dodgy practice?
in thread Perl Best Practices book: is this one a best practice or a dodgy practice?
The only reason to read the original file line by line would be that it's very large...
Surely not the "only" reason. Here are five other possible reasons, just off the top of my head. You might use a line-by-line approach:
- to allow the application to be used interactively
- to allow the application to be used as part of an unbuffered pipeline
- to minimize memory usage even on small inputs, in an embedded (or otherwise memory-limited) environment
- to restrict the transformation to individual lines (perhaps for backwards compatibility with an earlier utility)
- to handle input data that is always runtime-generated (or otherwise highly volatile), so rerunnability doesn't matter
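To make the pipeline/interactive cases concrete, here is a minimal sketch of a line-by-line filter. The transformation itself (s/foo/bar/) is a hypothetical stand-in; the point is that each line is read, transformed, and emitted immediately, so memory stays constant and downstream consumers see output without waiting for EOF:

```perl
use strict;
use warnings;

# transform_line: the per-line transformation (hypothetical example --
# replace whole-word "foo" with "bar"); a real tool would do whatever
# the original utility did to each line.
sub transform_line {
    my ($line) = @_;
    $line =~ s/\bfoo\b/bar/g;
    return $line;
}

unless (caller) {    # run the loop only when executed as a script
    $| = 1;          # unbuffer output for interactive/pipeline use
    while ( my $line = <STDIN> ) {
        print transform_line($line);
    }
}

1;
```

With `$|` set, each transformed line reaches the next stage of the pipeline as soon as its input line arrives.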
Note that I don't disagree with you that using temporary files is generally safer, if
you can afford the costs involved. After all, using temporary files is precisely what IO::InSitu does. I only question your claim that it's possible to deduce the necessity for temporary files from the particular I/O characteristics of the original example (or from any piece of undocumented code, for that matter).
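For comparison, here is a sketch of the safer temp-file strategy (the same general approach IO::InSitu takes, though this is not its actual interface): write the transformed output to a temporary file in full, then rename it over the original, so a crash partway through leaves the original intact. The helper name `rewrite_in_place` and the code-ref argument are my own invention for illustration:

```perl
use strict;
use warnings;
use File::Temp qw( tempfile );
use File::Copy qw( move );

# rewrite_in_place($file, $transform): apply a per-line code ref to
# $file, writing results to a temp file first. Only after the whole
# pass succeeds is the temp file moved over the original, so a
# mid-run failure never corrupts the input.
sub rewrite_in_place {
    my ($file, $transform) = @_;

    open my $in, '<', $file or die "Can't read '$file': $!";
    my ( $tmp, $tmpname ) = tempfile( DIR => '.', UNLINK => 0 );

    while ( my $line = <$in> ) {
        print {$tmp} $transform->($line);
    }

    close $in;
    close $tmp or die "Can't close '$tmpname': $!";
    move( $tmpname, $file ) or die "Can't replace '$file': $!";
    return;
}

1;
```

The cost, of course, is the extra disk space and the loss of streaming behavior, which is exactly the trade-off at issue in this thread.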