http://www.perlmonks.org?node_id=206026


in reply to Re: Re^4: Cheap idioms
in thread Cheap idioms

Well, you have a point about the 100,000 files - at first glance. But if you are processing 100,000 files, is the slurp idiom really the best place to optimize? Don't you think you could gain much more than a 1% speedup per file opened by optimizing some other, more heavily used part of the script? And if you really have optimized everything else so well that the file slurp makes a measurable difference, maybe you should be using C instead of Perl.
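For reference, a minimal sketch of the kind of slurp idiom in question - the filename here is just a stand-in:

    use strict;
    use warnings;

    my $file = 'example.txt';    # hypothetical input file

    # Localizing @ARGV and $/ in one list assignment puts $file into
    # @ARGV and leaves $/ undef, so the diamond operator returns the
    # entire file in a single read.
    my $contents = do { local ( @ARGV, $/ ) = $file; <> };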

Remember - premature optimization is the root of all evil. Don't optimize before you need to. The corollary of this is that you shouldn't micro-optimize parts of the script in anticipation of possible performance bottlenecks. When you do hit one, profile your script and work on the most processing-intensive part.
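If you do suspect the slurp itself, measure before rewriting anything. A minimal sketch using the core Benchmark module - the filename and the two variants are only stand-ins for whatever you actually want to compare:

    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    my $file = 'example.txt';    # hypothetical input file

    # Run each variant for at least 2 CPU seconds and print a
    # comparison table of iterations per second.
    cmpthese( -2, {
        diamond => sub {
            my $contents = do { local ( @ARGV, $/ ) = $file; <> };
        },
        open_fh => sub {
            open my $fh, '<', $file or die "Can't open $file: $!";
            my $contents = do { local $/; <$fh> };
        },
    } );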

Remember: programs are letters from one programmer to another. The fact that computers can execute them is only incidental.

Makeshifts last the longest.

Computers declared extraneous
by rir (Vicar) on Oct 17, 2002 at 15:53 UTC
    ... programs are letters from one programmer to another. The fact that computers can execute them is only incidental.

    Incidental!? What bunk!

    Few programs would exist if they did not perform desired functions by being executed.

    The readability of programs enhances their utility as tools and examples for programmers, but it is a side issue. Side issues can be important; after all, no one buys a car solely on the basis of its performing its primary purpose.

      "Always write your code as if the next maintainer will be a homicidal maniac who knows where you live."

      That saying is not bunk at all. A completely unreadable source is of no use to anyone, regardless of how optimized it is. What does it do? Do you know? How will you find an error if there is one? How do you know what is an error or not?

      The foremost issue is writing legible code. The very fact that you are writing Perl, as opposed to C or assembly, means you cannot be extraordinarily concerned with performance to begin with. So if you chose Perl over C, why was that? Take that reasoning one step further and you'll see that code legibility being more important than optimization follows logically.

      Makeshifts last the longest.

        I agree with the importance of writing clear code. But this isn't the point that I took issue with.

        Your logic is still flawed:

        "completely unreadable source is of no use to anyone,"

        Users run code that is "completely unreadable" to them all the time. Code whose source has been lost is sometimes used, and it may well be considered "completely unreadable". If such code were all that was available, it would be used often.

        If every programmer died today, do you seriously think that software would be unused tomorrow?

        "What does it do?"

        Don't you have to execute it before it does something?

        "Do you know?"

        You could execute it and find out.

        "How will you find an error if there is one?"

        You would execute it. If you cannot execute it, it has no function.

        "How do you know what is an error or not?"

        If it is not executable, it does nothing. Is that an error? If it executes and performs the desired functions, then it is correct.

        "The foremost issue is writing legible code."

        The foremost issue is writing code that does something, that executes, that performs a desired function. Legibility is an adjunct toward that end.

        You are like a person who says air isn't that important; you are taking it for granted.

        Your position is ridiculous. If you must cling to it, I hope it serves you well.