http://www.perlmonks.org?node_id=114351

I realize that this is not directly Perl-related, but I've seen the topic bounce around several times in both the CB and SOPW. I enjoyed reading the following article by Peter Coffee and thought that others here in the monastery might enjoy it as well. He makes some good points, though I'm sure they apply more to advanced programmers than to newbies (but then newbies stand to gain much from the experience, if nothing else).

I asked for his permission to include the entire text of his article, since there is no direct URL to it and it is distributed via an e-mail newsletter. Here is his response:

Subject: It's not my copyright, but...
Date: Fri, 21 Sep 2001 18:33:10 -0400
From: peter_coffee@ziffdavis.com
To: jlongino@jaguar1.usouthal.edu

If you include the hyperlink for subscribing to the newsletter, I can't imagine my publisher having a problem with your sharing the (attributed) content. Thanks for asking.

- Peter

So, as per his request, here is the link to subscribe: eNewsletters from Ziff Davis Media. And here is the article -- enjoy!

REUSING CODE IS GOOD; REINVENTION MIGHT BE BETTER

-- By Peter Coffee --

Have you ever heard someone use the phrase, "reinventing the wheel," in a positive sense? In my experience, no one likes the idea: the implication is always either (1) "we didn't know we could have just cloned that" or (2) "these people were either too proud, or too stupid, to use what was already working."

In the case of software development, though, I wonder if there's something to be said for starting over from time to time, instead of falling victim to viral programming: that is, to the rapid spread of the first solution good enough to work at all.

Sorting algorithms are perhaps the canonical example: There are so many of them, and the good ones all trend toward the same limit of O(n log n) performance for a list of n items, but their performance in specific situations (partially ordered input list, for example) can vary greatly. Some require much more memory than others; some lend themselves to parallel-processing environments; some can approach O(n) performance if you know enough about the data going in (see link below).

If you think, "Well, we have a sort routine in the library," you can cripple an application's performance. Some wheels are worth reinventing.
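(A quick Perl sketch from me, not part of Coffee's article: one way the library sort loses to a "reinvented" one is when you know something about your data. If the keys are known to be small non-negative integers -- ages, say -- a counting sort runs in roughly O(n + k) time, which a general comparison sort can't match. The 0..119 range and the counting_sort name below are just illustrative choices.)

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Counting sort: roughly O(n + k) for n integers known to lie in 0 .. $max.
    # It only wins because we know something about the data going in.
    sub counting_sort {
        my ($max, @items) = @_;
        my @count = (0) x ($max + 1);
        $count[$_]++ for @items;                     # tally each value
        return map { ($_) x $count[$_] } 0 .. $max;  # expand the tallies in order
    }

    my @ages   = map { int rand 120 } 1 .. 100_000;  # values known to be 0 .. 119
    my @sorted = counting_sort(119, @ages);

    # The library wheel, for comparison:
    # my @sorted = sort { $a <=> $b } @ages;

For arbitrary keys, of course, the library sort is still the right wheel.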

Software defect trend analysis, for example in the reports produced by Reasoning Inc.'s automated source code inspection tools (see link below), has sometimes found odd concentrations of similar errors in long-lived projects. When the history of those errors is traced, it sometimes turns out that copying and pasting from an early source code module has proliferated a subtle conceptual error throughout other code. When code is going to be reused, it needs to be evaluated at least for correctness, but better still for whether it's good enough to reuse instead of reinventing.
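(Again my illustration, not Coffee's: a hypothetical example of the kind of pasted conceptual error he describes -- an off-by-one slice that "works" well enough on the first data set and then gets copied wherever it is needed.)

    # Hypothetical: intended to return the last $n items of a list.
    sub last_n {
        my ($n, @list) = @_;
        return @list[ $#list - $n .. $#list ];   # BUG: actually returns $n + 1 items
    }
    # The correct slice is @list[ $#list - $n + 1 .. $#list ], but once the
    # wrong version has been pasted into other modules, the same subtle
    # error shows up everywhere the original did.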

Critics of the C and C++ programming languages have been known to apply the "viral" description to their rapid spread and continued popularity: The first compiler written for a new platform, they opine, will be used to write its first decent operating system and will become the lingua franca for its mainstream applications, even if other languages (whose compilers take longer to port) might have yielded higher productivity in writing more reliable code--if only people had been willing to wait a little longer to get started. (See links below.)

The next time someone asks, "Are we reinventing the wheel here?" don't assume that you must prove you are not. "The wheel we have now is square," may be the more appropriate response.

"Experience is not what happens to you. It is what you do with what happens to you." -- Aldous Huxley