This morning, my friend Dan and I were discussing our particular tastes in code. We clearly have divergent tastes. I write very methodical, structured, consistent code. Dan, while being a very good programmer, writes code that just irritates me. It is simple and concise (usually, when he isn't writing bit vector stuff), and gets the job done. His feeling is that there is beauty in simplicity. I look at it and see something inelegant. Sort of like using a shell script when perl could be used.

I'll give you an example. I wrote the following code in perl because it would be prettier, although I could have easily done it in the shell (yes, I am a shell warrior).

#!/usr/bin/perl -w

use strict;
use Carp;

foreach my $file (@ARGV) {
    next unless $file =~ m/ctl$/;
    print "archiving $file\n";
    system "archivedvd $file\n";
    sleep 60;
    system "verifydvd $file\n";
}
Now, this can be done in the shell, as well:
#!/bin/ksh

for file in $*; do
    echo "archiving $file"
    archivedvd $file
    sleep 60
    verifydvd $file
done
Perhaps this is a bad example because it is too simple to demonstrate why I chose perl, so I will elaborate. I chose perl not because it was necessarily the best tool for the job, nor because there was any particular advantage to it at all. I chose perl because the method is important to me. I chose perl because I think perl is more elegant than using echo, and I like the syntax. Should I ever need to modify the script, perl is far more flexible than the shell. So the script is less efficient than using the shell (ksh uses less RAM and processor time than a perl interpreter).

Dan, of course, would scoff at this approach. "Dep," he'd say, "why the hell would you use perl there? You're just writing a shell script in perl. Simplicity is important." Today, he even gave me a document from the Lisp folks (Dan is an MIT alum, and thinks Lisp is cool). I present it here, with my comments below.

Let me say this article has given me horrible indigestion, and I have not been able to get any work done knowing that people like this are out there.

Yeah, I am that irritated.

Call me obsessive-compulsive, anal-retentive, whatever, but I just cannot get past the idea that there are people who think it is ever okay to cut a methodological or ritual corner. Ever!

I have always found the concept of hacking together something (Larry calls this "Whipuptitude") repulsive. I like a shell or perl one-liner as much as the next guy. But if I use it more than once, I write it out properly, and make an application of it. The above article has this to say:

    Consistency can be sacrificed to achieve completeness if simplicity is retained; especially worthless is consistency of interface.

Excuse me? Consistency of interface is worthless??? Argh! He goes on:

Maybe I should give some background here, but I think most of you know this stuff. Right now, I am auditing a large perl site installation. It has various components, including JavaScript, Oracle PL/SQL, and Bourne shell, and it stretches over roughly 6 public webservers and 2 database servers. I come across some really ugly code. It gets the job done. It's even simple. But it is not elegant. It is extraordinarily fragile. Its author is no longer employed here. We don't know why it works, because the interface and code are inconsistent. We don't know if, when, or how it will break.

And it's my job to fix these problems or re-engineer them so that in the future we can just "plug in" the code I have written. It's time and money spent now to save us another prolonged site audit in the future.

I feel like I am fighting a war against lazy programmers who believe that "worse is sometimes better."

Don't get me wrong, laziness is cool. My above example (the shell script vs. perl script example) is a good one. It only runs two commands, and leaves 60 seconds in between for the robotics to return. The reason for the script is that I am lazy: I don't want to type that sequence of commands in for each of the 700 DVDs I'm going to burn in the next 6 months.

Laziness in code, however, is criminal.

I really want to hear what the rest of you have to say about this. We are here at the Monastery because we share a common ideology. What is it? Do you have pride in your code? Do you think that short, concise, and simple (but hackish, ugly, and inconsistent) code is better than going the extra mile? Than spending 80% of your time on 20% of the code?

We live in a time of gigahertz processors and terabyte RAIDs. Efficiency is important, but not at the expense of an elegant and useful API. And not at the expense of man-hours de-engineering your work. <!-- chipmunk, if this offends you like my last rant did, perhaps you should consider changing your ways. -->

brother dep

Laziness, Impatience, Hubris, and Generosity.

Replies are listed 'Best First'.
Re: The qq{worse is better} approach (discussion)
by clemburg (Curate) on Jul 02, 2001 at 19:27 UTC

    I think you misinterpret the intention of the article you cite if you think it argues for "sloppy code".

    The main point of the article, IMHO, is that it is better to *actually solve a problem partially* than to *try to solve it fully*.

    Trying to "solve a problem fully" usually means "not solving the problem", because the problem gets out of hand - it takes too long to solve it, it is too expensive, you don't have the experience necessary, and so on.

    But "solving a problem partially" does not mean writing sloppy code. It means consciously excluding part of the problem from the requirements, and documenting that decision in the form of error handling, documented limits of capability, etc.
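    That distinction can be sketched in shell (the helper name and the .ctl-only limit here are invented for illustration): the script states what it refuses to handle, and fails loudly outside that boundary instead of half-supporting everything.

```shell
#!/bin/sh
# Sketch only: a "partial solution" whose limits are conscious and documented.
# Hypothetical documented limit: this tool handles *.ctl files ONLY.
# Anything else is a reported error, not silent half-support.
check_input() {
    case $1 in
        *.ctl)
            return 0 ;;
        *)
            echo "error: $1: outside documented scope (only .ctl files supported)" >&2
            return 1 ;;
    esac
}

check_input disc1.ctl && echo "disc1.ctl accepted"
check_input notes.txt || echo "notes.txt rejected, as documented"
```

    The point is not the few lines of pattern matching; it is that the excluded part of the problem is stated, so the next maintainer knows the boundary was chosen, not forgotten.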

    Christian Lemburg
    Brainbench MVP for Perl

Re (tilly) 1: The qq{worse is better} approach (discussion)
by tilly (Archbishop) on Jul 03, 2001 at 13:03 UTC
    We are programmers, not priests.

    I don't want to have a programming religion. I don't want to design my programs a certain way because I have been converted to a religion that says I must.

    Please tell me why Dan is wrong. Don't tell me why it does not match your programming religion. Don't tell me that he didn't use the language you wanted to see used and expect me to think that he should have. Don't tell me that he violated a sacred design principle - after all who came up with that design principle and why?

    While you are at it, tell me why Worse is Better is so obviously wrong. Before you do so please read (again if need be) about Extreme Programming and compare their programming strategy with what Worse is Better discusses. Please do not neglect explaining to me why I should perpetuate interface mistakes that I made 6 months ago into the future. And if I don't try changing interfaces from one program to another, then please explain to me how I am ever supposed to get the experience about what works and what doesn't to let me design better interfaces when I start on a new system.

    A final note. Please read what Richard Feynman had to say about Cargo Cult science. The point that is most important there is his commentary on what scientific integrity requires. That is: when you have theories about the world, you should not only explain that which fits your theory; you should pay particular attention to ways in which you could have fooled yourself, and to things that do not quite fit.

    This applies in particular to your theories about programming. It is a good thing to have theories about how to program well. Without them, how do you know what to aim for while programming? However, your theories are probably going to be wrong. Therefore, when you encounter things that do not fit your theories, be willing to challenge your theories. For instance, if you meet a good programmer who disagrees with you on basic tenets of how to program, this is a sign that there may be something to this programming business that you do not know. That doesn't mean that you are wrong, or that the other person is right. But it means that the belief in question is one you should understand, not dismiss or assault.

    Incidentally, based on your descriptions, I am guessing that my personal beliefs on how to program well are substantially closer to Dan's position than yours...

Re: The qq{worse is better} approach (discussion)
by Sherlock (Deacon) on Jul 02, 2001 at 18:43 UTC
    First of all, I'd like to point out that I'm a big fan of simplicity. I've often found myself fighting with others because they want to create a solution that involves so much overkill that they'll spend countless hours developing cool code that will never be used. That drives me nuts. However, in your post, you quoted:

    Completeness - the design must cover as many important situations as is practical. All reasonably expected cases should be covered. Completeness can be sacrificed in favor of any other quality. In fact, completeness must be sacrificed whenever implementation simplicity is jeopardized. Consistency can be sacrificed to achieve completeness if simplicity is retained; especially worthless is consistency of interface.

    Frankly, that almost made me gag. In my book, your code should be three things: complete, concise, and consistent (also known as the 3 C's). I think that if you put all three of these elements together, you'll end up with code that is "simple." You may not see this as "elegant," but I think most programmers have very different opinions of exactly what elegant is.

    For example, I was working on a project that had some very complex data structures. I like to encapsulate my data into objects that make sense. On this project however, this left me with level after level of data that I needed to sift through in order to make the object that I was constructing function the way I wanted it to. In that aspect, I'd say that what I was creating was very inelegant. However, the user interface to my class worked beautifully. Because I had put so much work into the back end of the object, the user could access data quickly and easily. Mark one up for simplicity and elegance in that case.

    So, I guess it's hard to say what's elegant and what's not. In my situation two parts of the same object were very different when it came to how elegant I thought they were. (I'm sure others might disagree with me about the level of elegance.) However, I can strongly say that the entire object that I had constructed was complete, concise, and consistent. I don't think you should ever sacrifice any of these, even for one of the others - they should all work together to give you a simple, and hopefully, elegant, piece of code.

    - Sherlock

    Skepticism is the source of knowledge as much as knowledge is the source of skepticism.
Re: The qq{worse is better} approach (discussion)
by Masem (Monsignor) on Jul 02, 2001 at 19:03 UTC
    I think a big stipulation in the argument that article tries to convince us of is that the code is going to be used by more than one person. Personal scripts, such as the one above, need not be fully qualified, since practically no one else is going to see them. Now, if in the future you decide that your script is useful to more than just yourself, then I would fully expect that you'd add completeness checks and every little bit of error checking, so that the script is secure, fault-tolerant inasmuch as it can be, and as adaptable as you expect it to be before you release it into the wild. But while it sits on your private box, doing nothing except when you tell it to, sloppy programming is ok (*).

    In a work situation, however, that becomes unreasonable. Any code that is in use, or that might be seen by other developers, should not be sloppy, and while past workplace rules may have encouraged sloppy code, the new IT workplace encourages frequent code review, extreme programming, setting fixed implementation specs early, and other methods that allow more than one pair of eyes to see code. The code you've inherited is crap, and unfortunately there's not much you can do now to improve that, but what you can do is encourage your coworkers to review your code, and get them to allow you to review theirs. Find places where the logic seems questionable, where comments are lacking, or where something just doesn't make sense. And this should be done on paper; code review doesn't just mean making sure the code works, since most likely the original programmer already did this. Sloppy programming tends to lead to code that doesn't make sense, while code that does make sense tends to be clean. But again, you've explained your workplace woes, and it sounds like you need to invest in a major clue hammer to get anything working.

    Now mind you, there are points in the workplace, or in non-private code, where a quick sloppy program will suffice, assuming that it's only a temporary solution or a way to test other parts of neat, complete code. For example, network server program A (your major code base) sends output, and you want to verify that output, so you write a dinky little program B that connects to A and sucks down whatever A sends to STDOUT. The code for B is probably straightforward and simple enough that it needs no real comments (much like your code snippet above), and unless other programmers really need it, it can stand alone as an incomplete program. Even if others do need it, you can probably add just enough comments to point people to what the program does so that they won't be confused by it. Of course, if someone says that B should become a full-fledged module, then I would expect it to be fleshed out much more.
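    That throwaway "program B" can be sketched without a real network at all; here the server is faked with a shell function (a stand-in invented for the sketch), and B just reads and echoes whatever it is given.

```shell
#!/bin/sh
# Sketch of a "dinky little program B": no options, no error handling,
# just suck down whatever "A" emits. server_a is a fake stand-in for the
# real network server; a real B would read from a socket instead of a pipe.
server_a() {
    printf 'HELLO\nDATA 1\nDATA 2\nBYE\n'
}

slurp_b() {
    while IFS= read -r line; do
        echo "B got: $line"
    done
}

server_a | slurp_b
```

    This is exactly the kind of code that is fine to leave sloppy while it stays a private test harness, and worth fleshing out only if it graduates into something others depend on.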

    (*) Note that sloppy programming doesn't necessarily mean bad programming.

    Dr. Michael K. Neylon - "You've left the lens cap of your mind on again, Pinky" - The Brain
Re: The qq{worse is better} approach (discussion)
by tachyon (Chancellor) on Jul 02, 2001 at 19:24 UTC

    I think it is important to remember that selecting the right tool for the job is an important part of being a good programmer. I think everyone would agree you wouldn't use C++ for your example; however, that doesn't make a case for using Perl over a shell script either. According to Damian Conway there is a linear relationship between the length of a piece of code and the number of bugs it will have, and that shell script is shorter...

    Like you I like to write nice clean self contained elegant efficient reusable code which I hoard in my personal snippets chest. Most of this stuff is junk that just litters the hard drive but sometime these things come in handy again and I am glad I wrote them well in the first place. Sadly as time passes my definition of good code changes in the light of increased experience. Some of my old code is just horrible. But I loved it at the time.

    Be careful you don't fall into the Perl-for-everything trap like I did. It's a disease, you know, with no known cure.



    PS Suppose you won't like my sig ;-)


      According to Damian Conway there is a linear relationship between the length of a piece of code and the number of bugs it will have and that shell script is shorter...

      A better reference for this can be found in Code Complete, chapter 21, "How Program Size Affects Construction", page 523, from a study done in 1977 (Jones, T. Capers. 1977. Program Quality and Programmer Productivity. IBM Technical Report TR 02.764, January, 42-78.).

      In short:

      Project Size (lines of code)    Error Density (errors per 1K lines of code)
      < 2K                            0 - 25
      2K - 16K                        0 - 40
      16K - 64K                       0.5 - 50
      64K - 512K                      2 - 70
      > 512K                          4 - 100

      Page 610 (in Chapter 25, "Unit Testing") cites "15 to 50 errors per 1000 lines of code for delivered software" as "industry average experience".

      Note however that this data is very old, and new methods for software construction may give better results. Also, some organizations, most notably NASA projects, have achieved much, much better error rates, mostly by introducing rigorous testing and code review schemes. Also note the enormous variance in the data above.

      Christian Lemburg
      Brainbench MVP for Perl

Re: The qq{worse is better} approach (discussion)
by tadman (Prior) on Jul 03, 2001 at 01:14 UTC
    The spirit of the "Worse is Better" argument is generally not about implementation, but about design. Implementation is an artifact of design, it is a logical consequence. An implementation should be true to the design, and it should be competent. Nobody is advocating an implementation that does not work, or is a "hack" of poor quality.

    In the context of WIB, "worse" strictly means "not as good", and is certainly not intended to mean "bad". Fundamentally, a case is being made that the simple, "not as good" solution is frequently better than the "better" solution. This is not unlike the old adage that "less is more". The number one principle of WIB is simplicity of design, a.k.a. "Keep It Simple Stupid", or "K.I.S.S.".

    Simplicity of Design
    So many times, some truly brilliant people have over-thought, over-designed, and over-engineered something to the nth degree. The end product is an impressive sight, something so awesome that it could probably solve the riddle of the universe. If only you knew how to use it, that is. These works of "genius" are often inscrutable to anyone but the author and their close friends. As a result, nobody uses them because they are useless, or they are used improperly and erroneously perceived as useless.

    This is why simplicity of design is vital. If the design is too complicated to be understood easily, then no matter how important or powerful the product is, it will not be used effectively.

    Elegance is a way of taking something complex and making it very easy to use. It is about hiding the scary, complex things, and making them disappear. Then people wonder at their own skill, forgetting entirely the underlying power of their tool. In a single line of Perl you use thousands of lines of C code written by many individuals that are quite possibly far more skilled than you are, yet you aren't forced to learn more than is necessary to use it. The design humbles the program, making the power useful, it does not force you to raise yourself to the level of the program. You do not need a PhD to use Perl.

    Avoid The "Golden Hammer"
    The "Golden Hammer" is a program that is far too good for the task at hand. It is finely crafted, robust, elegant and powerful. It is everything that you could ever want in a hammer, and so much more. Yet a regular hammer will do the job just as well.

    The "Golden Hammer" is not to be confused with the "Swiss Army Knife", as a Golden Hammer can only hammer, although it does it extraordinarily well. A Golden Hammer, put in a physical perspective, would have four wheel drive, bucket seats, air conditioning, a surround sound stereo, window wipers, traction control, ABS, and a really big hammer on the front that could nail anything in a single stroke with a precision of one-millionth of an inch. Unfortunately, it also took twenty to five hundred times longer to create than a regular hammer and takes three weeks of intensive training to operate safely.

    WIB absolutely does not advocate using a rock, a screwdriver, or a piece of wood when a hammer is required. It simply advocates using a well designed, quality hammer that is easy to use and is understandable.

    The interface should be simple, and sometimes simplicity introduces inconsistency. For instance, using a hammer's "claw" feature requires a change of grip from the normal "bash" feature. A "better" hammer could allow you to do both things from a single position, but would this truly improve the utility of the hammer? Would it make more sense, or would it just be showing off your engineering prowess?

    Good Enough but Not Too Good
    Simply put, make your program functional enough to get the job done well, simple enough to be understandable by others, and correct enough that it works under a wide range of circumstances.

    It is better to have two problems solved than one problem over-solved.
Re: The qq{worse is better} approach (discussion)
by runrig (Abbot) on Jul 02, 2001 at 20:10 UTC
    In your example, there's no reason to use perl over ksh. There is one extra step you have in the perl that's not in the ksh version, but it could be added with no problem (oh yeah, and ksh has a 'print' statement also):
    [[ $file = *ctl ]] || continue
    If you were doing lots of other things, like FTP'ing files and wanting to make decisions based on the status of every step (you're not even checking the return status of your system calls), then I might make an argument for perl. In this case it just doesn't matter.
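    For what it's worth, runrig's missing-status-check point can be sketched like this; archivedvd and verifydvd are stubbed out here, since the real robotics commands (and the 60-second sleep) obviously can't run in a sketch.

```shell
#!/bin/sh
# Sketch: the original loop plus the exit-status checks it lacked.
# archivedvd/verifydvd are stubs standing in for the real commands;
# the sleep between them is omitted so the sketch runs quickly.
archivedvd() { echo "(stub) archivedvd $1"; }
verifydvd()  { echo "(stub) verifydvd $1"; }

archive_all() {
    for file in "$@"; do
        case $file in
            *ctl) ;;
            *) continue ;;
        esac
        echo "archiving $file"
        archivedvd "$file" || { echo "archive failed: $file" >&2; return 1; }
        verifydvd "$file"  || { echo "verify failed: $file" >&2; return 1; }
    done
}

archive_all disc1.ctl notes.txt disc2.ctl
```

    With the real commands in place, a failed archive would now stop the run instead of blithely verifying a disc that was never written.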
Re: The qq{worse is better} approach (discussion)
by scott (Chaplain) on Jul 04, 2001 at 02:18 UTC

    I don't think I can improve on the philosophical answers already given, but I can offer a somewhat more material example of something like the `golden hammer'.

    Check out this (700kB, sorry) image.

    This is a little something we've been working on here in our lab. We're in the process of a large upgrade and it's still only partially reassembled so please excuse the disorder.

    In NSF-please-give-us-cash puffery speech, this is an Ultra High Vacuum Scanning Tunnelling Microscope/Scanning Force Microscope (UHV STM/SFM) with an attached Low Energy Electron Diffraction (LEED) system, a molecular monolayer evaporation chamber, and ancillary stuff -- well, all right, `stuff' isn't a word usually used in NSF grant proposals ...

    It can `see' atoms by, among other methods, holding a sharp tip one hundred billionth of a meter above a sample and measuring currents flowing between the tip and sample of a few hundred trillionths of an Amp. It does this after removing all but one trillionth of the atmosphere from the chamber.

    The shiny bits are stainless steel. It cost about $250,000.00.

    So much for the impressive-sounding hype. The point is that it's a very sophisticated, complex, subtle, and delicate device. Something that will be used heavily for 10-15 years to produce a high percentage of all the data that will come out of this lab during that time. About five people will probably get their PhD from it -- not me, unfortunately, I just get to do most of the construction. :(

    Now look closely. You'll see that it's kept off the floor by stacks of concrete blocks.

    The white squarish bit in the middle on top (a little to the right of the `smoke stack') is a block of one-inch plywood, held on by two bungee cords.

    One of the bits of culture of our lab is the phrase `you have to know when to go fast and when to go slow'.

    The plywood is a base on which to support a $5,000 temperature probe. There are no plans to replace the plywood with a custom machined base with levelers and a clever way to secure the probe.

    There are no plans to replace the concrete blocks with a sophisticated, computer controlled, dynamic damping vibration isolation table. The bricks (each with a single issue of the local school newspaper between them) do their job just fine. As does the slab of plywood.

    I could have spent weeks designing the temperature probe base but there was no reason to. I could have (easily) spent $25,000 (of my boss' money, of course! :) ) on vibration isolation but there's nothing wrong with our solution, even if it is the `DOS batch file' of shell scripts.

    While it is often satisfying to `do it right' -- I removed a set of bolts last week because I didn't like the fact that they were the only ones with 12-sided heads on the system -- it's really `doing it' that's important.

    All that is gold does not glitter,
    Not all those who wander are lost ...
    J. R. R. Tolkien