
Re: Random quotes in the top left corner

by mr_mischief (Monsignor)
on Apr 26, 2005 at 16:23 UTC

in reply to Random quotes in the top left corner

The KISS principle has been around longer than Perl. It's a great thing for any programmer to remember, or any engineer, any web page designer, any cabinet maker, any essayist...

The whole point of Keep It Simple, Stupid is that many of the people who design and implement new things fall somewhere on the scale from bright through very bright to genius. It's a matter of pride and skill to be able to make things very complicated and still understand them. However, there are several reasons not to make things complicated.

  1. Complicated things are harder for an outsider to understand and fix later.
  2. Complicated things are generally less robust and break more easily.
  3. People working with complicated things tend to lose sight of the original intended purpose of that item, and all kinds of bells and whistles get added at the expense of the main functions.
  4. No matter how smart you are, the jobs of programmer, engineer, essayist, cabinet maker, et cetera are more or less about managing complexity in the first place. Past a certain point of added complexity, even the smartest people can't hold enough of a project in their heads at once. Paragraphs, chapters, subroutines, modules, reusable parts, and templates (both the material-working and programming kinds) are tools to help localize complexity -- that is, they raise the complexity a manageable amount in several places so the overall complexity can be reduced.

Almost anyone can screw together premade cabinets that have been disassembled if they have the directions and enough patience. It's the cabinet maker who takes a bunch of lumber, planes it, makes the right cuts, puts the wood on the router table with the right jigs, and sands out the rough spots. This is managing complexity: taking a bunch of raw materials, making parts that connect, and then connecting the parts. Having premade screws in a package and glue in a bottle is of course helpful. We as programmers are lucky that many of our parts are already made.

Almost anyone, likewise, can figure out a Hello World program if given a reference manual on a language. Fewer people can reason about a 100-line program with no formal flow control. Even fewer can reason about a 1000-line program with no formal flow control. Most people can understand a program with ten subroutines, each ten lines long and doing one well-defined thing. If those ten each call ten more well-defined subs, that's not much of a stretch either. Three 40-line subroutines can of course be as reasonable as ten ten-line ones, depending on the problem you're trying to solve. Loops, subroutines, modules, blocks, and a certain level of syntactic sugar are all helpful precisely because they let one focus on the steps to solve a problem at one moment and on how to break down an individual step into substeps at another time.
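To make that concrete, here's a minimal sketch of that kind of decomposition in Perl -- a hypothetical little report program where each sub does one well-defined thing and the top level reads like an outline. The names and the data are invented purely for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Each sub does one well-defined thing; the top level reads as an outline.

sub read_records {
    my ($text) = @_;
    # One "name,amount" record per line.
    return map { [ split /,/ ] } split /\n/, $text;
}

sub total_amounts {
    my (@records) = @_;
    my $sum = 0;
    $sum += $_->[1] for @records;
    return $sum;
}

sub format_report {
    my ($count, $total) = @_;
    return "Processed $count records, total $total\n";
}

my @records = read_records("apples,3\noranges,5");
print format_report( scalar @records, total_amounts(@records) );
```

Reading the last two lines tells you what the program does without knowing how any step is implemented; each sub can then be understood, tested, or replaced on its own.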

There are of course many other tools to help make things less complex, but the best tool to manage complexity is a well-trained and/or insightful mind. Knowing where you can eliminate complexity and where it's safe to have a little more than usual takes training, practice, or a really good eye -- preferably all three.

It's probably the art and science of managing complexity that accounts for most of the huge differences among lines of working code per day produced by programmers using the same language. These differences are not usually measured in numbers of individual lines, but in multiples or even orders of magnitude.

Christopher E. Stith

Re^2: Random quotes in the top left corner
by willyyam (Priest) on Apr 27, 2005 at 12:59 UTC

    2. Complicated things are generally less robust and break easier.
    An interesting counter-example is an ecosystem - more components and interactions equal more robustness.

      That is a great counterexample. Note two things about that, too:

      The complexity in an ecosystem comes from nature and not from man. The complexity of an ecosystem is such an obstacle to our full understanding of it that only in the last few decades have we started to understand the damage done to the systems via the damage done to their parts.

      So, it's a great counterexample to one rule, and a great reinforcing example to another.

      Christopher E. Stith
        Furthermore, following the ecosystem example, the simplest organisms are the ones that survive the longest: cockroaches, one-celled organisms. In nature as well, things break down. Complexity invites this :) Simple things are beautiful, elegant, rare...
        The KISS principle could still be said to hold for biological systems, and for physical systems in a broader sense.

        Inorganic and organic components are in constant flux -- they aggregate/grow, break/mutate, fuse/recombine, decompose/die, and recompose/create over and over. Over time the stable constellations and processes dominate, and the fragile break and get recycled.

        To some degree it is the same with software. One of the most successful ecosystems was and is Unix, to a large degree because of its clean design principles -- its kernel-and-shells structure and its process & IPC model (I/O redirection, pipes, filters). The Unix design is KISS: simple building blocks and interfaces. (And that shows in Perl too.)
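That pipes-and-filters interface can be driven directly from Perl -- a small sketch, assuming a Unix-like system with the usual coreutils on the PATH. Each stage speaks the one universal interface, a stream of bytes, and neither knows nor cares what is upstream or downstream of it:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Compose simple building blocks through the shell's pipe operator,
# then read the combined result as just another filehandle.
open my $pipe, '-|', q{printf 'pear\napple\napple\n' | sort | uniq -c}
    or die "cannot open pipe: $!";
while ( my $line = <$pipe> ) {
    print $line;    # uniq -c output: a count, then the word
}
close $pipe;
```

The same few primitives (a readable stream in, a writable stream out) are all any filter needs to participate, which is why such pipelines compose so freely.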

        The periodic system and DNA are relatively simple component toolkits too, when we look at the interfaces. The ways to combine these bricks, though, are virtually endless, and even though the basic processes of (chemical and biological) attraction/repulsion and combination/selection are simple too, the resulting combinatorial universe, and thus the complex systems, are mind-blowing -- given enough time to evolve. From a cloud of hydrogen to Mozart.

        We don't build software that way, yet. It will probably require the next generation of massively parallel computers combined with facilities for automatic program mutation (refactoring) and selection. And mastering that kind of complexity will require an even stronger KISS focus on interfaces and basic processes than today's.
        -- allan

        As the eternal tranquility of Truth reveals itself to us, this very place is the Land of Lotuses

        -- Hakuin Ekaku Zenji
      An interesting counter-example is an ecosystem - more components and interactions equal more robustness.

      What's so robust about an "eco-system"? What functional elements of the "eco-system" are you claiming this robustness for?

      If I've very carefully selectively bred, say, an ant colony to create tunnels in patterns that represent the solution set for a given computation, I've got a very fragile system, not a very robust one. I'd need massive amounts of parallelism to match the correctness of even a small microcomputer, in order to statistically correct for all the flaws in the individual ants.

      Ecosystems are only robust in that it's reasonably hard to completely disrupt all biological processes in a given area, due to sheer numbers. But then again, it's even harder to destroy all geological processes, let alone radiation processes, due to an even bigger problem of scale.

      And even when we examine ecosystems, we find that the small, simple organisms (like bacteria, grass, and insects) often tend to outlast the big, complicated ones (dodos, dinosaurs, and sabre-toothed tigers).

      K.I.S.S. is a good principle.
