|Do you know where your variables are?|
If you are interested in technology and its continuing role in politics and society as a whole, I highly recommend Declan McCullagh's mailing list, Politech. Content is submitted and read by people ranging from lowly developers like me to some of the best minds in technology and politics from around the world, a combination you rarely find in one place. *grin*
"-The $165 million Mars Polar Lander probe was destroyed in its final
If I remember correctly, this wasn't, technically, a software error - it was an architecture error. (Strictly speaking, the famous unit mix-up doomed the Mars Climate Orbiter, the Polar Lander's sister mission, but the lesson is the same.) In this case, the software/firmware did exactly what it was designed to do; it was the two design teams that did not communicate correctly on a very important issue - units of measure (which is why all good high school physics teachers pound units into your head).
One piece of software was calculating in one system of units and passing its results off to another, which was calculating assuming a different one. I know it's a fine line, but it's not like a memory leak or similar 'bad programming practice'; the software was doing exactly as designed.
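The failure mode is easy to reproduce in miniature. Here's a sketch (in Python, with invented names - the real flight code was nothing like this) of two modules that are each correct in isolation but disagree about units, plus one common defense: making the unit travel with the number so a bare float never crosses the interface.

```python
# Team A reports thruster impulse in pound-force-seconds; Team B
# consumes it assuming newton-seconds. Each function is "correct"
# in isolation -- the bug lives entirely in the hand-off.

LBF_S_PER_N_S = 4.44822  # 1 lbf*s = 4.44822 N*s

def team_a_impulse_lbf_s():
    # Team A's module: returns impulse in pound-force-seconds.
    return 100.0

def team_b_burn_time(impulse_n_s, thrust_n=50.0):
    # Team B's module: expects impulse in newton-seconds.
    return impulse_n_s / thrust_n

# The silent mismatch: a bare float carries no units.
wrong = team_b_burn_time(team_a_impulse_lbf_s())   # 2.0 seconds

# One defense: wrap the value so the unit rides along with it,
# and conversion happens exactly once, at a named boundary.
class Impulse:
    def __init__(self, newton_seconds):
        self.n_s = newton_seconds

    @classmethod
    def from_lbf_s(cls, lbf_s):
        return cls(lbf_s * LBF_S_PER_N_S)

right = team_b_burn_time(Impulse.from_lbf_s(100.0).n_s)  # ~8.9 seconds

print(wrong, right)  # off by a factor of ~4.45
```

Both answers come from "correct" code; only the typed wrapper makes the disagreement visible at the interface, which is exactly where the two-team communication broke down.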
Whatever. Talk is cheap.
What I didn't read in this article (as with the hundreds of others before it) is how to implement an accountability structure that doesn't demolish innovation and put the fear of having a free thought into the head of every developer under the sun (or Sun, for that matter). Do we create a warranty system for the users? How is it implemented? Can we sue the software company? The implementer? The guy who wrote the code under a strict and irresponsible deadline set by management that usually has little or no development experience and will accept pressure from 'customers' to deliver over pressure from its own staff regarding quality, design and safety?
Personally, I think that the system is calming down, and the noise is starting to filter out. I think that, like the automobile industry, manufacturers are beginning to set up their own recalls (patches, versioning, etc.) in a similar fashion. You can get extended service plans with most enterprise-level software (and most desktop-level products as well).
However, what you don't see is a public that is appreciative of, or even vaguely knowledgeable about, the effort that goes into the magic of "You've Got Mail!" (tm).
They pop the hood on their car and are intimidated by the 'new' technologies of drive-by-wire, fuel injection, ABS, automatic traction control, etc., and are forgiving of a product life cycle that sees updates once a year AT THE MOST, because they have something physical that they can look at and say, "Yup, I see it, I don't get it, must be tough to do." Some platforms undergo changes at such subtle rates that the only difference to the user is a slight packaging change.
Also keep in mind that aside from the add-on electronics that mean nothing to the actual function of the automobile (we call them "bells and whistles" in the software industry: A/C, an audio/video system, turn indicators, interior lighting), the interface to the automobile is two (maybe three) pedals, a wheel and a shift unit (either manual or automatic).
But software... now, software is magic.
"Any sufficiently advanced technology is indistinguishable from magic" (Clarke's Third Law).
It makes writing a letter simple, changing and deleting content a snap. Formatting and printing, changing font styles, sizes, COLORS... cake. And putting it in an envelope is reserved for only certain correspondence; the rest goes by email.
I don't know what it is about software, but people want their next release almost as soon as the CD/DVD begins whirring on the install of their current product. They'll drive a '57 Chevy with pride and spend thousands to keep it running, but they'll be damned if their latest release of Quake renders blood-splatters across their 21" flat-screen monitor with anything less than life-like color and accuracy.
It has to be better, faster, with a better GUI feature set (did I mention three pedals and a wheel yet?), more robust memory management (there's only 2 gig addressable, and I have 37 other applications running, you know), and I want it before your current projected release date. If I'm on the beta list, I'll get upset if it crashes my machine, and if I try to run it on old hardware, I'll expect _you_ to figure out why and how to fix the compatibility issues I created, because I clamored for the new software but am unwilling to upgrade my hardware to meet the needs of the 'feature- and content-rich' interface it requires.
A good architect will know (because it's what we're taught) that there should be three ways to get to any function in software. Three. (How many gas pedals are there? Steering wheels?) That's because no two people think about a function the same way. In honesty, it's that we haven't tried to come up with standards that deal with expectation and change management, so instead of creating a paradigm, we allow the user to dictate how we develop.
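To make the "three ways in" rule concrete, here is a toy sketch (Python, with invented names - not any real GUI toolkit's API): one command registered once, reachable by menu path, keyboard shortcut, or toolbar id, so every route lands on the same implementation.

```python
# A toy command registry: one function, three routes to it.
# All names here are illustrative, not a real toolkit's API.

commands = {}

def register(name, func, menu=None, shortcut=None, toolbar=None):
    # Register one implementation under three user-facing entry points.
    commands[name] = {
        "func": func, "menu": menu, "shortcut": shortcut, "toolbar": toolbar,
    }

def invoke_by(field, value):
    # Dispatch by whichever route the user took (menu/shortcut/toolbar).
    for entry in commands.values():
        if entry[field] == value:
            return entry["func"]()
    raise KeyError(value)

def save_document():
    return "saved"

register("save", save_document,
         menu="File > Save", shortcut="Ctrl+S", toolbar="save-icon")

# Three entry points, one implementation:
print(invoke_by("menu", "File > Save"))
print(invoke_by("shortcut", "Ctrl+S"))
print(invoke_by("toolbar", "save-icon"))
```

The design point is that the three routes are pure routing; the function itself exists exactly once, so adding a fourth route costs nothing and changes no behavior.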
It's definitely a give-and-take method of implementing magic.
Granted, in the auto industry there are focus groups and Q&As, driver feedback, warranty and quality metrics, blah, blah, blah... but in the end, all of that goes to enhancing the 'bells and whistles', because we still use three pedals and a wheel.
When Neumann's group worked with NASA on software for the
If only the adhesives group was as meticulous.
I'll argue that the failure to reuse strong code is the root of many of our issues today. This lack of reuse, brought on by an industry climate rife with IP mongers (lawyers) looking for the faintest breath of copyright infringement, has kept the idea of a code or best-practices data store out of the mainstream, thus creating a continuous (and sometimes erroneous) re-solving of similar problems throughout the industry.
Look at a strong community-of-practice site like perlmonks.org. With sections like 'Snippets', 'Catacombs' and 'Tutorials' (not to mention the primary function of asking a Perl question and getting many instructional Perl answers), you begin to see good code packages being designed, broken and reborn as functional units, which are then often promoted (in whole or in part) to package status on CPAN or similar archives.
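A tiny illustration of the re-solving trap (using Python's standard library rather than Perl, purely for the sake of a self-contained example): hand-rolled parsing of a CSV line versus reusing a parser that has already absorbed years of other people's edge cases.

```python
# Re-solving a "simple" solved problem: splitting a CSV line.
import csv
import io

line = 'Jones, "Smith, Anna", 42'

# Hand-rolled: breaks on the quoted comma -- four fields, one mangled.
naive = [f.strip() for f in line.split(",")]

# Reused: the battle-tested stdlib parser handles the quoting.
parsed = next(csv.reader(io.StringIO(line), skipinitialspace=True))

print(naive)   # the quoted name is split in two
print(parsed)  # ['Jones', 'Smith, Anna', '42']
```

The naive version works on every test its author happens to think of; the reused one has been "broken and reborn" by thousands of users who already hit the quoted-comma case, the embedded-newline case, and all the rest.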
This type of practice GREATLY increases the robustness and functionality of code. By vastly extending the number of people who try to implement a code base in different ways throughout its development life cycle (which is testing, in effect), you start to ferret out problems that the designers may have overlooked. This peeling away of layers happens at a much faster rate; while slower than "throw a bunch of coders at (a) the release and (b) bug management", it is faster than three lines a day (most times). It also creates a stronger planning cycle. I don't know how many projects I have been saddled with where planning was treated as a secondary need rather than a primary focus. The attitude is usually, "If I'm not in my seat coding, I'm not being productive." How do you explain that two extra weeks of planning now could save us four weeks of change and bug chasing on the back end?
To someone who only wants to see results (and I guess even negative results are results), you can't.
"Yet computer code could be a lot more reliable - if only the industry were more willing to make it so, experts say. And many believe it would help if software makers were held accountable for sloppy programming."
I'd like to fine-tune that a bit and say:
"Yet computer code would be more reliable if only the consumers were more willing to let it be so, and it would help if software coders were allowed to do anything beyond sloppy programming."
It's change and expectation management, a lack of proper development methodologies and of good managers to help implement them, a product complexity far beyond anything the consumer has ever before consumed, running on platforms that change faster than code can be written to take advantage of them, etc., etc., etc.
"If software makers haven't done the best job, consumers are hardly blameless. We have long favored flashy products over reliable ones.
As a developer, I'd personally like to point the finger at the consumer, but I am not only at their mercy, I am one of them as well, so I cannot. It's the Ouroboros, a snake eating its own tail. We, as a consuming public, will not slow down, so we, as a developer cadre, must do our best to stay in step with the speed and breadth of advancement. And in most instances, quality will suffer.
Kaner has the right of it, as a first step, I think. We should open our doors and reveal our faults. As an open source advocate, obviously I feel that we should give access to the code as well, but not everyone need agree with that for this first step. Just agree that we should TELL the consumers when there is a potential for their gas tank to explode in a rear-end collision, or when they risk revealing sensitive corporate information if a certain combination of products is used.
What do you think? Obviously there need to be checks and balances in place to protect the integrity of data and the safety of people who rely on software, but how do we, as developers, help to find and implement them?
I think it is time for us to start insisting on QA and development life cycles that mean something. I'm NOT suggesting some kind of QS9001 for code, heavens no, but can we begin to raise the integrity of what we do by driving the quality of our product rather than the fastest time to market? Can we begin to help the consuming public with expectation management and change management?