OT: Software & Liability
by cjf (Parson) on May 20, 2002 at 17:14 UTC
Today, computer security is at a crossroads. It's failing, regularly, and with increasingly serious results. I believe it will improve eventually.
The question is: would holding companies liable for faulty software increase the quality of that software, and if so, would this gain outweigh the risk of reduced innovation in the industry? Consider the following points:
The current lack of incentives to develop quality software
In Phantom Menace: Driving forces behind sloppy code/architecture. vladb posted the following quote from The Big Ball of Mud:
Indeed, one of the reasons that architecture is neglected is that much of it is "under the hood", where nobody can see it. If the system works, and it can be shipped, who cares what it looks like on the inside?
"Works" is a term open to interpretation, but I'm sure the definition would change rather quickly if companies could be found liable for faulty software. Currently there is very little incentive for software companies to do extensive testing on their products. It is far more profitable for them to add new features and get the product to market quickly. As Schneier notes:
The costs of adding good security are significant -- large expenses, reduced functionality, delayed product releases, annoyed users -- while the costs of ignoring security are minor: occasional bad press, and maybe some users switching to competitors' products.
If you add thousands of multi-million dollar lawsuits into the mix, the situation would change rapidly. Would this make the software business far less profitable and slow the rate of progress? Would small software companies be quickly bankrupted by a few lawsuits? Are these necessarily bad things?
The situation in other industries
In Save the Net, Sue a Software Maker David Banisar asks:
Why is software, which is now essential for everyday living, not held to the same standard as cars and children's toys?
He continues with the auto industry comparison saying the current state "is like Ford designing a car that a twelve-year-old can cause to crash by remote control from his garage using paper clips and an old AM radio." Banisar then asks us to imagine having to install our own airbags and seatbelts in our cars. Statistics he quotes also show a correlation between liability and increased safety.
Are these analogies applicable to the software industry? Or is software development so fundamentally different that these do not apply?
Is government intervention necessary?
Very few people enjoy waiting extended periods of time for a patch once a vulnerability is discovered in a product they use, and working around non-security-related bugs isn't usually much more fun. So the market should push vendors toward the right balance of features and bugs on its own, right? One need look no further than the products of certain large software companies to realize this isn't always the case.
Enter the insurance industry. As Schneier notes, insurers have the potential for enormous influence by creating incentives for companies to use more secure products. If a company could reduce its insurance premiums by 20% by running Apache instead of IIS, it would most likely consider the switch. This shift toward higher-security products would also be felt by software manufacturers, through lower sales of products the insurance industry deems less secure.
The effect on free software
NewsForge recently carried a story entitled Software lemon law with bitter taste. In it, the author points out that under the proposed "lemon laws," Open Source projects would face the risk of liability suits. This would hardly give developers much incentive to contribute to Open Source projects. In addition, such laws could be abused by companies with deep pockets to eliminate Open Source projects that cut into their sales.
So, should software manufacturers be found liable for faulty software? And if so, how should the laws be implemented to ensure they don't do more harm than good?