http://www.perlmonks.org?node_id=571830


in reply to Re^2: Projects where people can die
in thread Projects where people can die

Are you going to guarantee the absence of the effects of cosmic rays / radiation / jam on your processor?

Well, there are such things as space-rated CPUs, and in any environment/application where radiation is a hazard, they would be used along with secondary protection (lead or gold shielding). But hardware has moving parts; it is subject to wear, tear, and tolerances. Hardware fails. Disks fail. Even high-quality, brand-new disks fail. Of course, you can run extensive tests to reduce the likelihood of some failure modes, but in doing so you run the risk of increasing the likelihood of others, through that same wear and tear.

In any case, there are no guarantees.

It's all about likelihood, and the most vulnerable component in most computer systems is the hard disk. That's why solid-state secondary storage is such a holy grail. Removing that from the equation just makes sense.

With no guarantees, it's all about minimising risk. And that's about spending your money to achieve the biggest bang for your buck. Of the millions of computer users around the world, it's probable that 5 or 10% have experienced some form of disk failure. I have.

How many have experienced CPU failure--of any kind? Of those that have, how many could be attributed to some form of radiation degradation of the CPU (or memory)? That is much harder to assess, since without extreme analysis there is simply no way to know.

The point is that it is possible to test Perl code as thoroughly as any other code, but the additional step of repeated runtime compilation is one further possibility of failure. For non-life-critical systems, the additional risk is (in most cases) not worth the cost of elimination. But for life-critical systems, it is not worth the risk not to.
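As a partial mitigation, you can at least move the compile step ahead of deployment. A minimal sketch, assuming a staging step that refuses to install a script which does not compile (the wrapper itself is hypothetical; perl -c is real):

    #!/usr/bin/perl
    # Hypothetical pre-deployment gate: verify the target script compiles
    # cleanly before it is installed, so compile-phase failures surface
    # ahead of time rather than at runtime.
    use strict;
    use warnings;

    my $script = shift @ARGV or die "usage: $0 <script.pl>\n";

    # perl -c compiles the script (running its BEGIN blocks) without
    # executing the main body.
    my $rc = system( $^X, '-c', $script );
    die "$script failed its compile check; refusing to deploy\n" if $rc != 0;

    print "$script compiles cleanly\n";

This only narrows the window, of course; the deployed interpreter still recompiles the sources on every run.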

The safer way to provide security is to have multiple redundant, different (many people miss this distinction) systems checking each others' results.

I'm cognisant of the technique.

Applied to a Perl program, this would entail producing a completely separate implementation of perl. Since there are no specs--the existing sources are the spec--there is nothing against which to build such a system, let alone verify it.
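For illustration, the voting idea itself is simple enough; it is producing the genuinely independent channels that is hard. A minimal sketch, assuming three independently built implementations exposed as commands (all names here are hypothetical):

    #!/usr/bin/perl
    # 2-out-of-3 voter sketch: accept a result only when at least two
    # independently produced implementations agree on it.
    use strict;
    use warnings;

    my @channels = ( './impl_a', './impl_b', './impl_c' );    # hypothetical
    my %votes;

    for my $cmd (@channels) {
        chomp( my $out = qx{$cmd} );    # capture each channel's answer
        $votes{$out}++;
    }

    # Take the answer with the most votes; demand a majority.
    my ($winner) = sort { $votes{$b} <=> $votes{$a} } keys %votes;
    if ( $votes{$winner} >= 2 ) {
        print "agreed result: $winner\n";
    }
    else {
        die "no majority -- channels disagree, failing safe\n";
    }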

I have a memory of reading an article--possibly related to the fly-by-wire systems on Airbus aircraft--that suggested that using a single set of sources, compiled by different compilers and targeted at different CPUs, was better than producing two sets of sources in different languages. I can't find references. From memory, the rationale went that starting with a single set of sources reduced the complexity, by removing the need to try to prove that two language implementations were equivalent. That somewhat unintuitive conclusion actually makes economic sense. Every reduction in complexity comes with an increase in the tractability of proof. Maybe :)
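The cross-check under that scheme is correspondingly simpler: the same sources built by diverse toolchains, run on the same input, with any divergence treated as a fault. A hedged sketch (the binary names are hypothetical):

    #!/usr/bin/perl
    # Single-source, diverse-toolchain cross-check: one program, built
    # with two different compilers for two different CPUs, run on the
    # same input; disagreement between the builds is treated as a fault.
    use strict;
    use warnings;

    my @builds = ( './calc.gcc-arm', './calc.clang-x86' );    # hypothetical
    my $input  = $ARGV[0] // die "usage: $0 <input>\n";

    my @results = map { chomp( my $r = qx{$_ $input} ); $r } @builds;

    die "builds disagree (@results) -- failing safe\n"
        if $results[0] ne $results[1];

    print "cross-checked result: $results[0]\n";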


Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.