Fun indeed! Interesting philosophy. "Undefined but consistent" is oxymoronic: if it's consistent (not just constrained), then it's defined. The subtlety is that we _really_ mean not defined across platforms and implementations. The platform, the processor architecture, the C compiler used, and a host of other things add variables (that may as well be random) into the equation that determines the statement's behaviour.
While this is a semantic argument, it is not as straightforward as you might think. The term 'undefined' in comp-sci lingo comes from what electrical engineers designing digital circuits call 'undefined': the so-called "don't cares" in a truth table. It's not that these "don't cares" will not take on a value if the circuit happens to somehow land in that state -- it's just that the designer doesn't care, and will not guarantee the value of that state at all.
So when you're reading about a language and it says "undefined", that is 'undefined' with a capital 'U' and can be read as "not guaranteed", as in "this is part of the contract of the API".
The designers of said language would probably slap you upside the head if you said "but it's consistent, therefore it's defined", mostly because in their lingo the indeterminate variables that you yourself mention are all taken seriously precisely by being gathered under the umbrella of a strict definition such as "undefined". And you are correct that, at the hardware level, the influences can be random. These are generally avoided via the use of guard circuits -- see here for a discussion regarding guard clauses.
In reply to Re^2: Quantum Weirdness and the Increment Operator