tachyon wrote, "What I don't really understand is why you could not make time_t an unsigned 4 byte int and thus get another 68 odd years out of it, leaving aside the problems with C code that expects it to be signed. Why was it signed in the first place anyway?"
How would you denote my birthdate in 1967 if you used an unsigned integer counting forward from the 1970 epoch? Times before the epoch need negative values, and that is exactly why time_t is signed.
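To make the pre-epoch point concrete, here's a short Java sketch (the date used is an arbitrary moment in 1967, not anyone's actual birthdate). It also shows what happens if you reinterpret that same 32-bit pattern as unsigned: the date silently jumps more than a century into the future.

```java
import java.time.Instant;

public class PreEpoch {
    public static void main(String[] args) {
        // Dates before 1970 require negative offsets, hence a signed type.
        // -94,694,400 seconds before the epoch is 1967-01-01T00:00:00Z.
        Instant t = Instant.ofEpochSecond(-94_694_400L);
        System.out.println(t);  // 1967-01-01T00:00:00Z

        // Reinterpret the same 32-bit pattern as unsigned: the value becomes
        // 4,200,272,896 seconds AFTER the epoch, i.e. some day in the 2100s.
        long asUnsigned = Integer.toUnsignedLong((int) -94_694_400L);
        System.out.println(Instant.ofEpochSecond(asUnsigned));
    }
}
```

So an unsigned time_t doesn't just buy 68 extra years; it makes every pre-1970 timestamp unrepresentable.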
Java standardized on a 64-bit signed time in milliseconds from the same 1970 epoch, which gives it a range of roughly 292 million years in either direction, stays coherent with code that thinks 1970 was a pretty cool year, and is still a thousand times more precise than standard unix time. I don't expect time() to follow that standard, but I do think it's safe to say, even considering the usual famous misquote, that 64 bits should be enough for anyone.
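The back-of-the-envelope arithmetic for that range is simple enough to show directly; this is just Long.MAX_VALUE divided by the number of milliseconds in a year:

```java
public class EpochRange {
    public static void main(String[] args) {
        // Milliseconds in a 365-day year (close enough for an estimate).
        long msPerYear = 1000L * 60 * 60 * 24 * 365;

        // How many years forward (or, symmetrically, backward) from 1970
        // a signed 64-bit millisecond counter can reach.
        long years = Long.MAX_VALUE / msPerYear;
        System.out.println("~" + years / 1_000_000 + " million years each way");

        // Compare: a signed 32-bit time_t in SECONDS overflows in 2038.
        System.out.println(new java.util.Date(Integer.MAX_VALUE * 1000L));
    }
}
```

The second println prints the familiar Y2038 rollover moment, January 19, 2038, which is the whole problem being discussed here.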
e d @ h a l l e y . c c