Why would it be? Do software and hardware designs go bad after a while?
If you're worried about the hardware itself going bad, that can happen to new hardware just as easily — maybe even more easily, since modern hardware is far more complex. Redundancy and fault tolerance are the key factors no matter which hardware you use.
"Fixing" it would involve rewriting entire bookshelves of assembler code (since assembler is machine-specific). That includes completely redesigning the OS to take advantage of the extra memory and disk space a new machine would provide, and then making deep structural changes to all the applications so they can run on that new OS.