|Perl: the Markov chain saw|
This looks like a reaction, in part, to Aristotle's comments, which were a response to what I wrote in 213131.
I think you are mostly correct. I use a JS front end and a Perl back end, and it's great. As you stated in your reply to that post, letting JS do the user-friendly form validation is cool.
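A minimal sketch of what that client-side validation might look like, assuming a hypothetical search form with a "year" field (the form id and field name are invented for illustration):

```javascript
// Hypothetical client-side check for a 4-digit "year" field.
function validateYear(value) {
  // Accept only exactly four digits; anything else gets a friendly prompt.
  return /^\d{4}$/.test(value.trim());
}

// Attach to a form so bad input is caught before a server round-trip.
// (The server must still re-check: JS may be disabled or absent.)
if (typeof document !== 'undefined') {
  const form = document.querySelector('#search-form');
  if (form) {
    form.addEventListener('submit', (e) => {
      const year = form.elements.year.value;
      if (!validateYear(year)) {
        e.preventDefault();
        alert('Please enter a 4-digit year.');
      }
    });
  }
}
```

The point is purely cosmetic convenience: the user gets instant feedback instead of a page reload, but nothing here may be relied on for correctness.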
But the important thing is that you have to degrade gracefully. What this means is that you must make absolutely no assumptions about your user's environment and still be able to present something legible and useful. Furthermore, if the user turns out to be a robot, you must accept bogus input without barfing. By all means print a cryptic error message (such as: parameter "year" missing or invalid) for complete garbage, but make sure that if the field is reasonably valid (in the example above, that the parameter year is a 4-digit number) you just accept the thing and return the null result (a fancy web search results page, with lots of cruft, stating: 0 documents were found published in the year 1234).
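That server-side policy can be sketched roughly as follows. The poster's back end is Perl; this sketch uses JavaScript for consistency with the front-end example, and the handler, parameter names, and lookup function are all invented for illustration:

```javascript
// Hypothetical server-side handler implementing the policy above:
// reject only outright garbage; accept well-formed-but-pointless input
// and return the null result.
function handleSearch(params) {
  const year = (params.year || '').trim();

  // Complete garbage: terse, cryptic-but-accurate error.
  if (!/^\d{4}$/.test(year)) {
    return { status: 400, body: 'parameter "year" missing or invalid' };
  }

  // Reasonably valid (4 digits), even if absurd like 1234: accept it
  // and report zero hits rather than barfing on a robot's input.
  const docs = searchByYear(year); // stand-in for the real query
  return {
    status: 200,
    body: `${docs.length} documents were found published in the year ${year}`,
  };
}

// Stand-in for a real database lookup; always empty in this sketch.
function searchByYear(year) {
  return [];
}
```

Note the asymmetry: the regex duplicates the client-side check, because the server cannot trust that the client-side check ever ran.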
It is, however, IMO, perfectly reasonable to state that if they used a newer browser they'd see something that looked considerably nicer. But unless you have 100% control of your clients (and you don't, no one does*), you must not require that they have certain features! What you should do is design for a good experience on some reasonable lowest spec, probably IE 5.0, an adequate experience on something worse, and a much nicer one on something better, such as Mozilla :)
Enter any 47-digit prime number to continue.