Here at the Blue Cowdawg Perl Coding Ranch not using strict and warnings is considered a hanging offense.
I cannot think of any reason not to use them, even in a production environment. When Operations calls me at oh-dark-hundred to tell me that something isn't working, the first thing I'm going to do is check the logs to see whether the script has puked up anything. While strict and warnings are particularly useful for catching compile-time issues, warnings in particular can catch other problems too. Such as? Let me paint a vignette for you:
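Before the vignette, here's a minimal sketch of the compile-time point (the variable names are mine, not from any production script). The buggy code is string-eval'd only so the demo can survive to print the error; in a real script, strict kills it before it ever runs:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# String-eval the buggy code so we can show the compile-time failure
# without killing this demo script. The eval'd code inherits strict
# from this lexical scope.
my $ok = eval q{
    my $report_count = 10;
    print "Generating $report_cuont reports\n";   # typo'd variable
    1;
};
print "strict caught it: $@" unless $ok;
```

Without strict, `$report_cuont` would silently spring into existence as an empty global and the script would happily generate zero reports.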
Global Operations: The FTP push for Acme Plumbing and Booby Traps LLC has stopped working. The customer is complaining that he hasn't seen a report in ten hours.
While that's not the best example of warnings specifically, the problem was located as soon as the logs were checked: the opendir call had failed, and the failure was noted there. (I don't use die in my production code; I have my own function that does the same thing, but logs to a file instead of stdout/stderr.)
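A rough sketch of that kind of die() replacement might look like the following. This is my guess at the shape of such a helper, not Peter's actual function; the log path and directory are made up:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical stand-in for a logging die() replacement: same fatal
# behavior, but the message lands in a log file where Operations can
# find it instead of vanishing down stdout/stderr.
sub log_and_die {
    my ($msg) = @_;
    my $logfile = '/tmp/myapp_fatal.log';   # log path is made up
    if (open my $log, '>>', $logfile) {
        print {$log} scalar(localtime) . " FATAL: $msg\n";
        close $log;
    }
    die "$msg\n";
}

# Usage, as in the FTP-push story. Using '.' here so the sketch runs;
# in production this would be the customer's outbound directory.
my $dir = '.';
opendir my $dh, $dir
    or log_and_die("opendir $dir failed: $!");
closedir $dh;
```

The point is that the failure is both fatal and findable: the script stops, and the reason is sitting in a file with a timestamp when the oh-dark-hundred call comes in.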
A better example of how warnings helped came from a CGI that had multiple hands involved. A module my code depended on was modified by its author: the return values from that module's methods (subs) changed, and the author neglected to warn anybody. My code (via warnings) started complaining about uninitialized values, which were duly logged in the Apache logs.
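A small sketch of that failure mode (the module and method names here are invented for illustration). In a CGI, STDERR already ends up in the Apache error log; a `__WARN__` handler just makes the entries easier to grep:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Timestamp every warning on its way to STDERR (and hence, under
# Apache, the error log).
$SIG{__WARN__} = sub {
    my ($msg) = @_;
    print STDERR scalar(localtime) . " WARN: $msg";
};

# Hypothetical stand-in for the upstream module whose method used to
# return a string but now quietly returns undef.
sub get_report_name { return undef }

my $name = get_report_name();
# Interpolating undef trips "Use of uninitialized value", which the
# handler stamps and sends to the log instead of passing silently.
print "Report: $name\n";
```

Without warnings, the undef would have interpolated as an empty string and the breakage would have surfaced much later, far from its cause.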
Peter L. Berghold -- Unix Professional
Peter -at- Berghold -dot- Net; AOL IM redcowdawg Yahoo IM: blue_cowdawg