|laziness, impatience, and hubris|
On anything other than one-liners, I never turn off strictures.
I can count the number of times I have had to turn off strictures for production code in the past seventeen years on the thumbs of one hand. (An ugly piece of CGI code where we couldn't trust the database to reliably hand us back a ' ' when it encountered a NULL in a table.)
On the other hand, I have lost count of the times I have been pulled into a bug-hunt ("I only changed these two lines and now the code doesn't work. We can't put this back into production without this bug-fix, and it's gotta be in place before we run the trial-balance at 0030!") only to look at the top of the code and say "put in use strict; use warnings; right here and let's see what falls out." Followed by the discovery of the typo in the fix. You have to love Corporate 'Coding' Standards that require all variable names to be at least eleven characters long and less than 255, CamelCased (with exceptions for certain 'common' words), and inflected....
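A minimal sketch of the kind of bug that falls out (the variable names here are made up, in the spirit of those Coding Standards; the actual production code is not shown):

```perl
#!/usr/bin/perl
# Without strict, a typo silently autovivifies a new global:
#
#   my $TrialBalanceTotal = 100;
#   print $TrailBalanceTotal;   # prints nothing -- undef, and no complaint
#
# With strict, the same typo is fatal before the program ever runs:
use strict;
use warnings;

my $TrialBalanceTotal = 100;
print $TrailBalanceTotal;   # Global symbol "$TrailBalanceTotal" requires
                            # explicit package name -- caught at compile time
```

That compile-time error is exactly the "what falls out" above: the fix's typo surfaces immediately instead of at 0030.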
If I remember correctly, strict only operates during Perl's compile phase, so you can't take a run-time performance hit from it; warnings has both compile-time and run-time components. But the few times that I have benchmarked code with and without them, the extra run time was well below the noise threshold.
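If you want to measure it yourself, the core Benchmark module makes the comparison easy. A rough sketch (the busy-work loop is just a stand-in for real code; warnings is lexical, so toggling it per-sub is legitimate):

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# Run each sub for ~2 CPU-seconds and print a comparison table.
cmpthese(-2, {
    warnings_on  => sub { use warnings; my $n = 0; $n += $_ for 1 .. 1_000; $n },
    warnings_off => sub { no warnings;  my $n = 0; $n += $_ for 1 .. 1_000; $n },
});
```

In my experience the two rows come out within the run-to-run jitter of each other, which is what I mean by "below the noise threshold."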
We had this do-everything-in-a-single-wait/process/wait-loop program that had grown by accretion over five years, and the performance had become unacceptable in a Web server environment. And no, there wasn't time or budget for more than a couple of weeks to work on it; a rewrite was right out. We got the biggest bang for the buck by unrolling a pair of for loops and implementing a dispatch table in place of a cascade of if/then/elsif statements. All other sources of 'extra' CPU were swamped by the 7% performance boost we got from the Duff's-Device-style unrolling and the 11% boost from the dispatch table. (Y(M|K)MV)
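The dispatch-table transformation looks roughly like this (the command names and handlers here are hypothetical; the real program's cases are not shown):

```perl
use strict;
use warnings;

# Before: a cascade that re-tests from the top on every miss --
#   if    ($cmd eq 'fetch') { fetch(@args) }
#   elsif ($cmd eq 'store') { store(@args) }
#   elsif ...
#
# After: one hash lookup, constant cost no matter how many cases.
my %dispatch = (
    fetch => \&fetch,
    store => \&store,
    purge => \&purge,
);

sub handle {
    my ($cmd, @args) = @_;
    my $handler = $dispatch{$cmd}
        or die "unknown command: $cmd\n";
    return $handler->(@args);
}

sub fetch { "fetched @_" }
sub store { "stored @_"  }
sub purge { "purged @_"  }

print handle( fetch => 'row42' ), "\n";
```

The win grows with the number of branches: a long if/elsif chain averages N/2 string compares per request, while the hash lookup stays flat.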