Okay, I have to ask this. What on Earth is so great about strong typing that I have to have it? Is $x a float? Is it a double? Who the heck cares? (internally in Perl, if it's a number and it's not an integer, then it's a double). Sure, I don't want to stuff a float into an int and lose precision, but in Perl, you'll just automatically get the right type of number when you're done.
The 'C' language has weak typing. Sure, I can declare int x;, but little will stop me from cramming a double in there by way of a cast or a careless pointer, aside from the possible segfault or my wanting to know why four plus seven equals two hundred and thirteen. All typing in C really does is ensure that you've allocated enough memory (though a good compiler will let you know if you've screwed up, provided you turn warnings on).
Java, on the other hand, has strong typing. I can assign a float to a double and Java has no problem with that. If, however, I assign a double to a float, Java will fail at compile time unless I explicitly downcast the double to a float, thereby letting Java know that I really wasn't too concerned about precision in that assignment. But if you had a type system that shielded you from worrying about the exact details, would this level of nit-pickiness be warranted?
What about Perl? Perl has the catch-all scalar. What's wrong with that? In my opinion, nothing, so long as you realize that there is a trade-off and you accept it. That scalar's value can be a string, an integer, or a double (yes, there's more, but I'm keeping this simple). Do you really care which? Sure, you need to know if your "value" of PI turns out to be an integer, but does it matter what you have with the following?
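Something along these lines (a sketch; any simple counting loop makes the point):

```perl
use strict;
use warnings;

my $count = 0;
for ( my $i = 0; $i < 10; $i++ ) {
    $count++;    # stand-in for the "particular activity" we want done 10 times
}
print "ran $count times\n";
```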
Yes, that's a C-style for loop and many people consider them heresy in Perl, but I used a familiar example in case someone wants to show this to someone unfamiliar with Perl.
In the above, admittedly trivial, example, if the purpose of the loop is to ensure that a particular activity occurred exactly 10 times, you probably wouldn't care what type $i is. Well, you would if you used $i != 10 for the exit condition, but you wouldn't do that anyway, right? But actually, what if you needed to? In Perl, a number is an integer if being an integer is enough. Once you add that .01 to the variable, it internally converts to a double and isn't going back. In fact, you wouldn't want it to go back in most instances. Perl knows what you wanted and handles it for you.
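To make that conversion concrete (a minimal sketch):

```perl
use strict;
use warnings;

my $n = 10;           # internally an integer (IV)
$n += 0.01;           # internally now a double (NV), and it stays one
printf "%.2f\n", $n;  # prints 10.01
```

If you want to watch the flip happen, the core Devel::Peek module's Dump() will show the scalar's integer flags give way to floating-point ones after the addition.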
The interesting thing about Perl, though, is that its "primitive typing mechanism" is actually very sophisticated. It only seems primitive on the surface. If you can stomach it -- and be warned, it's a rough ride -- read PerlGuts Illustrated. One of the quotes in there that really struck me was "the internal relationship between the Perl data types is really object oriented ... Perl uses multiple inheritance with [scalars] acting as some kind of virtual base class."
Whoa! What's up with that? And what does it mean to me? Well, what are the benefits of object-oriented programming? We get a clean, easy-to-use API. We get information hiding -- when was the last time you worried about whether or not the scalar had the ROK flag set properly? My favorite, though, is polymorphism. We call the "methods" on our "objects" and we don't worry about which method gets called. For example, in real life, what happens if you add 10 apples and 20 apples? You get 29 apples (I ate one). You don't get compiler warnings or segfaults. You just get a short exercise in how hungry Ovid is. However, try to add 10 and 20 apples with Perl:
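A quick sketch of that experiment:

```perl
use strict;
use warnings;   # warns: Argument "10 apples" isn't numeric in addition ...

my $sum = "10 apples" + "20 apples";
print "$sum apples\n";   # prints "30 apples"
```

Perl grabs the leading numeric part of each string, adds the numbers, and (with warnings on) politely mentions the leftover "apples".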
Wow. Clearly, the plus operator has some deep magic assigned to it, but once you realize that you're basically making a method call and you don't care what type (hah!) of object you have, then you're doing okay.
Frankly, the more I think about this, the more it seems the primary benefits of strong typing are enabling compiler optimizations -- and if you needed lightning-fast speed you wouldn't be using Perl anyway -- and catching stupid programmer mistakes, like trying to add a string to a double. We lose the "catch stupid programmer mistakes" part with Perl, but by not burdening us with the "benefits" of a strong type system, Perl lets us develop so much faster that we find and fix our stupid programmer mistakes before others have even written theirs.
I have never missed having strong typing in Perl (though it will be a nice option in Perl 6), but then the sort of applications I work on generally don't require it. I'm wondering if any monks can share problems they have had because of Perl's type system. A counter-point would be nice to hear.
Join the Perlmonks Setiathome Group or just click on the link and check out our stats.