That's untrue. This is just an often-perpetuated myth.
I've got lots and lots of perls:
miniperl, perl 5.000, perl 5.6, perl 5.8,
and those families of perl differ in far more significant ways than C implementations from different vendors.
And I've encountered features, described in the Camel book(s), which just don't work in some (or most, or modern) versions of perl. For example, the
while ($sth =~ /sth(sth)/g)
{}
shortcut used to work in 5.000 but stopped working in 5.6 because of a broken
optimisation. I reported the bug, but apparently it was considered a buglet and left as is.
This forced me (and, for example, the Apache::ASP project) to use an inefficient replacement instead:
$mycopyofstring = $sth;
while ($mycopyofstring =~ s/sth(sth)//) {
}
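To make the difference concrete, here is a minimal sketch of the two idioms side by side, on a hypothetical string and pattern (the variable names and data are mine, not from the original post). The //g loop walks the string non-destructively via pos(), while the workaround copies the string and chews matches off the copy with s///:

```perl
use strict;
use warnings;

# Hypothetical input: three occurrences of "ab" followed by a digit.
my $sth = "ab1 ab2 ab3";

# Idiom 1: the //g-in-while shortcut. In scalar context, each
# iteration resumes matching where the previous match ended
# (tracked by pos($sth)), so the loop visits every occurrence.
my @via_match;
while ($sth =~ /ab(\d)/g) {
    push @via_match, $1;
}

# Idiom 2: the destructive workaround. Copy the string first,
# then delete each match with s/// so the loop terminates;
# the original $sth is left intact.
my $copy = $sth;
my @via_subst;
while ($copy =~ s/ab(\d)//) {
    push @via_subst, $1;
}
```

On a perl where both idioms work, @via_match and @via_subst end up identical, but the second version pays for a full string copy and repeated in-place edits.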
The point here is that important features of the language appear and disappear at will, so you've got to stay in the loop all the time, keeping an eye on changes and hoping that what you've learned won't be thrown out of the window next summer.
The same goes for your argument about vendors forcing stabilisation: the Debian folks loved perl and built most of their tools in it. This backfired badly when some crucial parts of perl changed, forcing a rewrite of large parts of those tools (and a quite complicated upgrade path: keep both perls on the system, and for every package, download the new package, remove the old package, run its postrm script with the old perl, then run the postinst of the new package with the new perl).
Also, please notice that there is a very strong movement
to remove perl from base systems: OpenBSD, NetBSD, FreeBSD,
and many Linux distributions have already rewritten their perl scripts in sh/ksh, are talking about doing so, or are actively rewriting their perl stuff.
If perl provided this really basic and stable set of features, nobody would be moving away from it.