Please send the following response back.
The problem with most of those arguments is not what they address, but what they fail to address.
I would say the same about your response. For instance, a key theme in the responses, one which you just ignored, is that Perl allows programmers to be more productive. There is a reason for that. Java decided that it is more important to allow 10x the number of programmers to work together. Perl tries to make programmers 10x as productive. The Java approach is great if you want to sell lots of seats, or if you want to build a great empire under you. The Perl approach is great if you need to actually get stuff done.
- Java is faster than Perl in a multi user environment.
Not according to my benchmarks. YMMV. And certainly will if you compare a particularly slow Perl environment (e.g. CGI or AxKit) versus a good JSP platform (like Resin). Ditto in reverse if you compare Tomcat version 3 to real mod_perl handlers.
Otherwise they are reasonably close. Which brings it down to the programmers. For instance if your Java programmers need to build up a string incrementally and make the mistake of using String and concatenation, Java will run like a dog. (Instead they should use StringBuffer and .append() to it.) Now theoretically the same mistakes can be made in Perl. But from what I have seen, people don't seem to do it nearly as often. Why not? Because the default way of doing it in Perl usually is pretty efficient, Perl has fewer APIs to wade through so you have less to keep in mind about potential gotchas, and you write less Perl code so you have less room to accidentally slip up.
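To make the mistake concrete, here is a generic sketch (not code from any particular project) of the two ways to build a string incrementally in Java:

```java
public class ConcatDemo {
    public static void main(String[] args) {
        // Slow: each += allocates a brand-new String and copies everything
        // accumulated so far, giving O(n^2) behavior overall.
        String slow = "";
        for (int i = 0; i < 1000; i++) {
            slow += "x";
        }

        // Fast: StringBuffer appends into a growable buffer in place,
        // so the whole loop is amortized O(n).
        StringBuffer sb = new StringBuffer();
        for (int i = 0; i < 1000; i++) {
            sb.append("x");
        }
        String fast = sb.toString();

        System.out.println(slow.equals(fast)); // prints "true"
    }
}
```

Both produce the same string; only the second does so without quadratic copying.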
- Java is less resource intensive than Perl in a multi user environment
I could respond to this as I did to the previous one, but I won't. Instead I will introduce you to my friend Moore, and suggest that the salary of a programmer you no longer need due to improved productivity can pay for a lot of upgrades.
Besides, why the fixation on resources in a multi-user environment? Most computers out there seem to be massively overpowered desktops with one user. Most servers seem to be department servers with massively greater capacity than they need. If I go to a friend who routinely throws around terabytes of data, what does he do? Oh right. He buys dedicated boxes.
Massively multi-user servers seem to be a pretty specialized market. So why fixate on it?
- Java threads like nobody’s business… which makes it ideal for enterprise system programming (Utilizing the capabilities of a big MP box doesn’t require special code.)
And heavy threading results in lots of potential for race conditions. Which Java admittedly mostly, but not completely, managed to avoid for years by only offering blocking APIs so that people would be encouraged into a model of always spawning a thread for I/O.
Of course eventually that overhead became bothersome enough that the maze of "almost the same but slightly different" APIs was expanded with another (incompatible with the old, of course) API for non-blocking I/O. Which finally allows Java to get more scalable I/O, but also leaves it with the worst of both worlds.
Incidentally a historical note that you probably don't have the perspective to appreciate. When Sun originally came out with Java and trumpeted it as cross-platform, the leading Unix platform was HP-UX. HP-UX only had kernel threads, and started to choke at around 50-60 of them. By contrast Solaris had very good multi-threading support, but since the standard Unix model was fork/exec, nobody used it. Strangely not long after Java came out hyped to the gills, people began writing cross-platform benchmarks in Java. Guess who lost those badly?
Coincidence? You decide.
Pervasive multi-threading is not always good just because Sun tells you that it is. Nor is it necessarily true that you want a big MP box. Spec out a 4-CPU box versus a cluster of eight 1-CPU boxes some time. Which is the better price/performance point? If your problem is at all parallelizable, the latter...
- Java’s code re-use abilities are infinitely better than Perl
Then why is CPAN the largest freely available collection of reusable libraries? And in my experience the quality is much better than that of most proprietary libraries. (For one thing there is not the proliferation of half-thought-through but broken APIs that you see from some other places that I might mention.)
- Java is a pure OO language (even more so than C++)… which means you have inheritance, encapsulation, and polymorphism.
You may be pardoned for not knowing better if the only thing that you can compare Java to is C++. But Java is not pure OO. The commonly given example is that integers are not real objects. Since that one has been beaten to death, with Java proponents pleading for the special exceptions built into the language, I will give a less usual one. In Java you can choose to have class methods. But those methods aren't anything other than functions with a long name. You can't do anything polymorphic with them, such as get hold of an object representing the class. Hence the need for Enterprise JavaBeans to create "Home" objects for what should be built into the language.
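A minimal sketch of the limitation (class names invented for illustration). Static methods are bound to the compile-time type of the reference, so a subclass can hide, but never polymorphically override, a class method:

```java
class Base {
    static String describe() { return "Base"; }
}

class Derived extends Base {
    // This *hides* Base.describe(); it does not override it.
    static String describe() { return "Derived"; }
}

public class StaticDispatch {
    public static void main(String[] args) {
        Base b = new Derived();
        // Static calls are resolved against the declared type of the
        // reference, not the runtime class, so there is no dynamic dispatch.
        System.out.println(b.describe()); // prints "Base", not "Derived"
    }
}
```

With instance methods the same call would dispatch to the Derived version; with class methods it cannot, which is the hole the "Home" objects paper over.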
If you want to learn what a pure OO language is like, then I would suggest learning Smalltalk or Ruby. There everything (yes, including integers) is an object. Be warned that I know many people who have extensive experience in both Java and Smalltalk, but not one prefers Java to Smalltalk. Of course, if that transition would be uncomfortable for your world view, it is definitely safer to hold your hands over your ears and chant Sun's marketing slogans.
- Java’s security features are the best of any language… bar none. Each member of each class can have one of four security levels.
Complexity does not equal quality. Java's ACL model is not necessarily good just because it is there, nor is your software necessarily secure just because your language has the feature listed on a checklist. (Even if many PHBs mistakenly think otherwise - and then wonder how someone managed to nick their credit card database.)
Personally in a complex application I far prefer using a capability security model. And all that you need for that is a good object model! Better real security, less work. (To understand the fundamental problems with an ACL model like the one that Java uses I would suggest a search for the Confused Deputy problem.)
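A capability, in this sense, is just an unforgeable object reference that both designates a resource and grants the authority to use it; you limit what code can do by limiting which references you hand it. A toy sketch (all names invented):

```java
import java.util.ArrayList;
import java.util.List;

// The capability is the reference itself: whoever holds a LogReader
// can read the log, and holding only that grants nothing else.
interface LogReader {
    List<String> read();
}

final class Log {
    private final List<String> lines = new ArrayList<>();

    void append(String line) { lines.add(line); }

    // Hand out an attenuated capability: read access only, no append.
    LogReader readOnly() {
        return () -> new ArrayList<>(lines); // defensive copy
    }
}

public class CapabilityDemo {
    public static void main(String[] args) {
        Log log = new Log();
        log.append("started");

        // An auditing component receives only the reader. It cannot append,
        // and no ambient ACL check is needed: authority follows the reference.
        LogReader auditView = log.readOnly();
        System.out.println(auditView.read()); // prints "[started]"
    }
}
```

Notice that this is nothing but ordinary object-oriented design with careful reference passing, which is the point: a good object model already gives you the mechanism.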
- Java incorporates Java Swing, a highly advanced GUI toolkit.
Apparently everything that Java does has to be good in your world because it is in Java. In more realistic assessments, even friends who like Java admit that Swing is not much fun to use in practice, even when its performance and bugs aren't a problem for you.
- Java 2 Enterprise Edition has exceptionally powerful messaging and event control capabilities.
You are in a maze of twisty little APIs, all alike...
One of the worst characteristics of Java is the tendency to introduce new APIs rather than fixing the bugs in the old ones. J2EE is a continuation of this source of unproductivity.
- Networking in Java is as easy as creating a socket object and piping it through a buffered reader or writer…. That’s it.
Unless you turn out to need unbuffered access. Or non-blocking. Then you have to use a different set of unrelated APIs that use incompatible objects. And let's not get into how much code you have to write to actually do anything with the network.
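For what it's worth, even the "easy" blocking path involves a fair amount of ceremony once you actually do something. A self-contained loopback echo, illustrative only (the `roundTrip` helper is invented for this sketch):

```java
import java.io.*;
import java.net.*;

public class EchoDemo {
    // Spawn a one-shot echo server on the loopback interface, send it
    // one line, and return what comes back.
    static String roundTrip(String message) throws Exception {
        ServerSocket server = new ServerSocket(0); // ephemeral port
        Thread echo = new Thread(() -> {
            try (Socket s = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(s.getInputStream()));
                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                out.println(in.readLine()); // echo the line straight back
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        });
        echo.start();
        try (Socket client = new Socket("localhost", server.getLocalPort());
             PrintWriter out = new PrintWriter(client.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream()))) {
            out.println(message);
            return in.readLine();
        } finally {
            echo.join();
            server.close();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("hello")); // prints "hello"
    }
}
```

And note the thread: the blocking API more or less forces one per connection, which is exactly the model criticized above.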
- JDBC allows very rich RDBMS interaction…. Far richer than Perl’s SQL module…. Not to mention connection pooling.
With DBI you can do anything that you can with JDBC. (Yes, including connection pooling.) And often far more easily. Also do not forget to look at the various wrapper modules that simplify things.
- There was a bit of misinformation there. SunONE Application Server is free on Sun hardware… and Sun hardware is the biggest bang for the buck for web app environments. Also, Apache’s Tomcat engine is free.
At a guess I have been around the block a few more times than you have. So I am going to give you the best advice that I know.
Sun is toast. Oh, they are alive, kicking and powerful now. But their days are numbered and it would not be wise to make long-term plans on them.
If you want to understand the common industry pattern which dooms them, I can highly recommend The Innovator's Dilemma by Clayton Christensen. Here is a brief synopsis of the pattern:
You have an established industry with established companies and existing customer bases. A new innovation comes along which cannot do what customers need, but conceivably will in a few years. We call these disruptive innovations for reasons that will become apparent. The established players frequently evaluate it (heck they often invent it), and find that they can't use it.
However small startups (or sometimes spinoffs from larger companies) take the technology and try to create markets for it. These markets are small and unprofitable, but they are enough for these companies to establish themselves and continue improving the product. Eventually the product improves to the point where it is usable by the low-end of the original market.
At this point a key dynamic develops. The upstarts are in a business with much tighter margins, and are producing a qualitatively worse product. The established players are used to selling into markets with much higher overheads (and those overheads are for things like support that their main customer base needs), and so cannot figure out how to sell to the low-end at a profit. The result is a distinctly one-sided battle for the least profitable established market.
If it stopped there, it would be great for the established vendors. They lose their least profitable business and get to focus on their best customers. This is great for profit margins! But it doesn't stop. The disruptive product keeps on marching upmarket into more market segments, pushing the established vendors farther up the foodchain. (Where they are often eating up people on another iteration of the cycle.)
Whether we are talking about the replacement of sail by steam, steel mills by mini-mills, or the workstation market, this battle has had a predictable conclusion. The established vendors get decimated.
This is the challenge that Sun faces with Linux. They know that they are in trouble, and are losing money. They just lost big chunks of the financial industry. They are trying to figure out how to sell Linux at a profit (they won't - their business model has too much overhead). They have lost a lot of customers and have yet to accept that those people aren't coming back.
They still have opportunities. For instance they can continue pushing into mainframe territory. They can continue killing telco equipment. They have other markets that they can enter.
But their core business is on a one-way track out of the markets that they are in now.
I know you probably don't believe me. But in a few years you can look back on these words and think that I was a prophet. Or you can get that book and learn more details of the dynamic that Sun is trapped on the losing side of.
- Java does not require an IDE for most projects. The best code I’ve written has been in vi. This is because of Java’s package architecture and JavaDocs utility. (JavaDocs allows you to markup source code to automatically create GOOD documentation. e.g. <linktoexamplejavadc.html>)
Right. Java is text, and any text editor will work.
But an awful lot of Java shops deploy development environments that autogenerate lots of code from a basic framework. Fairly quickly this can put you under a lot of pressure to use the IDE. (I will agree that code written without the wizards is probably better though.)
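For readers who haven't seen it, a JavaDoc comment is just structured markup in the source; the `javadoc` tool turns it into HTML. A small example (the method itself is invented for illustration):

```java
public class MathUtil {
    /**
     * Clamps a value into the inclusive range [lo, hi].
     *
     * @param value the value to clamp
     * @param lo    the lower bound
     * @param hi    the upper bound
     * @return {@code lo} if {@code value < lo}, {@code hi} if
     *         {@code value > hi}, otherwise {@code value}
     */
    public static int clamp(int value, int lo, int hi) {
        return Math.max(lo, Math.min(hi, value));
    }

    public static void main(String[] args) {
        System.out.println(clamp(15, 0, 10)); // prints "10"
    }
}
```

Running `javadoc MathUtil.java` generates the HTML API page from those `@param`/`@return` tags; the documentation lives next to the code it describes.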
- As for Perl, remember it’s an acronym. Practical Extraction and Reporting Language. For ‘practical’ extraction and reporting, it’s fabulous... It makes great parsers and is handy for dealing with the odd text file. But for systems programming and application development, it can’t compare to C++ or Java. It’s the wrong tool for the job.
Actually, Perl isn't an acronym; that expansion is a backronym.
Furthermore, raw Perl is actually bad at parsers. REs are good for finding patterns, not for breaking text up in a structured way. If you want to parse then you either have to use another language, roll your own parser, use a module like Parse::RecDescent, or wait for Perl 6. (Which has completely rewritten how REs work so that they will be good for parsing.)
As for application development, if you are my competitor I hope that you continue believing that. Then again I have had the experience of being on a small team that built in 3 months the equivalent of what a much larger Java team had managed in 2 years. (Ironically when we went to demonstrate it for them, they had a problem showing us theirs because Swing had managed to crash by itself just sitting there.) A year later we came back and in another project duplicated what they had planned to do over the next few years. I don't know what they are doing now...
Let me give another example. A few years ago Paul Graham was running a web-based startup. There he learned that the best way to track other people's development tools was to look for their job ads. As he explains in Beating The Averages, one of the things that he learned is that any competitor who was using Java could be discounted. They weren't an issue. Their development cycle was going to be so heavy and slow that he could duplicate any good ideas that they happened to have within a few days. (His startup was very successful, and was sold to Yahoo! in 1999.)
- Your regular expressions have been included since Java 1.4.0
Not quite. Everyone calls their REs "Perl 5 compatible" after they add a few extensions that Perl was first to come up with. But Perl generally has other extensions that the other people don't. Also see this conversation for an amusing example of how Perl still makes actually using REs easier.
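To make the ease-of-use point concrete: pulling two captures out of a string takes an explicit Pattern/Matcher dance in Java 1.4's `java.util.regex`, where Perl binds the match and its captures directly. A sketch:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegexDemo {
    public static void main(String[] args) {
        // The Perl one-liner equivalent:
        //   my ($key, $val) = "lang=perl" =~ /(\w+)=(\w+)/;
        Pattern p = Pattern.compile("(\\w+)=(\\w+)");
        Matcher m = p.matcher("lang=perl");
        if (m.matches()) {
            System.out.println(m.group(1)); // prints "lang"
            System.out.println(m.group(2)); // prints "perl"
        }
    }
}
```

The engine is comparable; the doubled backslashes and the compile/match/group ritual are where the friction lives.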
- Resistance is futile.
Most advocates aren't quite that obvious about admitting to their groupthink.
- You will be caffeinated.
Indeed. My choice has penguins written all over it. Though these guys will do in a pinch. Oh, you weren't talking about real caffeine? Sorry then, not interested...