Re: Unix shell versus Perl
by Abe (Acolyte) on Feb 19, 2008 at 16:47 UTC
I have to step in and defend the shell. Done right, the shell gives you great productivity - better than perl for many tasks.
The trouble with the shell is that programmers often forget structured programming, so one sees the same splats of shell written again and again for simple things like file archiving, database access, file loads/unloads, ftps, error reporting, etc.
Any task with complicated programming should be wrapped - in perl, C, C++, or a carefully done shell script or function.
Any task done frequently should go to a library.
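In shell, a "library" is just a file of functions that every script dot-sources. A minimal sketch - the file name and function names here are illustrative, not the ones from our shop:

```shell
#!/bin/sh
# Sketch of a tiny shared library. In real use these functions would
# live in their own file (say errlib.sh) and each script would load
# them with:   . /usr/local/lib/errlib.sh

# warn: report a problem on stderr but carry on
warn() { echo "WARNING: $*" >&2; }

# die: report a fatal error on stderr and stop the script
die()  { echo "ERROR: $*" >&2; exit 1; }

warn "archive area is 90% full"            # script continues
[ -n "$HOME" ] || die "HOME is not set"    # script would stop here on failure
```

Because the library is sourced rather than executed, the functions run in the calling script's own process and can set its variables or exit it directly.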
Error handling - you can catch any error - you needn't lose anything. But you must NOT redirect scripts' stdout and stderr, except at the topmost level of control. If you're starting processes in the background, their failure can be signalled by touching a defined file, since you can't reap their exit status. Simple but effective.
Meantime, for foreground commands, errors always come back to the caller in $?, same as perl.
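Both conventions together look roughly like this - the marker file name and the `false` stand-in job are illustrative:

```shell
#!/bin/sh
# Foreground: the caller reads the exit status straight from $?.
false                              # stands in for a failing command
echo "foreground status: $?"       # prints: foreground status: 1

# Background: by the time the parent looks, the child's individual
# status may be gone, so the child touches an agreed marker file
# on failure instead.
FAILFILE=/tmp/job_failed.$$        # illustrative file name
rm -f "$FAILFILE"

( false || touch "$FAILFILE" ) &   # 'false' stands in for a real job

wait                               # let all background children finish

if [ -e "$FAILFILE" ]; then
    echo "background job failed" >&2
    rm -f "$FAILFILE"
fi
```

The parent only ever checks for the marker file after `wait`, so there is no race: every child has finished before the test runs.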
If you have to produce a script for several platforms, perl might be better, I don't know, but this is a discipline in itself. I'm skeptical though.
ksh behaviour is standard - "echo" and "print" are defined. Korn shell 93 seems to be an attempt to do perl things, particularly hashes, which perl does much better.
If you have to migrate platforms, it is fairly easy, but retest - you'd have to do the same for perl.
Korn shell job control is wondrous compared with perl. Just start all your concurrent processes with & on the end, and then "wait" for them. This leads to very simple scripts that can efficiently run complex jobs with maximum concurrency of their different elements - very useful on our big RDBMS. Korn functions you start in the background, anonymous or named, can then further parallelise other tasks. Before you know it, and incredibly simply, you've got everything happening at once, and a 2-hour job takes 10 minutes. All the clever bits are in SQL, perl, C or whatever, with a couple of hundred lines of korn to glue it together.
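The whole pattern is just this - the step names and `sleep`s are stand-ins for real work like SQL steps or bulk loads:

```shell
#!/bin/sh
# Three independent steps; each sleep stands in for real work.
extract()   { sleep 1; echo "extract done"; }
transform() { sleep 1; echo "transform done"; }
report()    { sleep 1; echo "report done"; }

# Start them all at once...
extract &
transform &
report &

# ...then block until every one has finished.
wait
echo "all jobs complete"
```

Run sequentially these three steps would take about three seconds; with & and `wait` the wall time is about one second, because they overlap.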
Windows .bat files are contemptible. For all but tiny scripts, shell or perl seem essential. What's amusing is how managers intervene to prevent this, insisting on VB or some custom interpreter, produced because perl/shell are "not invented here".
We have a binary for executing SQL on the db in a simple manner, a shell script that wraps db bulk-load/unload effectively for our platform, and a couple of shell libraries to wrap, standardise, and de-skill things like ftp, errors and warnings, "ps" process checks, plus one or two other libraries and perl utils for larger, standardisable things.
Kornshell isn't suitable for complex programs, but properly done, it can simplify, speed up and provide efficient glue.
Kornshell does need proper programming discipline.
Perl is still fantastic, but I find kornshell much better for these jobs, gluing utilities and libraries together quicker and better than perl.
Incidentally it's perfectly easy to make similar mistakes in perl. A lot of our perl uses OO with too many inheritance levels - the code is so dense it's more like looking at COBOL - quite an achievement.