http://www.perlmonks.org?node_id=702373


in reply to Unix shell versus Perl

Let me first say that I almost always prefer Perl over the shell as well. But I don't always agree with your arguments; they are a bit too black and white for me.

Writing portable shell scripts is hard. I don't think it's hard, although it's easy to write something that is not portable. But you have several options available to make porting easier. The first is to limit yourself to the POSIX standard. Most Unix vendors support at least POSIX compliance for their shell tools. The second option is to use the GNU tools, and install the GNU tools on all the platforms your program needs to run on. The GNU tools have been ported to all major Unix platforms, and most run on Windows as well. Is each external command available on each of our platforms? That's a valid question to ask when porting a shell program, but an equally valid question is whether each Perl module you use is available on each of the platforms you use. That may usually be true for pure Perl modules, but it is not always true for XS modules. Especially not if said modules use third-party libraries.
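
To make the module question concrete, here is a minimal Perl sketch (the module names are only placeholders) that probes at run time whether the modules a program depends on can actually be loaded on the current platform:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical dependency list; substitute your program's real modules.
    my @deps = qw(List::Util Time::HiRes Some::XS::Module);

    for my $module (@deps) {
        # String eval so a missing module reports an error
        # instead of aborting the whole check.
        if (eval "require $module; 1") {
            print "$module: available\n";
        }
        else {
            print "$module: NOT available\n";
        }
    }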

Shell scripts tend to be fragile. Running commands and scraping their output is an inherently fragile technique, breaking when the format of the command output changes. My experience is that the behaviour of shell commands tends to change less than the behaviour of Perl modules. Sure, a shell command may output something different if you upgrade it, but a Perl module may change its behaviour as well if you upgrade it. And Perl modules don't have a standard - shell programs do: POSIX. And then there's perl itself. In the 25+ years I have been programming, I cannot recall a program breaking because of an upgrade of the shell. An upgrade of Perl, on the other hand, tends to break at least one of my programs in some way.
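
For concreteness, the "scraping" technique the parent post has in mind looks roughly like this in Perl, with the stat builtin shown alongside as the alternative (the file name is hypothetical):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $file = 'example.txt';   # hypothetical file name

    # Scraping: the size is whatever happens to sit in the fifth
    # column of ls -l, which varies by platform, version and locale.
    my ($size_scraped) = (split ' ', `ls -l $file`)[4];

    # Asking the system directly via the stat builtin.
    my $size_direct = (stat $file)[7];

    print "scraped: $size_scraped, stat: $size_direct\n";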

Shell scripts tend to be slow. That's 'the pot calling the kettle black', isn't it? Agreed, the shell doesn't shine when it comes to speed, but neither does Perl, does it? Luckily, with modern machines, it's usually the disk or the network (or the memory) that's the bottleneck, so the relative slowness of the language itself doesn't really matter. But in the few cases where it does, I'm always amazed by the speed of C compared to Perl (every now and then I rewrite a Perl program in C for speed reasons).

Shell scripts tend to be insecure. Running an external command in a global environment is inherently less secure than calling an internal function. Eh, not necessarily. Languages don't make programs (in)secure. Programmers do. And frankly, I have more trust in what my vendor puts in /usr/bin than in what I download from CPAN. I do not agree that running an "external command" is "inherently" less secure than calling an "internal function". And I've no idea what you mean by "global environment".

As a programming language, shell is primitive. Shell does not have namespaces, modules, objects, inheritance, exception handling, complex data structures and other language facilities and tools (e.g. Perl::Critic, Perl::Tidy, Devel::Cover, Pod::Coverage, ...) needed to support programming in the large. Up to version 5, Perl didn't have namespaces, modules, objects, inheritance or exception handling. And C still doesn't have them. And depending on what you mean by 'complex data structures', C doesn't have those either. As for Perl::Critic, Perl::Tidy, Devel::Cover and Pod::Coverage, I disagree that they are needed - in fact, I wouldn't want to use Perl if they were needed to program in Perl. And in the history of Perl, they are the new kids on the block. People preferred perl over the shell long before those tools were first created.

If you're sure the script will "never" need to be ported to Windows. Been there, done that. Shells (and shell-like tools) have been ported to Windows, just as Perl has. And remember the time the perl port on Windows was different from the one on Unix? At that time, that wasn't true for some of the shells. "Oneperl", the perl that merged the Unix and Windows ports, only dates from 1998; MacPerl merged even later than that (and only because MacOS went Unix).

Re^2: Unix shell versus Perl
by eyepopslikeamosquito (Archbishop) on Aug 08, 2008 at 08:03 UTC

    I do not agree that running an "external command" is "inherently" less secure than calling an "internal function". And I've no idea what you mean by "global environment".
    By global environment, I was referring to environment variables (e.g. PATH, IFS, CDPATH, ENV, BASH_ENV, SHELL, TZ, LD_LIBRARY_PATH) and other elements of the execution environment (e.g. umask, inherited file descriptors, temporary files) that are a common source of exploits by malicious attackers. Certainly, executing an external program securely is not trivial: there are many, many security exploits to consider and guard against. That's why I stated that calling an internal function was inherently more secure -- because all these many and varied exploits need not be considered.
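
    A minimal sketch of the kind of defensive setup perlsec recommends before running any external command looks like this (exactly which variables need scrubbing depends on your platform):

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Start from a known-good, minimal environment rather than
        # trusting whatever the caller handed us.
        $ENV{PATH} = '/usr/bin:/bin';
        delete @ENV{qw(IFS CDPATH ENV BASH_ENV)};

        # With the environment pinned down, invoking an external
        # command is a much smaller gamble.
        system('/bin/ls', '-l') == 0
            or die "ls failed: $?";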

    To give a specific example, most shell scripts tend to use the (potentially insecure) $HOME and $SHELL environment variables to ascertain a user's home directory and shell, while a Perl script can get this information via the more secure (and more reliable) getpwnam internal function.
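
    As a rough sketch of that point, compare the two ways of obtaining a user's home directory and shell:

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Easily spoofed: the caller can export any HOME or SHELL
        # it likes before running the script.
        my ($home_env, $shell_env) = @ENV{qw(HOME SHELL)};

        # Taken from the password database instead, keyed on the
        # real user id, so it cannot be faked via the environment.
        my $user = getpwuid($<);
        my ($home_pw, $shell_pw) = (getpwnam($user))[7, 8];

        print "environment: $home_env $shell_env\n";
        print "passwd file: $home_pw $shell_pw ($user)\n";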

    That shell scripts tend to be insecure is widely known and acknowledged; see, for example, FAQ: How can I get setuid shell scripts to work? and perlsec, which opens with:

    Unlike most command line shells, which are based on multiple substitution passes on each line of the script, Perl uses a more conventional evaluation scheme with fewer hidden snags. Additionally, because the language has more builtin functionality, it can rely less upon external (and possibly untrustworthy) programs to accomplish its purposes.