Re: Unix shell versus Perl

by starbolin (Hermit)
on Feb 20, 2008 at 07:21 UTC


in reply to Unix shell versus Perl

eyepopslikeamosquito writes:

"Shell scripts tend to be insecure. Running an external command in a global environment is inherently less secure than calling an internal function. Which is why most Unices do not allow a shell script to be setuid."
Suidperl's status is questionable. Some have said it should be considered deprecated: http://www.xray.mpe.mpg.de/cgi-bin/w3glimpse2html/perl5-porters/2008-01/msg00949.html

Others are interested in keeping it alive. Indeed, on some distros Perl ships without setuid support compiled in.

A lot of the negatives built into some of the shells can be seen, to differing degrees, in Perl. Its interpreted nature means it can be slow. Its eclectic feature set means it can be difficult to audit. Its cooperative nature means scripts can call questionable binaries. Its comprehensive nature means its footprint is large. So it would be dangerous to attack shells based on their architecture alone.

"Shell scripts, being interpreted and often having to create new processes to run external commands to do work, tend to be slow."

Perl is also interpreted. Although I think that is not the crux of your argument, I would choose different wording.


s//----->\t/;$~="JAPH";s//\r<$~~/;{s|~$~-|-~$~|||s |-$~~|$~~-|||s,<$~~,<~$~,,s,~$~>,$~~>,, $|=1,select$,,$,,$,,1e-1;print;redo}

Replies are listed 'Best First'.
Re^2: Unix shell versus Perl
by mr_mischief (Monsignor) on Feb 22, 2008 at 18:16 UTC
    Perl is not interpreted, at least not in the same sense as shell. Shell scripts are interpreted at the source level. Perl programs go through a full-scan compilation into another form, which is then executed. It isn't translated directly into machine code, but syntax errors are caught at compilation time, not at run time. That compilation generally happens every time you run the program, but that doesn't mean it isn't happening.
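
    A minimal illustration of that point (the file name and contents here are hypothetical, not from the original reply): perl -c compiles a script without running it, and a syntax error is reported during that compilation pass, before any statement could execute.

        # check.pl -- hypothetical example with a deliberate syntax error
        use strict;
        use warnings;

        print "this line never runs\n";
        if (1) {        # closing brace intentionally missing

        # Both `perl -c check.pl` and `perl check.pl` stop with a
        # "Missing right curly or square bracket" error at compile time;
        # the print above is never executed, because the whole file is
        # compiled before anything runs.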

    Nearly every useful language has the ability to execute external commands. Most shells rely on that for most of their utility. If you think Perl is insecure because it can call an external binary, then what of shell, which often must?
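
    A small sketch of the difference (the variable name and command are only illustrative): Perl's list form of system runs the program directly rather than handing a command string to a shell, so shell metacharacters in the arguments are not interpreted.

        use strict;
        use warnings;

        my $untrusted = q{foo; rm -rf ~};   # hostile-looking "filename"

        # Single-string form: the whole string is passed to /bin/sh, so the
        # shell would see and act on the ';'.
        # system("ls -l $untrusted");       # insecure

        # List form: Perl execs 'ls' directly; no shell is involved and the
        # argument is passed through verbatim.
        system('ls', '-l', $untrusted) == 0
            or warn "ls exited non-zero: $?\n";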

    Also related to how shell is used is your footprint comment. How does Perl's footprint compare to that of every Unix command available on every version of Unix? Undisciplined shell programmers can use every CLI-based program on the system to get their work done. They often must rely on the search path to attempt any kind of portability, and can be affected by other environment differences. An undisciplined Perl programmer who uses every dark corner of Perl can at least expect some compatibility among different installations of the same version. If you're depending on external tools for most of your functionality, you can't count on much of anything.

    Shell doesn't have an eclectic feature set? Again, you're talking about the entire installed command set of whatever machine you're on, often depending on the search path and environment variables.

    Perl has its weaknesses, but I don't think you're assessing them fairly here. Perl and most shells are worlds apart.

Re^2: Unix shell versus Perl
by eyepopslikeamosquito (Archbishop) on Feb 23, 2008 at 05:01 UTC

    As mr_mischief has already pointed out, Perl scripts are first compiled to an internal form, ensuring syntax errors are caught at compile time rather than run time ... and considerably speeding up script execution at run time.

    To give a specific example of where shell can be slow, consider the common task of running an external command recursively on all files under a given directory. In shell, you could use the find command with its -exec option, but that results in a new process being created for every file. I've measured cases traversing thousands of files where such an approach proved to be hundreds of times slower than using Perl's File::Find module in harness with a Perl function performing the work of the external command. The performance difference is especially noticeable when the external command doesn't do much work. Of course, the way to solve this performance problem in shell is to use find in harness with xargs, but there are portability pitfalls for the unwary, namely when the filenames contain spaces or other unusual characters -- as indicated in a response to a 2002 gnat use.perl.org journal entry. Update: broken link, these are the commands:

    find . -print0 | xargs -0 ls -l                  (GNU only)
    find . -print | sed 's/ /\\ /g' | xargs ls -l    (works everywhere)
    Found an archived link: gnat use.perl.org journal entry (see response by oneiron)
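
    For the curious, a minimal sketch of the File::Find approach described above (the original benchmark code isn't shown here; the per-file work, summing file sizes, is just a stand-in for whatever the external command would have done):

        use strict;
        use warnings;
        use File::Find;

        # Walk the tree once, doing the per-file work in-process instead of
        # fork/exec-ing an external command for every file as `find -exec`
        # would.
        my $total = 0;
        find(
            sub {
                return unless -f $_;    # plain files only
                $total += -s _;         # reuse the stat from -f
            },
            '.',
        );
        print "total bytes under '.': $total\n";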
