PerlMonks
Re: Myth busted: Shell isn't always faster than Perl

by Perl Mouse (Chaplain)
on Dec 31, 2005 at 00:40 UTC ( [id://520091] )


in reply to Myth busted: Shell isn't always faster than Perl

I've never heard of the myth "Shell is always faster". Not that your myth-busting busts anything: using the '-exec' option to delete one file at a time is a far-from-optimal solution. As pointed out, '-print0' in combination with 'xargs' is much more efficient, since it saves spawning a gazillion processes.
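To make the difference concrete, here is a small sketch of the two approaches (the scratch directory and filenames are made up purely for the demonstration):

```shell
# Scratch tree, so the example is safe to run anywhere:
tmp=$(mktemp -d)
mkdir -p "$tmp/sub"
touch "$tmp/f1" "$tmp/sub/f2"

# The slow variant spawns one rm process per file:
#   find "$tmp" -type f -exec rm {} \;

# The xargs variant batches many filenames into each rm call;
# -print0 and -0 use NUL separators, so filenames containing
# spaces or newlines survive intact:
find "$tmp" -type f -print0 | xargs -0 rm
```

On a tree with thousands of files, that batching is exactly what saves the gazillion process spawns.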

I'm a bit surprised, however, that no one so far has piped in with the "programmer time is more costly than running time" mantra. Surely the 2-second difference in running time is dwarfed by all the extra typing you need for your Perl solution. Or are Perl programmers cheap, and shell programmers expensive?

I would always go for the shell solution. I'll have deleted all the files even before you've finished typing your Perl program.

Perl --((8:>*

Replies are listed 'Best First'.
Re^2: Myth busted: Shell isn't always faster than Perl
by zentara (Archbishop) on Dec 31, 2005 at 11:59 UTC
    I never type out a script more than once; it goes into a /bin directory in my path. "Damn it Jim, I'm a Perl hacker, NOT a typist" :-)

    I'm not really a human, but I play one on earth. flash japh
      But if you haven't been on the system yet, you haven't had a chance to install your "delete files and leave the directory structure" program.

      One way of doing system administration is to write a little program for every minor task you want. A small change, a different program. And then, everyone has to carry disks with their personal libraries around. Granted, it's workable.

      I myself prefer the Unix/POSIX solution: lots of small tools that can be stacked like Lego bricks. Tools that are everywhere, like find and xargs. When I sit down at a Unix system, I can type

      find . -type f -print0 | xargs -0 rm
      to delete files and leave the directory structure as is. I don't have to remember whether I installed a program that does this for me on the box, and if I did, what it's called. And I don't need to write a new program if I want to delete all files older than a week - just add an extra option to find. (Sure, you could enhance your program so that it takes all kinds of options, but if you have to type as many options to your program as to find, you might as well have used find in the first place.)
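      As a sketch, the "older than a week" variant really does need only one extra find test: -mtime +7 matches files last modified more than seven days ago (the scratch tree below is invented for the demonstration):

```shell
# Scratch tree with one deliberately backdated file:
tmp=$(mktemp -d)
touch "$tmp/fresh"
touch -t 202001010000 "$tmp/stale"   # mtime set to Jan 1, 2020

# Same pipeline as before, restricted to files older than a week:
find "$tmp" -type f -mtime +7 -print0 | xargs -0 rm
```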

      I'm not a monoculturist programmer. For anything complex, I write a Perl or a C program (preferably Perl, but that isn't always available - if all you have is a few Mb of RAM and a dozen or so Mb on disk, there's no Perl, but busybox stacks a lot of goodies in just a few kb). But I don't bother writing programs for tasks that I don't do that often and that only require a few simple commands. That's not efficient.

      Perl --((8:>*
        You must be psychic, because I was just thinking that as I read this thread this morning. I thought, "Hmmm, I guess guys like Perl Mouse have a point: if you need to move from machine to machine to do quick filesystem work, find and shell is fast and reliable."

        So you shell guys have a point: you rely on some standard GNU utilities, and say it is fast in the broadest sense. But my original point was that a lot of the shell one-liners that are thrown out as quick solutions are not necessarily faster than a Perl script, just because they are C chained together in a pipe. And I do see the shell guys making this claim in the newsgroups, without showing any proof. Thus my original post.

        Personally, I would find it COST-efficient to put all my Perl utilities on a USB keyring drive, rather than spend the time to learn "arcane" shell syntax. Every time I look at the way bash shell is done, it blows my mind as the most confusing syntax that I've ever seen. So I could spend hours trying to confuse myself with shell, where a $25 USB keyring drive would let me carry my Perl utilities with me. Efficiency is measured in more than just typing time; there are the economics and the mental strain of learning multiple languages with conflicting syntax styles. Perl, C, PHP, Python, etc. all have 'compatible' syntax; bash shell is definitely odd.

        I find it admirable that some hackers use different languages according to what is easier to do, but how many syntax errors do they make when they are juggling shells? Personally I think it is better to try to learn one language and become good with it... yes, I only ride a bicycle and I only use Perl. ;-)


        I'm not really a human, but I play one on earth. flash japh
Re^2: Myth busted: Shell isn't always faster than Perl
by demerphq (Chancellor) on Jan 02, 2006 at 09:29 UTC

    I would always go for the shell solution. I'll have deleted all the files even before you've finished typing your Perl program.

    Well, since you are being snarky I'll respond in kind: I doubt it. I reckon you'll still be fighting with the shell syntax, and double-checking that the switches and utilities you got so used to in bash are actually present in the shell you need to run it on. And even then you still won't be 100% confident that it will all work as expected.

    Which to me is the reason that Perl scripts beat shell scripts hands down pretty well every time. I can use the same Perl script on pretty much every shell and OS I can find. Your shell script will only work on a small subset of them, and will require massive changes for some of them.
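    For comparison, one rough Perl counterpart of the find | xargs pipeline (a sketch, not demerphq's actual script) uses the core File::Find module, so the same line runs unchanged under any shell the perl binary is reachable from:

```shell
# Scratch tree, so the demonstration is safe to run anywhere:
tmp=$(mktemp -d)
mkdir -p "$tmp/sub"
touch "$tmp/f1" "$tmp/sub/f2"

# File::Find ships with perl itself, so this one-liner is as
# portable as the perl binary; -f skips directories, and unlink
# with no argument removes the file currently in $_:
perl -MFile::Find -e 'find(sub { -f && unlink }, $ARGV[0])' "$tmp"
```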

    Shell scripts are only worth thinking about if you are a monoculture programmer. Since I'm not I view them mostly with contempt. Who needs shell scripts when you have perl scripts instead?

    ---
    $world=~s/war/peace/g

      I reckon you'll still be fighting with the shell syntax

      I can type find | xargs pipes in my sleep.

      double-checking that the switches and utilities you got so used to in bash are actually present in the shell you need to run it on

      Present in the shell? They’re external binaries; which shell you’re using is irrelevant. Maybe “present on the system,” except that if find, xargs and rm are not present, that is one very broken system. And the -print0/-0 switches are available on these commands on all Unixoid systems where I cared to look.

      And all that is far more likely to be around than perl, in any case.

      If your portability argument concerns moving between Windows and Unix, well, I can see how someone working on Windows would prefer to always use Perl… :-)

      Makeshifts last the longest.

      Well, since you are being snarky I'll respond in kind: I doubt it. I reckon you'll still be fighting with the shell syntax, and double-checking that the switches and utilities you got so used to in bash are actually present in the shell you need to run it on. And even then you still won't be 100% confident that it will all work as expected.
      Bollocks. find | xargs has worked on every Unix system I've used for the last 30 years. Out of the box. In any shell, as the only 'shell' thing here is the pipe, which is universal. It has worked long before Larry released perl1.0, and it will continue to work long after perl5 will be a distant memory.
      Which to me is the reason that Perl scripts beat shell scripts hands down pretty well every time. I can use the same Perl script on pretty much every shell and OS I can find. Your shell script will only work on a small subset of them, and will require massive changes for some of them.
      The shell solution will work on at least anything that's POSIX-compliant. Will your Perl program work in Perl 6? How would you know? It may work on today's version of Perl 6, but maybe not on next week's. As for Perl being present on the OS by default: for many OSes, it's only quite recently that they came with some version of perl5 installed.
      Shell scripts are only worth thinking about if you are a monoculture programmer. Since I'm not I view them mostly with contempt. Who needs shell scripts when you have perl scripts instead?
      So, you do everything with Perl scripts, so you're not a monoculture programmer? Interesting. What's your definition of monoculture then?

      But you're right. Once you have a truck, you have no need for a bicycle. It's much easier to start up the truck and find a parking spot, just to get a newspaper from the shop around the corner. It's cheaper as well. Bicyclists are monoculture traffic participants - none of them know how to drive a car.

      Perl --((8:>*

        So, you do everything with Perl scripts, so you're not a monoculture programmer? Interesting. What's your definition of monoculture then?

        Yes, pretty well anything I write that has to run in multiple environments (which in theory is most of what I do) is written in Perl.

        Monoculture to me is writing code that expects to be run on/in a certain OS/Shell/Architecture.

        But I will say that your bike/truck point is a powerful one.

        ---
        $world=~s/war/peace/g
