pedrete has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks...

I need to run many external commands from my Perl script and capture both the return code and STDOUT. These commands must run under a shell...

To my knowledge, Perl always does a fork and then waits for the child to run the command, am I right?

Is there any way to prevent that "fork"? I am looking for the "cheapest way" in terms of CPU and memory load...

THX!!!

Pedreter.

Replies are listed 'Best First'.
Re: Cheapest way for multiple external commands
by haukex (Chancellor) on Jan 24, 2020 at 19:35 UTC
    Is there any way to prevent that "fork"?

    AFAIK, no. See exec, system, and qx.
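For reference, the core-Perl baseline looks like this: qx// runs the command under the default shell, returns its STDOUT, and leaves the exit status in $? (a minimal sketch; the command is illustrative):

```perl
# qx// forks, runs the command under the shell, and waits for it;
# its STDOUT comes back as the return value, the exit status in $?.
my $out    = qx{echo foo; exit 2};
my $status = $? >> 8;          # exit code is in the high byte of $?
print "out=$out status=$status\n";
```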

    I need to run many external commands from my Perl script and capture both the return code and STDOUT. These commands must run under a shell...

    Be aware of The problem of "the" default shell. I also wrote about running external commands here. In this case I'd suggest something like IPC::System::Simple (note it doesn't capture STDERR, for that I'd suggest a different module):

    use IPC::System::Simple qw/capturex EXIT_ANY/;
    my $out = capturex '/bin/bash', '-c', 'echo foo';
    # or, if you don't want it to die on nonzero exit:
    my $out2 = capturex EXIT_ANY, '/bin/bash', '-c', 'echo bar && false';
    I am looking for the "cheapest way" in terms of CPU and memory load...

    Without more information, this smells like premature optimization. What OS? What commands are you running; could they perhaps be replaced by Perl code? How many commands per second are you running? How fast do you need it to run, and have you written a prototype and measured its performance to see if it's good enough or not?

      Thanks everybody... I was just looking to prevent the "fork", but I am afraid it is unavoidable...
        Looking to avoid the fork suggests a misunderstanding of Unix process design. It is impossible to start another process on Unix without first forking; the child process then calls exec to become the program of your choice. There is no equivalent to Win32's CreateProcess (though you could use inter-process communication to ask some other daemon to fork/exec on your behalf).
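The fork-then-exec sequence described above can be written out explicitly in core Perl (a sketch; the command is illustrative):

```perl
# fork() creates the child; the child replaces itself with exec;
# the parent waits and collects the exit status from $?.
my $pid = fork() // die "fork failed: $!";
if ($pid == 0) {                      # child
    exec '/bin/sh', '-c', 'exit 5' or die "exec failed: $!";
}
waitpid $pid, 0;                      # parent blocks until the child ends
my $status = $? >> 8;
print "child exited with $status\n";
```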

        What it sounds like you actually want to avoid is the call to “wait” that blocks the parent process until the child has ended. The module IPC::Run is a good way to run asynchronous processes and capture their output.
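IPC::Run is one way to get that; the same non-blocking idea is also available in core Perl with waitpid and WNOHANG (a sketch only; the command and poll interval are illustrative):

```perl
use POSIX qw(WNOHANG);

# Start a child without blocking the parent: fork, let the child exec,
# then poll for its exit with a non-blocking waitpid.
my $pid = fork() // die "fork failed: $!";
if ($pid == 0) {
    exec '/bin/sh', '-c', 'sleep 1; exit 7' or die "exec failed: $!";
}
my $reaped = 0;
until ($reaped) {
    $reaped = waitpid($pid, WNOHANG);   # 0 while the child still runs
    select(undef, undef, undef, 0.1) unless $reaped;  # do other work here
}
my $status = $? >> 8;
print "child exited with $status\n";
```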

Re: Cheapest way for multiple external commands
by karlgoethebier (Monsignor) on Jan 25, 2020 at 09:24 UTC

    As far as I remember, IPC::Run can be used to run external commands without invoking a subshell. You may want to see Re^3: Reading output of external program without Shell, a node where haukex was involved. Regards, Karl

    «The Crux of the Biscuit is the Apostrophe»

    perl -MCrypt::CBC -E 'say Crypt::CBC->new(-key=>'kgb',-cipher=>"Blowfish")->decrypt_hex($ENV{KARL});'

Re: Cheapest way for multiple external commands
by mhearse (Chaplain) on Jan 24, 2020 at 23:12 UTC
    I would say that the cheapest way to run commands is not to run them at all. You'd be surprised at what you can harvest from /proc and /sys. But if you must run commands in bulk, it will be all about reducing system calls.
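For example, instead of shelling out to something like uptime, the load averages can be read straight from /proc (a Linux-only sketch; /proc/loadavg is the standard location):

```perl
# Read the load averages directly instead of spawning a process.
open my $fh, '<', '/proc/loadavg' or die "can't open /proc/loadavg: $!";
my ($one, $five, $fifteen) = split ' ', scalar <$fh>;
close $fh;
print "1-minute load average: $one\n";
```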