in reply to Cheapest way for multiple external commands
Is there any way to prevent that "fork"?
AFAIK, no. See exec, system, and qx.
I need to run many external commands from my Perl script and capture both return-code and STDOUT. These commands must run under a shell...
Be aware of The problem of "the" default shell. I also wrote about running external commands here. In this case I'd suggest something like IPC::System::Simple (note it doesn't capture STDERR; for that I'd suggest a module such as Capture::Tiny or IPC::Run3):
use IPC::System::Simple qw/capturex EXIT_ANY/;

my $out = capturex '/bin/bash', '-c', 'echo foo';

# or, if you don't want it to die on nonzero exit:
my $out2 = capturex EXIT_ANY, '/bin/bash', '-c', 'echo bar && false';
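When using EXIT_ANY, the command's exit status is available in the exportable $EXITVAL variable. A short sketch (assuming IPC::System::Simple is installed; the bash command line here is just an illustration):

use strict;
use warnings;
use IPC::System::Simple qw/capturex EXIT_ANY $EXITVAL/;

# Capture STDOUT without dying on a nonzero exit status:
my $out = capturex EXIT_ANY, '/bin/bash', '-c', 'echo bar; exit 3';

print "output: $out";          # "bar\n"
print "status: $EXITVAL\n";    # 3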
I am looking for the "cheapest way" in terms of CPU and Memory load...
Without more information, this smells like premature optimization. What OS? What commands are you running; could they perhaps be replaced by Perl code? How many commands per second are you running? How fast do you need it to run, and have you written a prototype and measured its performance to see if it's good enough or not?
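For measuring a baseline, core Perl already does what the question asks with no extra modules: qx// runs the command under the shell, captures STDOUT, and leaves the raw child status in $?. A minimal prototype sketch:

#!/usr/bin/perl
use strict;
use warnings;

# qx// forks a shell, captures STDOUT, and sets $? on return.
my $out  = qx{echo foo && exit 3};

# The exit code is in the high byte of $?; low bits hold signal info.
my $code = $? >> 8;

print "output: $out";        # "foo\n"
print "exit code: $code\n";  # 3

Time this against the command rate you actually need before reaching for anything fancier.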
Replies are listed 'Best First'.
Re^2: Cheapest way for multiple external commands
by pedrete (Sexton) on Jan 27, 2020 at 09:56 UTC
by NERDVANA (Beadle) on Jan 27, 2020 at 21:06 UTC
by afoken (Canon) on Jan 27, 2020 at 23:20 UTC