It's pretty crude to kill a process just because it gets too
big. Any Unix-like OS worth its salt will do this for
you, usually on a per-user basis (see man ulimit). Killing it
yourself is probably not the best way to go if you can avoid it.
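For instance, you can have the shell impose the limit before exec'ing the command, so the kernel enforces it for you. A minimal sketch; the 200 MB figure and "my_shell_command" are placeholders, and note that ulimit -v is a bash/dash feature on Linux (some shells use -m instead, or lack it):

```perl
use strict;
use warnings;

# Cap the child's address space before exec'ing the real command,
# instead of watching its size by hand and killing it ourselves.
my $limit_kb = 200_000;    # placeholder: ~200 MB
my $status = system('sh', '-c', "ulimit -v $limit_kb; exec my_shell_command");
warn "child failed or was killed by the limit\n" if $status != 0;
```

The kernel then refuses further allocations past the cap, and a well-behaved child dies on its own.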
One thing that I'm not clear on: you say the
process returns all of its output at once as a hash.
How does it do this? Perl hashes are typically serialized
into some kind of output stream (printed to a temp file,
sent as text over a pipe, etc.) before they can be sent
to another process.
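To make that concrete, here's one way a hash can cross a process boundary, using the core Storable module over a pipe; the hash contents here are made up for illustration:

```perl
use strict;
use warnings;
use Storable qw(freeze thaw);

# Hypothetical hash the child wants to send back to the parent.
my %result = (status => 'ok', count => 42);

pipe(my $reader, my $writer) or die "pipe: $!";
defined(my $pid = fork()) or die "fork: $!";

if ($pid == 0) {                       # child: serialize and write
    close $reader;
    print {$writer} freeze(\%result);  # hash -> byte stream
    close $writer;
    exit 0;
}

close $writer;                         # parent: read bytes, rebuild hash
my $bytes = do { local $/; <$reader> };
close $reader;
waitpid($pid, 0);
my $copy = thaw($bytes);               # byte stream -> hash ref
print "count is $copy->{count}\n";
```

Either way, what travels between the processes is a stream of bytes, not a hash.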
Also, is the other process really holding everything until the
very end before it writes its output, or is the receiving
process slurping in the output all at once...
$all_the_output = `my_shell_command`;
If this is what is going on, you may want to try reading
from a pipe instead...
my $pid = open(PIPE, "my_shell_command |") or die "can't fork: $!";
while (my $line = <PIPE>) {
    if ($i_received_too_much_garbage) {
        kill $my_favorite_signal => $pid;
        last;    # stop reading once we've given up on the child
    }
}
close PIPE;
...
bluto