If you are combining stdout and stderr on the command line, then you can do the equivalent thing inside the script by opening a single file handle and writing to that during execution. The script I'm currently working on (long-running, though only a few minutes) writes its log messages to what I'm calling the error log, and just a few status messages go to stdout. The stuff on stdout is throwaway, there only to reassure myself that the script's actually doing something. The error log contains the valuable information, and I find myself grepping it for various phrases, or for specific policy/coverage combinations when I want to track how a specific case was handled.
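For the "single file handle" approach, here's a minimal sketch of doing the redirection from inside the script itself (the file name run.log is my invention) -- it's the in-script equivalent of running the script with `> run.log 2>&1`:

```perl
use strict;
use warnings;

# Point STDOUT at one log file, then dup STDERR onto the same handle,
# so both streams land in a single file.
open STDOUT, '>', 'run.log' or die "Can't open run.log: $!";
open STDERR, '>&', \*STDOUT or die "Can't dup STDERR onto STDOUT: $!";
$| = 1;    # unbuffer STDOUT so the two streams interleave sensibly

print "Status: doing foo now ..\n";              # goes to run.log
warn "Rule 17 broken in record 34567\n";         # also goes to run.log
```

Of course, if you'd rather keep the two streams separate (as I argue below), you'd open two distinct handles instead of dup-ing one onto the other.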
And from a philosophical point of view, I'm not a fan of combining stdout and stderr into a single stream, because they likely carry different kinds of information .. stdout is going to contain boring things like "I'm doing foo now ..", while stderr might contain more important stuff like "Rule 17 broken in record 34567, bf=17.76" .. but again, this is totally up to you as the developer/sysadmin.
OK, so that might have been a little off-topic .. if you are working on running some commands and collecting the output, I believe qx is the operator you want; it runs the command and, in list context, returns the output as a list of lines. I would probably log that as
Running <some command>, output is
>> Output line 1
>> More output
>> final output
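A minimal sketch of logging qx output in that format (the command `echo` here is just a stand-in for whatever you're actually running):

```perl
use strict;
use warnings;

my $cmd    = 'echo hello';
my @output = qx($cmd);       # list context: one element per output line
my $status = $? >> 8;        # exit status of the command

print "Running $cmd, output is\n";
print ">> $_" for @output;   # lines keep their trailing newlines
print "Command exited with status $status\n" if $status;
```

Note that qx captures stdout only; if you want the command's stderr in there too, you can append `2>&1` to the command string and let the shell fuse them for you.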
This may or may not be what you want, as you don't get the output until the command has finished. If you'd rather have output back from the command as it runs, then you may have to go to something like IPC::Run. I see there's even something called IPC::Run::Fused which glues stdout and stderr together.
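Before reaching for IPC::Run, it's worth noting that a plain piped open (core Perl, no modules) already gives you the command's output line by line as it runs; the shell-level `2>&1` in this sketch is a poor man's version of what IPC::Run::Fused does for you. The command name here is hypothetical:

```perl
use strict;
use warnings;

# Read a command's output as it is produced, one line at a time.
my $cmd = 'echo line1; echo line2';    # stand-in for your real command
open my $pipe, '-|', "$cmd 2>&1" or die "Can't run $cmd: $!";
while ( my $line = <$pipe> ) {
    print ">> $line";                  # log each line as it arrives
}
close $pipe or warn "$cmd exited with status ", $? >> 8, "\n";
```

IPC::Run gives you much finer control than this (separate stdout/stderr callbacks, timeouts, pty support), so for anything non-trivial the module is still the way to go.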
Anyway, have a look at those modules -- I used the former many years ago, and it worked brilliantly.
Alex / talexb / Toronto
Thanks PJ. We owe you so much. Groklaw -- RIP -- 2003 to 2013.