Re^2: Creating a comprehensive log file

by juo (Curate)
on Dec 16, 2004 at 01:05 UTC ( [id://415293] )



in reply to Re: Creating a comprehensive log file
in thread Creating a comprehensive log file

I created a customized log routine that writes exactly the information I want into the log file. Some examples can be seen below.

WriteLog("PROJECTCONTROL","INFO",'ID1252',"Project Name is \<$project_ +name\>"); WriteLog("PROJECTCONTROL","INFO",'ID1250',"Datasource of this job was +\<$datasources\>"); sub WriteLog { my ($script,$type,$id,$log) = @_; $logfile = $job_path.'/user/LogFile.txt'; my $logdate = $today; my $loguser = $USERNAME; open (LOG, ">>$logfile") || die "File does not exist\n"; print LOG "$script $type $id $loguser $logdate $log\n"; close (LOG); }

This works very nicely: from the log file I can now tell exactly what is happening, which script is running, how far the process has got, and, if something failed, up to which point it had run fine. However, when something fails, what I will not get into this file is the failure message produced by Perl, an application, or anything else, which is mostly written to the screen (STDERR). I see there are some modules available for this, but I would like to stick to my customized log file and just add the warnings and errors that Perl itself produces. I don't know if this is possible in an easy way.
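For reference, one lightweight way to do this without extra modules is to install global __WARN__ and __DIE__ handlers that feed into the same WriteLog routine. A minimal sketch, assuming WriteLog and its globals are set up as above; the 'ID9999' message id is a made-up placeholder, not from the original post:

# Route Perl's own warnings and fatal errors through WriteLog.
$SIG{__WARN__} = sub {
    my ($msg) = @_;
    chomp $msg;
    WriteLog("PROJECTCONTROL", "WARNING", 'ID9999', $msg);
    print STDERR "$msg\n";      # still show the warning on screen
};
$SIG{__DIE__} = sub {
    my ($msg) = @_;
    die @_ if $^S;              # skip errors that an enclosing eval will catch
    chomp $msg;
    WriteLog("PROJECTCONTROL", "ERROR", 'ID9999', $msg);
    die @_;                     # re-raise so the script still aborts
};

Failure output from external applications can be caught in a similar spirit by running them with backticks and redirecting stderr, e.g. my $out = `$cmd 2>&1`;, then passing $out to WriteLog when $? is non-zero.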
