I have a script that tails a file and makes DB inserts based on how it parses that file. One file is generated per day, but I would like to continue that same process while also writing the raw data out to files split into hourly chunks.
What is a good way to have the script, when it starts, open a file labeled (currentTime).curr, then at the end of the hour rename it to just (previousHour), open a new (currentTime).curr, and so on?
This is what I am doing so far. db_record handles the inserts, so I figure right before that call is where I should handle opening and closing the files as well as the writing.
if ($connected) {
    $timestamp = time;
    if ( $timestamp < $midnight ) {
        log_notice("Client : Restarting $0\n");
        log_error("End Processing $src_cdr_file\n");
        exec "/home/$0"    # double quotes, or $0 never interpolates
            or log_warn("Client : Could not exec $0\n");
            # 'or', not '||' -- '||' would bind to exec's argument list
        exit 6;    # Something is wrong if this exit is taken
    }    # End if $timestamp
    log_notice("Client : Normal Termination\n");
    log_error("End Processing $src_cdr_file\n\n");
    exec '/usr/bin/perl', "/home/$0"
        or log_warn("Client : Could not exec $0\n");
    exit 7;    # Something is wrong if this exit is taken
}    # End if $connected
};    # End anonymous sub
print $socket "tail\n";    # Rock n Roll
open( STDOUT, '>', '/dev/null' )
    or log_warn("Client : Could not redirect STDOUT\n");

# Begin processing records
while ( defined( my $line = <$socket> ) ) {
    if ($recover_mode) { $rec_num++; }
    db_record( $sth, $line, $recover_from );
}
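Roughly, what I'm imagining for the hourly rotation is something like the following. This is an untested sketch; hourly_name and write_raw are names I made up, and the real script would call write_raw with each raw line right before db_record.

```perl
use strict;
use warnings;
use POSIX qw(strftime);

# Build an hourly filename (e.g. 2009081514) from an epoch timestamp.
sub hourly_name {
    my ($epoch) = @_;
    return strftime( "%Y%m%d%H", localtime($epoch) );
}

my $curr_name = hourly_name(time);
open( my $fh, '>>', "$curr_name.curr" )
    or die "Could not open $curr_name.curr: $!";

# Append one raw line, rotating the file first if the hour has rolled over.
sub write_raw {
    my ($line) = @_;
    my $now = hourly_name(time);
    if ( $now ne $curr_name ) {    # hour rolled over since the last write
        close($fh);
        rename( "$curr_name.curr", $curr_name )    # drop the .curr suffix
            or warn "Could not rename $curr_name.curr: $!";
        $curr_name = $now;
        open( $fh, '>>', "$curr_name.curr" )
            or die "Could not open $curr_name.curr: $!";
    }
    print {$fh} $line;
}
```

The rotation check rides along with each write, so no separate timer or alarm is needed.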
One thing I was wondering about, though: the check may not land exactly on HH:MM:00, so the files might not get cut at the right boundary. Thanks for any suggestions!
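My current thinking on the boundary problem is that the cut doesn't have to happen at HH:MM:00 at all: if each incoming line is compared against a precomputed boundary, the rename simply happens on the first line that arrives after the hour mark, so the split always falls cleanly between records. A sketch of the boundary arithmetic (epoch seconds; next_hour_boundary is my own name, and the hour marks are exact in UTC):

```perl
use strict;
use warnings;

# Epoch second at the top of the next hour after $epoch.
# Epoch time is divisible by 3600 exactly at UTC hour marks.
sub next_hour_boundary {
    my ($epoch) = @_;
    return $epoch - ( $epoch % 3600 ) + 3600;
}

# In the read loop, each line would be checked against the boundary:
#   if ( time >= $boundary ) { rotate the file; $boundary = next_hour_boundary(time); }
my $mark = 1_700_000_000 - ( 1_700_000_000 % 3600 );    # some exact hour
print next_hour_boundary( $mark - 3 ) == $mark        ? "ok\n" : "bad\n";
print next_hour_boundary( $mark + 3 ) == $mark + 3600 ? "ok\n" : "bad\n";
```

A line arriving three seconds before the mark still goes into the old file, and the first line after the mark triggers the rotation, which seems like the behavior I want.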