PerlMonks
Re: Re: Picking up where you left off..

by submersible_toaster (Chaplain)
on Oct 14, 2002 at 07:16 UTC ( [id://204991] )


in reply to Re: Picking up where you left off..
in thread Picking up where you left off..

:( Sadly I don't think I can wrangle FIFOs to my needs. I read the File::Tail docs (why does it need Time::HiRes?), which would be cool for a daemon-type execution, but I was hoping to only run this script at intervals.

Closer inspection of Perl in a Nutshell led me to this...

    use strict;
    open( COUNT, '<counter' );
    my $whence = scalar <COUNT>;
    chomp $whence;
    close COUNT;

    open( FH, '<logfile' );
    # scram to position we wrote last to 'counter'
    seek FH, $whence, 0;
    my $line = scalar <FH>;
    print $line;
    my $count = tell FH;

    open( COUNT, '>counter' );
    print COUNT $count;
    close COUNT;
Which exhibits the behaviour(s) that I believe are needed: in this test case, the script prints the next single line from 'logfile' and exits, saving its position in logfile to 'counter'. OK, so I had to seed counter with '0' first!!
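The seek/tell bookmarking idea above can be sketched as a small helper that removes the need to seed the counter by hand: a missing counter file simply defaults to offset 0, and every line appended since the last run is returned in one go. This is my own sketch, not the original snippet; the filenames and the function name read_new_lines are assumptions.

```perl
use strict;
use warnings;

# Sketch of the seek/tell bookmark: read everything appended to $log
# since the offset saved in $counter, then save the new offset.
# (Hypothetical helper; name and signature are my own invention.)
sub read_new_lines {
    my ($log, $counter) = @_;

    # Load the saved offset; default to 0 when the counter file
    # does not exist yet, so no manual seeding is required.
    my $whence = 0;
    if (open my $cfh, '<', $counter) {
        $whence = <$cfh> // 0;
        chomp $whence;
        close $cfh;
    }

    open my $fh, '<', $log or die "can't open $log: $!";
    seek $fh, $whence, 0 or die "seek failed: $!";

    my @lines = <$fh>;     # everything appended since the last run
    my $pos   = tell $fh;  # remember where we stopped
    close $fh;

    open my $out, '>', $counter or die "can't write $counter: $!";
    print {$out} "$pos\n";
    close $out;

    return \@lines;
}
```

Run from cron, each invocation picks up exactly where the previous one left off, which matches the run-at-intervals plan better than a daemon-style File::Tail loop.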

This is destined to run on rh7.3, perl5.6.1 - what scares me most is how squid will behave with another process reading from the logfile it's writing to.

Replies are listed 'Best First'.
Re: Re: Re: Picking up where you left off..
by blm (Hermit) on Oct 14, 2002 at 07:51 UTC

    Why are you so dubious about opening the squid log for read access while squid writes to it? People do this sort of thing all the time. For example, many people sit with an xterm open doing nothing but tail -f logfile. Just imagine if it was harmful: "Just a sec, I will check the log file for diagnostic messages. Oh oops! I have to rotate the log files/stop then restart the daemon!". Yuk.

    If you do start scanning the current log at random intervals, starting from the file position you reached in the previous scan, you will need to take into account the default cron jobs that rotate the logs daily (I think). Have a look at the cron jobs on the machine and (at least on some Linux machines) /etc/logrotate.conf.

    --blm--
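One cheap way to guard against the rotation problem blm describes: before seeking, compare the saved offset against the current file size, and fall back to 0 when the file has shrunk (i.e. has been rotated). This is a sketch under that assumption; the function name safe_offset is my own invention, and comparing inodes via stat would additionally catch a rotated file that has already regrown past the old offset.

```perl
use strict;
use warnings;

# Return a usable seek offset for $log. If the saved offset points
# past the end of the file, the log has been rotated (truncated or
# replaced), so restart from the beginning.
# (Hypothetical helper; name is my own invention.)
sub safe_offset {
    my ($log, $saved_offset) = @_;
    my $size = -s $log;             # current size in bytes, undef if missing
    return 0 unless defined $size;  # log gone entirely: start over
    return $saved_offset > $size ? 0 : $saved_offset;
}
```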
