
Re: removing lines from a file

by rev_1318 (Chaplain)
on Jun 22, 2005 at 18:33 UTC ( #469123=note )

in reply to removing lines from a file

If you're on a unix-like system, all your problems could be solved if you can replace your logfile by a fifo pipe. Then you can have the application that produces the logfile write to it and comfortably read from it. After processing, you can do with the lines as you please (write them to an additional (real) logfile or discard them...)
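The fifo approach can be sketched in a few lines of Perl. Everything below is illustrative: the filenames are made up, a temp dir stands in for /var/log, and a forked child stands in for the application that would normally write the log, so the sketch is self-contained and runnable on a unix-like system.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(mkfifo);
use File::Temp qw(tempdir);

# A temp dir stands in for /var/log; in real use you'd run
# `mkfifo /var/log/myapp.fifo` once and point the application at it.
my $dir  = tempdir(CLEANUP => 1);
my $fifo = "$dir/app.fifo";
mkfifo($fifo, 0600) or die "mkfifo $fifo: $!";

# Stand-in for the logging application: a child that writes a few lines.
my $pid = fork;
die "fork: $!" unless defined $pid;
if ($pid == 0) {
    open my $w, '>', $fifo or die "open $fifo for write: $!";
    print {$w} "INFO started\n", "DEBUG noise\n", "INFO done\n";
    close $w;
    exit 0;
}

# The filter: read from the fifo, keep what you want, discard the rest.
open my $r, '<', $fifo or die "open $fifo for read: $!";
my @kept = grep { !/DEBUG/ } <$r>;
close $r;
waitpid $pid, 0;

print @kept;    # these lines would normally go to a real logfile
```

Note that both opens block until the other end of the fifo is opened, which is why the reader and writer must run concurrently.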


Replies are listed 'Best First'.
Re: removing lines from a file
by izut (Chaplain) on Jun 22, 2005 at 18:54 UTC
    You have a few alternatives for this task, but you have to consider these points:
    - will the log file be rotated?
    - can the application that is writing the log write to a fifo?

    If you go with the fifo alternative, be careful that any log rotation application does NOT rotate the fifo itself. With the fifo you can easily discard the lines you don't want to store. If the application can output to STDOUT, you can redirect STDOUT to STDERR and read it (as STDIN is buffered) and then discard all the lines you don't care about.

      A program doesn't have to know whether it's writing to a FIFO or not, unless you're dealing with networked files, i.e. files on some NFS- or SMB-like share. The only issue I can see here is that opening a FIFO for writing usually requires a reader on the other end, so if your filter application crashes (or you don't start it) your program is likely to have issues (a SIGPIPE in the first case, hanging in the second). Another possible issue is the limited buffer between the two applications: you must ensure that your filter program keeps up and doesn't block the log producer.

      The FIFO solution leaves the duty to write the logs to the Perl filter, not to the original application, and the filter is likely to write regular files - something that a logrotate program should not be upset with.
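To make that concrete, here is a sketch of a filter that owns the real logfile and reopens it on SIGHUP, which is the conventional way logrotate tells a daemon to let go of a rotated file. The filenames and the SIGHUP convention are assumptions, not anything stated in the thread, and a temp dir stands in for /var/log so the sketch runs as-is.

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# The filter owns the real logfile, so logrotate can rotate it normally.
# A temp dir stands in for /var/log here.
my $dir     = tempdir(CLEANUP => 1);
my $logfile = "$dir/app.log";

my $out;
sub reopen_log {
    open $out, '>>', $logfile or die "Can't append to $logfile: $!";
    select((select($out), $| = 1)[0]);   # autoflush, so rotation never strands buffered lines
}
reopen_log();
$SIG{HUP} = \&reopen_log;    # logrotate's postrotate script would send this

# In real use this loop reads the fifo; two literal lines stand in for it.
for my $line ("INFO one\n", "INFO two\n") {
    print {$out} $line;
}
kill 'HUP', $$;              # simulate logrotate's postrotate signal
print {$out} "INFO after rotate\n";
close $out;
```

The matching logrotate config would simply `kill -HUP` the filter's pid in its postrotate script.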

      As for the redirection, I don't really understand how it should work. Redirecting STDOUT to STDERR means that you basically lose them all in the listener application. The pipeline

      producer | consumer
      links producer's STDOUT to consumer's STDIN, so the suggested redirection leaves you with an empty STDIN and nowhere to read log lines from.
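That STDOUT-to-STDIN link can be demonstrated from Perl itself with a piped open; here a tiny `perl -e` one-liner stands in for the log-producing application (the filter pattern is illustrative, not the OP's actual code):

```perl
use strict;
use warnings;

# A consumer's STDIN is the producer's STDOUT. A list-form piped open
# sets up the same link as `producer | consumer` on the shell.
# $^X is the path of the running perl interpreter.
open my $pipe, '-|', $^X, '-e', 'print "INFO up\n", "DEBUG x\n"'
    or die "Can't spawn producer: $!";

my @kept = grep { !/DEBUG/ } <$pipe>;
close $pipe;

print @kept;    # -> INFO up
```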


      Don't fool yourself.
      Writing to a fifo should be no problem: for the application, it's like writing to an ordinary file. But the log rotation is a good point; there you should indeed be careful.


        The FIFO idea seems to be the best solution for me here. I can point my application at a FIFO pipe and simply discard all the lines I would previously have tried to remove. As for the log rotation, it seems as though it could be as simple as 'real-time' rotation: I would base the logfile output on the time/date stamp prefixing each of the incoming lines, and create new files on the fly from those timestamps. Would an idea like this be the best solution for my problem?

        cheers, Ev
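That 'real-time' rotation could look roughly like this. The timestamp format (`YYYY-MM-DD hh:mm:ss`) and the filenames are assumptions to be adjusted to the application's actual prefix, and `@incoming` stands in for lines read off the fifo:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# A temp dir stands in for the log directory; @incoming stands in for
# the lines the filter would read from the fifo.
my $dir = tempdir(CLEANUP => 1);
my @incoming = (
    "2005-06-22 18:33:01 INFO first\n",
    "2005-06-22 23:59:59 INFO last of day\n",
    "2005-06-23 00:00:02 INFO new day\n",
);

my ($current_date, $out);
for my $line (@incoming) {
    # Assumed prefix format: YYYY-MM-DD at the start of each line.
    my ($date) = $line =~ /^(\d{4}-\d{2}-\d{2})/ or next;  # skip unstamped lines
    if (!defined $current_date or $date ne $current_date) {
        close $out if $out;                # finish the old day's file
        open $out, '>>', "$dir/app-$date.log"
            or die "Can't open $dir/app-$date.log: $!";
        $current_date = $date;
    }
    print {$out} $line;
}
close $out if $out;
```

Because the date comes from the line itself rather than the wall clock, lines that arrive a little late still land in the right day's file.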

