Two writers on the same file seem a Bad Thing to me if there is no coordination between them. And you basically can't tell the application to stop while you write, or be sure of a time slice to do your job (because, as you say, the file is growing fast).
You probably need a filter that does not work in-place. You should open another output file, write all the lines you want to keep to it, and eventually get rid of the original log file when you're sure you can do so. This way you won't leave room for data loss (e.g. truncating the file just after the application wrote an important log line, before you had a chance to read it). A sketch of this approach follows.
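Here's a minimal sketch of the filter-to-a-new-file idea; the log file name and the "drop DEBUG lines" filter are made up, so adapt both to your situation:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical names: adjust the log file name and the filter below.
    my $log      = 'application.log';
    my $filtered = "$log.filtered";

    open my $in,  '<', $log      or die "open $log: $!";
    open my $out, '>', $filtered or die "open $filtered: $!";

    while (my $line = <$in>) {
        # Keep everything except (say) DEBUG noise.
        print {$out} $line unless $line =~ /DEBUG/;
    }

    close $in;
    close $out or die "close $filtered: $!";

    # Replace the original only once the copy is safely on disk.
    # Beware: lines the application appends between the read and the
    # rename can still be lost, which is why coordination matters.
    rename $filtered, $log or die "rename: $!";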
Another option would be to use a pipe: the application writes data to the pipe, and you read from it and do your filtering. This requires either that you're able to divert log messages to standard error (for example), or that you use a named pipe. The first option could be tricky (maybe standard error is already used), and the second one requires extreme care, because if you don't open the named pipe for reading your application will hang.
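If you go the named-pipe route, the reader could look like the sketch below. The pipe path is made up, you'd have to point the application's logging at it, and the same hypothetical DEBUG filter stands in for yours. Note the ordering caveat: opening a FIFO blocks until both a reader and a writer are present.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use POSIX qw(mkfifo);

    # Hypothetical path: the application must be told to log here.
    my $fifo = '/tmp/app-log.fifo';

    # Create the FIFO if it's not already there.
    -p $fifo or mkfifo($fifo, 0600) or die "mkfifo $fifo: $!";

    # This open blocks until the application opens the other end for
    # writing; conversely, the application blocks until we read.
    open my $in, '<', $fifo or die "open $fifo: $!";

    # Filter and print the lines we want to keep.
    while (my $line = <$in>) {
        print $line unless $line =~ /DEBUG/;
    }

    close $in;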
I can try to elaborate once you show some interest in these options.

Flavio

In reply to Re: removing lines from a file by polettix