PerlMonks |
The idea is that you save the size of the file at the point you last checked it, and next time you skip that much data before you start looking for your string. That way you don't need to modify anything in the file.

The easiest way to do this is to let perl run forever: sleep for 2 hours, then repeat the loop. The old file size can then be kept in a plain variable. Beware: if you restart the script, you'll lose the offset. If you insist on running perl from crontab, you'll have to save the size in some file and read that file on startup to obtain the offset where searching should resume.

Note that there's still one catch here. Most likely the log file is rotated periodically, so your program should be smart enough to notice that the log file was rotated and reset the saved offset.

As to your original approach of replacing strings, that can also be done, but it requires opening the file in read-write mode: open(FILE, "+<mylog.file"). You would mark the position of the string you want to replace, seek to that position with the seek function, and then write the new string. Be careful not to change the length of the string, otherwise you'll corrupt other data as well. After you are done writing, seek back to the position where you stopped reading and continue. As you can see, this approach is somewhat complex, and it is also inefficient, because you'll scan through the same areas multiple times. Its only benefit is that log rotation is handled automatically.

In reply to Re^3: Search logs with crontab
by andal
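A minimal sketch of the offset-tracking idea described above, with the scan factored into a sub. The file name and pattern are placeholders, not anything from the original post; the long-running loop is shown in comments because it never returns.

```perl
use strict;
use warnings;

# Scan $logfile for /$pattern/ starting at byte $offset.
# Returns the matching lines and the new offset.
sub scan_from {
    my ($logfile, $pattern, $offset) = @_;
    open(my $fh, '<', $logfile) or die "open $logfile: $!";
    seek($fh, $offset, 0);                 # skip data already examined
    my @hits = grep { /$pattern/ } <$fh>;
    my $new_offset = tell($fh);            # remember how far we got
    close($fh);
    return (\@hits, $new_offset);
}

# Long-running variant: the offset lives in a plain variable,
# so it is lost if the script restarts.
# my $offset = 0;
# while (1) {
#     (my $hits, $offset) = scan_from("mylog.file", qr/ERROR/, $offset);
#     print @$hits;
#     sleep(2 * 60 * 60);                  # check again in two hours
# }
```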
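For the crontab variant, the offset must survive between runs. One possible sketch, assuming a hypothetical state file that holds the offset; rotation is detected exactly as the post suggests, by noticing that the log is now smaller than the saved offset:

```perl
use strict;
use warnings;

# Read the saved offset; 0 if the state file doesn't exist yet.
sub load_offset {
    my ($state) = @_;
    open(my $fh, '<', $state) or return 0;
    chomp(my $off = <$fh> // 0);
    close($fh);
    return $off;
}

sub save_offset {
    my ($state, $off) = @_;
    open(my $fh, '>', $state) or die "open $state: $!";
    print $fh "$off\n";
    close($fh);
}

# One cron run: resume at the saved offset, resetting it
# if the log was rotated (file now smaller than the offset).
sub check_log {
    my ($logfile, $state, $pattern) = @_;
    my $offset = load_offset($state);
    $offset = 0 if -s $logfile < $offset;    # log rotated: start over
    open(my $fh, '<', $logfile) or die "open $logfile: $!";
    seek($fh, $offset, 0);
    my @hits = grep { /$pattern/ } <$fh>;
    save_offset($state, tell($fh));
    close($fh);
    return @hits;
}
```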
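The read-write replacement could be sketched like this (the sub name and files are illustrative only). It follows the steps in the post: note the position, seek there, overwrite with a string of the same length, then seek back to where reading stopped.

```perl
use strict;
use warnings;

# Overwrite the first occurrence of $old in $file with $new.
# Both strings must be the same length, otherwise the data
# that follows would be corrupted.
sub patch_in_place {
    my ($file, $old, $new) = @_;
    die "lengths differ\n" unless length($old) == length($new);
    open(my $fh, '+<', $file) or die "open $file: $!";
    while (my $line = <$fh>) {
        if ((my $col = index($line, $old)) >= 0) {
            my $resume = tell($fh);                    # where reading stopped
            seek($fh, $resume - length($line) + $col, 0);
            print $fh $new;                            # same-length overwrite
            seek($fh, $resume, 0);                     # continue reading here
            last;
        }
    }
    close($fh);
}
```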