Hi,
On Windows I am using a nice piece of Perl code to tail a log file. It does a fine job until the file becomes very large (over 20 MB or so), at which point the Perl process starts to be a memory hog. I tried a Cygwin tail, but it locks the file so the original process can't write to it; this Perl code does NOT lock the file. Can someone suggest how I might grab new chunks of a large file efficiently? The current code looks like this:
while (1) {
    if (-s $file) {
        while (-s $file) {
            open(TF, $file) || last;           # reopen each pass; no lock held
            last if ((stat(TF))[7] < $curpos); # file shrank/rotated: bail out
            seek(TF, $curpos, 0);              # skip what was already read
            @lines = <TF>;                     # slurp everything new up to EOF
            $curpos = tell(TF);                # remember where we stopped
            foreach $nline (@lines) {
                $newline = substr $nline, 24;  # drop the 24-char prefix
                # write out $newline to tailfile
            }
            close(TF);
            sleep 1;
        }
    }
    else {
        sleep 5;
    }
}
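
For what it's worth, the direction I was thinking of is to keep the byte offset between passes but read one line at a time instead of slurping the whole unread region into @lines, so the process never holds more than one line in memory. Here is an untested sketch of that idea ($logfile and $tailfile are placeholder paths):

#!/usr/bin/perl
use strict;
use warnings;

my $logfile  = 'C:/logs/app.log';    # file being written by the other process
my $tailfile = 'C:/logs/app.tail';   # where the stripped lines go
my $curpos   = 0;                    # byte offset of the last line we handled

while (1) {
    if (-s $logfile) {
        open(my $tf, '<', $logfile) or do { sleep 1; next };
        $curpos = 0 if (-s $tf) < $curpos;    # truncated/rotated: start over
        seek($tf, $curpos, 0);                # skip what was already read
        open(my $out, '>>', $tailfile) or die "can't append to $tailfile: $!";
        while (my $line = <$tf>) {            # one line in memory at a time
            print $out substr($line, 24);     # drop the 24-char prefix
        }
        $curpos = tell($tf);                  # resume here on the next pass
        close($out);
        close($tf);
        sleep 1;
    }
    else {
        sleep 5;                              # file missing or empty: wait longer
    }
}

Since only the current line is ever in memory, catching up on a 20 MB backlog should cost about the same as tailing a small file. Is that the right approach, or is there something better?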
Thanks