I'm currently performing a lot of manipulation of log files, which grow by up to 100MB/day. The records in the files are of the format
and I want to be able to trim out the older records. Currently, I do this by reading the file line by line and writing every line from $arbitrary_data onward to a temporary file, which then replaces the original log file.
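Roughly, my current approach looks like the sketch below (shown in Python purely for illustration; `trim_log` and the marker check are stand-ins for the real timestamp comparison I do against $arbitrary_data):

```python
import os
import tempfile

def trim_log(path, cutoff_marker):
    """Keep only the lines from the first line containing cutoff_marker
    onward: stream them into a temp file in the same directory, then
    atomically swap the temp file in for the original log."""
    keep = False
    log_dir = os.path.dirname(os.path.abspath(path))
    with open(path) as src, tempfile.NamedTemporaryFile(
            "w", dir=log_dir, delete=False) as tmp:
        for line in src:
            # Once the cutoff line is seen, copy everything that follows.
            if not keep and cutoff_marker in line:
                keep = True
            if keep:
                tmp.write(line)
    os.replace(tmp.name, path)  # atomic rename on POSIX
```

The temp file is created in the same directory as the log so the final rename stays on one filesystem, but the whole file still gets read and rewritten on every trim, which is where the time goes.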
This takes a considerable amount of time, so any suggestions for a more efficient method would be very gratefully received.