http://www.perlmonks.org?node_id=139294

deprecated has asked for the wisdom of the Perl Monks concerning the following question:

I run two services that produce log files on the scale of multiple gigabytes: Postgres and opennap (http://opennap.sourceforge.net/opennap). Taking huge zipped-up tarballs of these logs and parsing them during off hours or when I come home from work gets tedious. I'd rather just have a perl daemon set up that reads them as they're generated and stuffs them into a database in the format I want.

So the approach I came up with was using mkfifo(1) and this eensy-weensy loop:

#!/usr/local/bin/perl
open FIFO, "<foo" or die "$!\n";
while (<FIFO>) {
    chomp;
    print "$_\n";
}
The problem is, this doesn't actually stay open and alive. It dies after the first "echo 'hello' > foo".

Is there a way I can make perl "listen" to a file and actually have it keep parsing data and doing inserts on a db?
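For reference, here is a minimal sketch of one common approach (not from the original post): wrap the open in an outer loop. Reads from a FIFO return EOF as soon as the last writer closes its end, so instead of exiting, the reader just reopens the FIFO, which blocks until the next writer appears. The filename "foo" and the print are placeholders for the real parse-and-insert work.

#!/usr/local/bin/perl
use strict;
use warnings;

# Sketch only: reopen the FIFO whenever the current writer closes it.
while (1) {
    # open() blocks here until some process opens the FIFO for writing.
    open my $fifo, '<', 'foo' or die "can't open foo: $!\n";
    while (my $line = <$fifo>) {
        chomp $line;
        # parse the log line and do the database insert here
        print "$line\n";
    }
    # EOF: the writer closed its end; loop around and reopen.
    close $fifo;
}

Another option sometimes suggested is opening the FIFO read-write ('+<'), so the reading process itself counts as a writer and reads block rather than hitting EOF; the reopen loop above is the more explicit of the two.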

thanks,
brother dep.

--
Laziness, Impatience, Hubris, and Generosity.