Well, for the first time, I'm having to really look at performance in a Perl program, and am a bit stuck.
Basically, I'm trying to capture NetFlow information from a Cisco router. It's quite active, spitting out 30+ UDP export packets a second at peak times, each carrying 30 flows, for around 1,000 flows a second. I can capture these packets (using Tony Hariman's fdgetall as a base), but as soon as I start to unpack them and do anything useful with them, I start dropping packets.

I seem to do a much better job by keeping an internal buffer of packet data and, upon reaching an arbitrary count of, say, 1,000 packets, forking (zeroing the buffer in the parent) and letting the child process the data. How is such a program typically written?
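To make the buffer-and-fork idea concrete, here is a minimal sketch. The 24-byte header / 48-byte record layout is Cisco's documented NetFlow v5 export format, but everything else (port 9995, the 1,000-packet batch size, sub names) is illustrative rather than anything fdgetall actually does:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::INET;

# Parse one NetFlow v5 export packet: a 24-byte header (version,
# count, uptime, timestamps, sequence, engine info) followed by
# <count> 48-byte flow records.
sub parse_v5_packet {
    my ($pkt) = @_;
    my ($version, $count) = unpack 'n n', $pkt;
    return () unless defined $version && $version == 5;
    my @flows;
    for my $i (0 .. $count - 1) {
        my $rec = substr $pkt, 24 + $i * 48, 48;
        last if length($rec) < 48;    # truncated packet, stop early
        my ($src, $dst, $nexthop, $in_if, $out_if,
            $pkts, $octets, $first, $last,
            $sport, $dport, $pad, $tcp_flags, $proto, $tos)
            = unpack 'N3 n2 N4 n2 C4', $rec;
        push @flows, {
            src    => join('.', unpack 'C4', pack 'N', $src),
            dst    => join('.', unpack 'C4', pack 'N', $dst),
            sport  => $sport,
            dport  => $dport,
            proto  => $proto,
            pkts   => $pkts,
            octets => $octets,
        };
    }
    return @flows;
}

# The buffer-and-fork loop described above: the parent only does
# recv() and push(), so it spends as little time as possible away
# from the socket; the child does the expensive unpacking.
sub collect_loop {
    my $sock = IO::Socket::INET->new(
        LocalPort => 9995,            # illustrative export port
        Proto     => 'udp',
    ) or die "bind: $!";
    $SIG{CHLD} = 'IGNORE';            # auto-reap worker children
    my (@buffer, $pkt);
    while (defined $sock->recv($pkt, 1548)) {
        push @buffer, $pkt;
        next if @buffer < 1000;       # arbitrary batch size, as in the post
        my $pid = fork;
        die "fork: $!" unless defined $pid;
        if ($pid == 0) {              # child: do the slow work
            parse_v5_packet($_) for @buffer;   # then store, sum, etc.
            exit 0;
        }
        @buffer = ();                 # parent: zero the buffer, keep reading
    }
}
```

A possible alternative to forking per batch is a single persistent worker child fed over a pipe, which avoids repeated fork overhead; either way the key point is that the receiving process never blocks on the unpacking.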
Ultimately, I'd like to store the flows in a MySQL database, with a daemon combining, say, flows over X hours old into 5-minute averages, etc., to keep table sizes and searches reasonable.
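For the roll-up side, one sketch under stated assumptions: bucket each flow's timestamp to a 5-minute boundary and let the daemon collapse old raw rows with a single `INSERT ... SELECT`. The `flows`/`flow_5min` tables and their columns are hypothetical, and `2 HOUR` stands in for the X-hour threshold:

```perl
use strict;
use warnings;

# Map an epoch timestamp to the start of its 5-minute bucket.
sub bucket_5min {
    my ($ts) = @_;
    return int($ts / 300) * 300;      # 300 seconds = 5 minutes
}

# Roll-up SQL a daemon might run via DBI every few minutes;
# table and column names here are hypothetical.
my $rollup_sql = <<'SQL';
INSERT INTO flow_5min (bucket, src, dst, proto, pkts, octets)
SELECT FROM_UNIXTIME(FLOOR(UNIX_TIMESTAMP(seen) / 300) * 300),
       src, dst, proto, SUM(pkts), SUM(octets)
  FROM flows
 WHERE seen < NOW() - INTERVAL 2 HOUR
 GROUP BY FLOOR(UNIX_TIMESTAMP(seen) / 300), src, dst, proto
SQL
```

After the roll-up succeeds, the daemon would delete the raw rows it just summarized (same `WHERE` clause) so the `flows` table stays small while `flow_5min` keeps the history.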
2001-06-16 Edit by Corion : Changed title