Re: Dealing with huge log files 100 MB to 1 GB

by RMGir (Prior)
on May 17, 2010 at 11:59 UTC ( [id://840338] )


in reply to Dealing with huge log files 100 MB to 1 GB

This isn't useful for statistics, but if you need to quickly pull up sections of the large files to diagnose things like "what happened around time x?", File::SortedSeek is incredibly useful.
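
For example, here's a rough sketch of that use, assuming File::SortedSeek's find_time() and get_between() functions as I recall the CPAN interface (check the docs for exact semantics); the filename and time window are placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::SortedSeek ':all';

    # Placeholder file; lines must begin with timestamps in ascending
    # order (e.g. ctime-style syslog dates) for find_time() to parse.
    my $file  = '/var/log/big.log';
    my $start = time() - 3600;    # "what happened around time x?"
    my $end   = time() - 1800;

    open my $fh, '<', $file or die "Can't open $file: $!";

    # Binary-search for the tell() positions bracketing the window...
    my $from = find_time( $fh, $start );
    my $to   = find_time( $fh, $end );

    # ...then pull back only the lines inside it.
    print for get_between( $fh, $from, $to );

The point is that find_time() seeks by binary search, so it touches only a handful of spots in the file instead of scanning the whole gigabyte.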

For your current problem, as everyone else has said, "precompute" is probably the best answer.
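
One way to precompute, sketched below: aggregate once, remember the byte offset with tell(), and let later runs pick up where the last one stopped. The filenames and the per-line regex are assumptions about your log format:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $log   = '/var/log/big.log';           # placeholder paths
    my $state = '/var/tmp/big.log.offset';

    # Resume from the offset saved by the previous run, if any.
    my $offset = 0;
    if ( open my $st, '<', $state ) {
        chomp( $offset = <$st> // 0 );
    }

    open my $fh, '<', $log or die "Can't open $log: $!";
    seek $fh, $offset, 0;    # a real version should handle log rotation

    my %count;
    while (<$fh>) {
        # Hypothetical format: "2010-05-17 11:59:00 STATUS ..."
        $count{$1}++ if /^\S+ \S+ (\w+)/;
    }

    # Save the new offset so the next run only scans fresh lines.
    open my $out, '>', $state or die "Can't write $state: $!";
    print {$out} tell($fh), "\n";

    printf "%-10s %d\n", $_, $count{$_} for sort keys %count;

These are per-run counts; you'd merge them into whatever running totals your statistics need.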

You could probably do something complicated to map/reduce the statistics gathering in parallel across sections of the file, but it's probably not worth it.
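
If you did want to try it, a rough sketch (using Parallel::ForkManager, which isn't mentioned above, and the same hypothetical regex) might split the file into byte ranges and merge the per-child counts:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Parallel::ForkManager;

    my $log    = '/var/log/big.log';    # placeholder path
    my $chunks = 4;
    my $size   = -s $log or die "Can't stat $log";

    my %total;
    my $pm = Parallel::ForkManager->new($chunks);

    # "Reduce": merge each child's counts as it exits.
    $pm->run_on_finish( sub {
        my $counts = $_[5] or return;
        $total{$_} += $counts->{$_} for keys %$counts;
    } );

    for my $i ( 0 .. $chunks - 1 ) {
        $pm->start and next;    # child code from here down

        my $from = int( $i * $size / $chunks );
        my $to   = int( ( $i + 1 ) * $size / $chunks );

        open my $fh, '<', $log or die "Can't open $log: $!";
        if ($from) {
            # Start at the first full line inside this chunk; the
            # line straddling $from belongs to the previous chunk.
            seek $fh, $from - 1, 0;
            <$fh>;
        }

        # "Map": tally status words until we pass the chunk end.
        my %count;
        while (<$fh>) {
            $count{$1}++ if /^\S+ \S+ (\w+)/;
            last if tell($fh) >= $to;
        }
        $pm->finish( 0, \%count );
    }
    $pm->wait_all_children;

    printf "%-10s %d\n", $_, $total{$_} for sort keys %total;

Since all the children are reading the same disk, the speedup is capped by I/O, which is a big part of why it's usually not worth the complexity.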


Mike