Re: Dealing with huge log files 100 MB to 1 GB

by RMGir (Prior)
on May 17, 2010 at 11:59 UTC


in reply to Dealing with huge log files 100 MB to 1 GB

This isn't useful for statistics, but if you need to quickly pull up sections of a large file to diagnose things like "what happened around time X?", File::SortedSeek is incredibly useful; a rough example is sketched below.
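For instance, something like this (an untested sketch; the log name, timestamp format, and time window are all made up, so check the File::SortedSeek docs for the munging hooks your format actually needs):

    use strict;
    use warnings;
    use File::SortedSeek;

    # Hypothetical log whose lines start with a localtime()-style
    # timestamp, e.g. "Mon May 17 11:59:00 2010 something happened".
    my $log = 'huge.log';

    open my $fh, '<', $log or die "Can't open $log: $!";

    # Binary-search for the first line at or after the target epoch
    # time. find_time() returns the byte offset of that line; for logs
    # stamped with raw epoch seconds, File::SortedSeek::numeric() does
    # the same job.
    my $target = time() - 3600;    # "what happened an hour ago?"
    my $tell   = File::SortedSeek::find_time( $fh, $target );
    defined $tell or die "no line at or after the target time\n";

    seek $fh, $tell, 0;
    for ( 1 .. 20 ) {              # grab a window of lines from there
        defined( my $line = <$fh> ) or last;
        print $line;
    }
    close $fh;

The point is that the binary search only touches O(log n) lines of the file instead of scanning the whole thing.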

For your current problem, as everyone else has said, "precompute" is probably the best answer.

You could probably do something more complicated and map/reduce the statistics gathering in parallel across sections of the file, but it's probably not worth it. A rough sketch of the idea follows.
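If you did want to try it, fork-per-chunk is the obvious shape. Here's an untested sketch (the file name, pattern, and worker count are hypothetical, and it only counts matching lines rather than gathering real statistics):

    use strict;
    use warnings;

    my $file    = 'huge.log';    # hypothetical inputs
    my $pattern = qr/ERROR/;
    my $workers = 4;

    my $size  = -s $file or die "Can't stat $file: $!";
    my $chunk = int( $size / $workers ) + 1;

    # "Map": fork one worker per byte range, each reporting a
    # partial count back through a pipe.
    my @pipes;
    for my $i ( 0 .. $workers - 1 ) {
        my $pid = open( my $from_kid, '-|' );  # fork; child's STDOUT -> pipe
        die "fork failed: $!" unless defined $pid;
        if ($pid) {
            push @pipes, $from_kid;            # parent: remember the pipe
            next;
        }

        # Child: scan only our slice of the file.
        open my $fh, '<', $file or die "Can't open $file: $!";
        my ( $start, $end ) = ( $i * $chunk, ( $i + 1 ) * $chunk );
        if ($start) {
            seek $fh, $start, 0;
            <$fh>;    # discard the partial line; the previous worker owns it
        }
        my $count = 0;
        while (<$fh>) {
            $count++ if /$pattern/;
            last if tell($fh) > $end;    # past our range
        }
        print "$count\n";
        exit 0;
    }

    # "Reduce": sum the partial counts.
    my $total = 0;
    for my $p (@pipes) {
        $total += <$p>;
        close $p;
    }
    print "total matches: $total\n";

Whether the forking and double-counting-avoidance bookkeeping beats a single sequential pass depends heavily on your disks; on one spindle the seeks may well make it slower.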


Mike


