PerlMonks  

Re: Dealing with huge log files 100 MB to 1 GB

by RMGir (Prior)
on May 17, 2010 at 11:59 UTC ( #840338=note )


in reply to Dealing with huge log files 100 MB to 1 GB

This isn't useful for statistics, but if you need to quickly pull up sections of the large files to diagnose things like "what happened around time x?", File::SortedSeek is incredibly useful.
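A minimal sketch of that approach (assumptions: the log is sorted by time, each line begins with a timestamp the module can parse, and `find_time` accepts an epoch time and leaves the handle positioned at the first matching line — check the File::SortedSeek docs for the exact formats it supports; `huge.log` is a hypothetical file name):

```perl
use strict;
use warnings;
use File::SortedSeek;    # CPAN module; binary-searches time-sorted files

# Hypothetical example: jump to "what happened an hour ago?" without
# reading the whole file.
open my $fh, '<', 'huge.log' or die "huge.log: $!";

# Assumed usage: find_time() binary-searches the file and seeks the
# handle to the first line at or after the target time (epoch seconds).
my $target = time() - 3600;
File::SortedSeek::find_time( $fh, $target );

# Read a window of lines from that point to see the surrounding context.
for ( 1 .. 20 ) {
    defined( my $line = <$fh> ) or last;
    print $line;
}
close $fh;
```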

For your current problem, as everyone else has said, "precompute" is probably the best answer.

You could probably do something complicated to map/reduce the statistics gathering in parallel across sections of the file, but it's probably not worth it.
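For the curious, here is a hypothetical sketch of what that could look like with nothing but core Perl: split the file into byte ranges, fork one child per range to count matching lines (the "map"), and sum the per-range counts in the parent over pipes (the "reduce"). The function name, file name, and pattern are all made up for illustration; the fiddly part is making sure a line straddling a range boundary is counted exactly once, which is handled by having each child (after the first) discard the partial line at its start and stop at the first line that begins past its end.

```perl
use strict;
use warnings;

# Hypothetical fork-based map/reduce over byte ranges of a log file.
sub count_matches_parallel {
    my ( $file, $workers, $pattern ) = @_;
    my $size  = -s $file or die "empty or missing $file";
    my $chunk = int( $size / $workers ) + 1;

    my @readers;
    for my $i ( 0 .. $workers - 1 ) {
        pipe my $r, my $w or die "pipe: $!";
        my $pid = fork;
        die "fork: $!" unless defined $pid;
        if ( $pid == 0 ) {    # child: scan one byte range
            close $r;
            open my $fh, '<', $file or die "$file: $!";
            my ( $start, $end ) = ( $i * $chunk, ( $i + 1 ) * $chunk );
            if ($start) {
                # Discard the line straddling our start boundary;
                # the previous child reads it to completion.
                seek $fh, $start - 1, 0;
                <$fh>;
            }
            my $count = 0;
            while ( my $line = <$fh> ) {
                # A line belongs to this range iff it *starts* inside it.
                last if tell($fh) - length($line) >= $end;
                $count++ if $line =~ $pattern;
            }
            print {$w} "$count\n";
            exit 0;
        }
        close $w;
        push @readers, $r;
    }

    # Reduce: sum the per-range counts, then reap the children.
    my $total = 0;
    for my $r (@readers) {
        chomp( my $n = <$r> );
        $total += $n;
    }
    wait for 1 .. $workers;
    return $total;
}

print count_matches_parallel( 'huge.log', 4, qr/ERROR/ ), "\n"
    if -e 'huge.log';
```

Whether this beats a single sequential pass depends on the box: if the file is cold on a single spinning disk, four readers seeking over it can easily be *slower* than one — which is part of why it's "probably not worth it."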


Mike

