PerlMonks  

Re: Dealing with huge log files 100 MB to 1 GB

by RMGir (Prior)
on May 17, 2010 at 11:59 UTC (#840338)


in reply to Dealing with huge log files 100 MB to 1 GB

This isn't useful for statistics, but if you need to quickly pull up sections of a large file to diagnose things like "what happened around time X?", File::SortedSeek is incredibly useful.
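File::SortedSeek (on CPAN) does the binary search for you; the sketch below hand-rolls the same idea so it runs with no modules installed. The log format here (an ascending epoch-seconds prefix on each line) and the timestamps are invented for illustration:

```perl
use strict;
use warnings;

# Return the offset of the first full line starting at or after byte $b.
sub linestart {
    my ($fh, $b) = @_;
    if ($b == 0) { seek $fh, 0, 0; return 0; }
    seek $fh, $b - 1, 0;
    my $skip = <$fh>;    # read through to the next newline
    return tell $fh;
}

# Binary-search a file of ascending "EPOCH message" lines for the first
# line whose timestamp is >= $target; leaves $fh positioned on that line.
sub seek_to_time {
    my ($fh, $size, $target) = @_;
    my ($lo, $hi) = (0, $size);
    while ($lo < $hi) {
        my $mid = int( ($lo + $hi) / 2 );
        my $s   = linestart($fh, $mid);
        my $ok;
        if ($s >= $size) {
            $ok = 1;    # nothing at or after $mid
        } else {
            my ($ts) = <$fh> =~ /^(\d+)/;
            $ok = $ts >= $target;
        }
        $ok ? ($hi = $mid) : ($lo = $mid + 1);
    }
    my $s = linestart($fh, $lo);
    seek $fh, $s, 0;
    return $s;
}

# Demo on an in-memory "log"; for a real file use open and -s instead.
my $log = "1000 start\n1005 warn disk\n1010 error net\n1020 recover\n";
open my $fh, '<', \$log or die $!;
seek_to_time($fh, length $log, 1010);
print scalar <$fh>;    # prints "1010 error net"
```

The point is that each probe costs one seek plus one line read, so finding "around time X" in a 1 GB file takes a few dozen reads instead of a scan.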

For your current problem, as everyone else has said, "precompute" is probably the best answer.

You could probably do something complicated to map/reduce the statistics gathering in parallel across sections of the file, but it's probably not worth it.
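If you did want to try it, the "something complicated" doesn't have to be very complicated: fork one child per byte range, let each count matches in its chunk, and sum the counts over pipes. A sketch under invented assumptions (the sample log, the /ERROR/ pattern, and 4 workers are all made up; the rule is that a line belongs to the chunk containing its first byte):

```perl
use strict;
use warnings;

# Build a small sample log so the sketch runs standalone; in real use
# $file would be your actual log and the pattern whatever you count.
my $file = 'sample.log';
open my $out, '>', $file or die $!;
print {$out} ($_ % 3 ? "$_ ok\n" : "$_ ERROR something\n") for 1 .. 99;
close $out;

my $nprocs = 4;
my $size   = -s $file;

# Map: one child per byte range. Every child except the first seeks to
# $start - 1 and discards through the next newline, so partial lines are
# handled by exactly one worker.
my @pipes;
for my $i (0 .. $nprocs - 1) {
    my ($start, $end) = ( int($size * $i / $nprocs),
                          int($size * ($i + 1) / $nprocs) );
    pipe my $r, my $w or die $!;
    my $pid = fork // die $!;
    if ($pid == 0) {    # child
        close $r;
        open my $fh, '<', $file or die $!;
        if ($start > 0) {
            seek $fh, $start - 1, 0;
            my $skip = <$fh>;    # partial line; owned by the previous chunk
        }
        my $count = 0;
        while (tell($fh) < $end) {
            my $line = <$fh>;
            last unless defined $line;
            $count++ if $line =~ /ERROR/;
        }
        print {$w} "$count\n";
        exit 0;
    }
    close $w;
    push @pipes, $r;
}

# Reduce: sum the per-chunk counts.
my $total = 0;
for my $r (@pipes) {
    chomp( my $count = <$r> );
    $total += $count;
}
1 while wait != -1;
print "ERROR lines: $total\n";    # prints "ERROR lines: 33"
unlink $file;
```

Whether this beats a single sequential pass depends on your disks: on one spinning disk the seeks from four readers can easily make it slower, which is why it's probably not worth it.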


Mike


