Re: Dealing with huge log files 100 MB to 1 GB

by RMGir (Prior)
on May 17, 2010 at 11:59 UTC


in reply to Dealing with huge log files 100 MB to 1 GB

This won't help with statistics, but if you need to quickly pull up sections of a large file to diagnose things like "what happened around time x?", File::SortedSeek is incredibly useful.
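A rough sketch of the lookup, assuming log lines that begin with a timestamp (the file name, target time, and window size are made up, and the exact calling conventions are worth checking against the File::SortedSeek docs):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::SortedSeek;
    use Time::Local;

    my $file = 'huge.log';    # assumption: big log, sorted by time
    open my $log, '<', $file or die "Can't open $file: $!";

    # "what happened around 2010-05-17 11:59:00?"
    my $when = timelocal( 0, 59, 11, 17, 4, 110 );

    # Binary search instead of a linear scan; the filehandle is left
    # positioned so the next read is the first line at or after $when.
    File::SortedSeek::find_time( $log, $when );

    # Pull a window of lines around the event.
    for ( 1 .. 20 ) {
        defined( my $line = <$log> ) or last;
        print $line;
    }
    close $log;

Since it seeks rather than scans, this answers "around time x" questions on a gigabyte file in a handful of reads.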

For your current problem, as everyone else said, "precompute" is probably the best answer.
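For example, a nightly pass (the file name and line format here are assumptions) can bucket the log into per-hour counts, so later questions hit a tiny summary file instead of the gigabyte original:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %hits;
    open my $log, '<', 'huge.log' or die "Can't open log: $!";
    while (<$log>) {
        # assumption: lines start like "2010-05-17 11:59:03 ..."
        my ($hour) = /^(\d{4}-\d{2}-\d{2} \d{2})/ or next;
        $hits{$hour}++;
    }
    close $log;

    open my $out, '>', 'huge.log.hourly' or die "Can't write summary: $!";
    print {$out} "$_:00 $hits{$_}\n" for sort keys %hits;
    close $out;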

You could do something more complicated and map/reduce the statistics gathering in parallel across sections of the file, but it's probably not worth it.
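If you did want to try it, the shape is roughly this (the file name, worker count, and the /ERROR/ "statistic" are all placeholders): each forked child counts matches in its own byte range, resyncing to a line boundary first, and the parent sums the partial counts.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $file   = 'huge.log';    # assumption: the big log
    my $nprocs = 4;             # assumption: worker count
    my $size   = -s $file or die "empty or missing $file";
    my $chunk  = int( $size / $nprocs );

    my @readers;
    for my $i ( 0 .. $nprocs - 1 ) {
        pipe my $r, my $w or die "pipe: $!";
        my $pid = fork;
        defined $pid or die "fork: $!";
        if ($pid) {             # parent keeps the read end
            close $w;
            push @readers, $r;
            next;
        }

        # child: count matching lines in byte range [$start, $end)
        close $r;
        my $start = $i * $chunk;
        my $end   = $i == $nprocs - 1 ? $size : ( $i + 1 ) * $chunk;
        open my $fh, '<', $file or die "Can't open $file: $!";
        if ($start) {
            seek $fh, $start - 1, 0;
            <$fh>;              # discard up to the next line boundary;
        }                       # the straddling line belongs to worker $i-1
        my $count = 0;
        while ( tell($fh) < $end and defined( my $line = <$fh> ) ) {
            $count++ if $line =~ /ERROR/;    # stand-in for the real stat
        }
        print {$w} "$count\n";
        exit 0;
    }

    my $total = 0;
    for my $r (@readers) {
        my $part = <$r>;
        $total += $part;
    }
    wait for 1 .. $nprocs;
    print "total: $total\n";

On a single disk the scan is usually I/O-bound anyway, which is why it's rarely worth the trouble.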


Mike


