Re: how apply large memory with perl?
by BrowserUk (Pope) on Aug 08, 2012 at 10:52 UTC
When loading large volumes of data, a little care can go a very long way.
All commands wrapped for clarity!
And the same or similar techniques can be used for almost every aggregate-population task; it is just a case of knowing when to use them.
Most of the time we don't bother because our data sizes are such that it isn't worth the (small) effort; but it behooves us to know when the small extra effort will pay big dividends.
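One example of that small extra effort (a sketch only; the variable names and element count here are illustrative, not from the original post) is presizing Perl's aggregates before a bulk load, so the hash does not repeatedly double and re-split its buckets and the array does not repeatedly reallocate as it grows:

```perl
use strict;
use warnings;

# Assumption for illustration: we know (roughly) how many records
# we are about to load.
my $expected = 1_000_000;

# Presize the hash: assigning to keys() as an lvalue tells perl to
# allocate at least this many buckets now, avoiding repeated
# rehashing during the bulk load.
my %lookup;
keys( %lookup ) = $expected;

# Pre-extend the array: one allocation up front instead of many
# incremental grows.
my @records;
$#records = $expected - 1;

# Then populate as usual; the storage is already in place.
$records[ $_ ] = $_ * 2       for 0 .. 9;
$lookup{ "key$_" } = $_        for 0 .. 9;
```

The payoff is proportional to the data size: for a few thousand elements it is unmeasurable, but for tens of millions it avoids a long series of reallocations and rehashes.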
As for the blogger; quite why he feels the need to load all his DB-held data into his program in order to do bread and butter SQL queries is beyond me.
Whilst I don't entirely disagree with the premise that there are times when Perl isn't the right choice, making the fundamental error of pulling all his DB-held data into Perl in order to perform processing that he himself describes as "all you need is sum(field), count(field) where date between date1 and date2" just makes me doubt the veracity of his conclusions.
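For a query like that, the right move is to let the database do the aggregation and pull back only the two numbers, not every row. A minimal DBI sketch (the table, columns, and dates are hypothetical; an in-memory SQLite database stands in for whatever store the blogger was using):

```perl
use strict;
use warnings;
use DBI;

# Hypothetical schema and data, purely for illustration.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1 } );
$dbh->do( 'CREATE TABLE sales ( sale_date TEXT, amount REAL )' );
$dbh->do( 'INSERT INTO sales VALUES ( ?, ? )', undef, @$_ )
    for [ '2012-01-15', 10 ], [ '2012-03-01', 20 ], [ '2012-09-09', 99 ];

# The aggregation happens inside the database engine; only two
# scalars cross the wire, however many rows are in the range.
my ( $sum, $count ) = $dbh->selectrow_array( <<'SQL', undef, '2012-01-01', '2012-06-30' );
    SELECT SUM( amount ), COUNT( amount )
    FROM   sales
    WHERE  sale_date BETWEEN ? AND ?
SQL

printf "sum=%s count=%s\n", $sum // 0, $count // 0;
```

Fetching millions of rows into Perl just to add them up pays the transfer, parsing, and memory cost for work the database engine already does far more cheaply.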
With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.