in reply to Re^2: Handling large amounts of data in a perl script
in thread Handling large amounts of data in a perl script

Okay, so you're dealing with about 10^5 * 10^3 * 10 items, i.e. roughly 10^9 -- a giga-item. On a sufficiently powerful machine you can fit them all in memory at once if they're plain integers stored compactly (~4GB at 32 bits each, ~8GB at 64 bits). If they're not -- say, if you make them "objects" -- the per-item overhead means they won't fit comfortably.

Now you have to ask yourself whether you need to process them all at once or can handle them sequentially. If you need them all at once (e.g. sorting), you'll have to do something clever. Otherwise (e.g. finding the mean), you can just stream through them one at a time, updating a little running state in your program, e.g.

    while (defined(my $age = next_age())) {
        $total += $age;
        $n++;
    }
    print "average = ", $total / $n, "\n";

(Note the defined() test: a bare while ($age = next_age()) would stop early on any false value, such as an age of 0.)