http://www.perlmonks.org?node_id=1069849


in reply to Handling large amounts of data in a perl script

That sounds like some serious over-engineering for just averaging 100 ages. Do you actually have something much larger and more complex in mind?

Re^2: Handling large amounts of data in a perl script
by sjwnih111 (Novice) on Jan 08, 2014 at 19:27 UTC
    I'll be working with about 100,000 directories, each containing several subdirectories that each hold a few thousand files or fewer. The calculations themselves will be simple.
      How is the large number of dirs and files relevant to "making each person an object" or to educated_foo's observation and question?
      Come, let us reason together: Spirit of the Monastery
      Okay, so you're dealing with about 10^5 * 10^3 * 10 items, i.e. about a giga-item. On a sufficiently powerful machine you can fit them all in memory at once if they're just raw integers (~4GB at 32 bits, ~8GB at 64 bits). If they're anything heavier, say if you make each one an "object," they won't fit comfortably.
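      To put the footprint in perspective: packing the values into one long string with pack keeps them at 4 bytes apiece, while the same values held as individual Perl scalars in an array cost dozens of bytes each, and blessed objects considerably more. A minimal sketch (the data here is a stand-in):

          use strict;
          use warnings;

          my @ages   = (31, 27, 45);        # stand-in data
          my $packed = pack 'l*', @ages;    # 4 bytes per 32-bit integer

          # Random access without unpacking the whole string:
          my $third = unpack 'l', substr($packed, 2 * 4, 4);
          print "third age = $third\n";     # prints 45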

      Now you have to ask yourself whether you need to process them all at once or sequentially. If you have to process them all at once (e.g. sorting), you'll have to do something clever. Otherwise (e.g. finding the mean), you can just run through them one at a time, updating some state in your program, e.g.

          my ($sum, $n) = (0, 0);
          while (defined(my $age = next_age())) {   # defined(), so an age of 0 doesn't end the loop
              $sum += $age;
              $n++;
          }
          print "average = ", $sum / $n, "\n" if $n;
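
      The next_age() iterator above is an undefined placeholder; one way to drive the same running total straight off the directory tree is the core File::Find module. A sketch, assuming one integer age per line in each file (the top-level path is a placeholder for your own):

          use strict;
          use warnings;
          use File::Find;

          my ($sum, $n) = (0, 0);
          find(sub {
              return unless -f;             # files only
              open my $fh, '<', $_
                  or die "can't open $File::Find::name: $!";
              while (my $line = <$fh>) {
                  next unless $line =~ /^(\d+)/;   # keep only integer ages
                  $sum += $1;
                  $n++;
              }
              close $fh;
          }, '/path/to/top');

          print "average = ", $sum / $n, "\n" if $n;

      Because File::Find visits one file at a time, memory use stays flat no matter how many of the 100,000 directories sit under the top path.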