http://www.perlmonks.org?node_id=1006783


in reply to Re: "Out of memory" problem
in thread "Out of memory" problem

Agree with BrowserUK ... 500 million integers is a lot to index, and if you aren’t actually searching for anything, building an index just to get the records back in sorted order is pure overhead.   A good external sorting package, on the other hand, would have no particular difficulty with it.
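
For illustration, here is a minimal Perl sketch of handing the job off to an external sort.   It assumes GNU sort is on the PATH, that the integers sit one per line in a hypothetical integers.txt, and that plain numeric order is what you want:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical file names -- adjust to your own.
    my $infile  = 'integers.txt';          # one integer per line
    my $outfile = 'integers.sorted.txt';

    # Let the external sort utility do the heavy lifting; it spills to
    # temporary files on disk, so it never has to hold all 500 million
    # integers in RAM at once.  -n asks for numeric order.
    system('sort', '-n', '-o', $outfile, $infile) == 0
        or die "external sort failed: $?";

The sort utility worries about memory limits and temporary files for you, which is exactly the kind of drudgery you don’t want to re-invent.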

Ideally, you would arrange the whole data-processing flow that includes this file so that everything is put into a known sort-sequence early, and every subsequent step preserves that order.   So you might have a 500-million-record master-file which is simply “known to be” sorted, and you manipulate that file only in ways that require it to be that way and that keep it that way.   This avoids searching, it avoids repetitive sorting, and it avoids indexes and the overhead that comes with them.   At the same time, though, you do not want to schlep a bunch of data through disk-reads and disk-writes if you are not actually doing anything with most of it.
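
As a rough sketch of what “keep it that way from one step to the next” can look like, here is the classic two-way merge of an already-sorted master file with an already-sorted updates file, reading each exactly once.   The file names are made up, and both inputs are assumed to hold one integer per line:

    #!/usr/bin/perl
    use strict;
    use warnings;

    open my $master, '<', 'master.sorted'  or die $!;
    open my $update, '<', 'updates.sorted' or die $!;
    open my $out,    '>', 'master.new'     or die $!;

    my $m = <$master>;
    my $u = <$update>;

    # Standard sequential merge: always emit the smaller current record,
    # so the output comes out sorted too -- no searching, no re-sorting,
    # and no index anywhere.
    while (defined $m and defined $u) {
        if ($m <= $u) { print {$out} $m; $m = <$master>; }
        else          { print {$out} $u; $u = <$update>; }
    }
    while (defined $m) { print {$out} $m; $m = <$master>; }
    while (defined $u) { print {$out} $u; $u = <$update>; }

Because each step both requires and preserves the sort order, master.new is ready to be the next run’s master.sorted.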

Obviously, RAM is the fastest resource, and it avoids I/O entirely ... provided that virtual-memory swapping is not going on, which can be a killer.   Your strategy depends entirely on your situation, and sometimes you can get a lot of mileage simply by chopping a large file into smaller chunks so that each one does fit in the RAM that you have, without swapping.
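
Something along these lines, for instance; the file name and the chunk size are purely illustrative, and again one integer per line is assumed:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $chunk_size = 10_000_000;    # tune so a chunk never pushes you into swap

    open my $in, '<', 'huge.txt' or die $!;
    my (@chunk_files, @buf);

    my $flush = sub {
        return unless @buf;
        my $name = 'chunk' . scalar(@chunk_files) . '.tmp';
        open my $fh, '>', $name or die $!;
        print {$fh} sort { $a <=> $b } @buf;   # in-memory sort of one chunk
        close $fh;
        push @chunk_files, $name;
        @buf = ();
    };

    while (my $line = <$in>) {
        push @buf, $line;
        $flush->() if @buf >= $chunk_size;
    }
    $flush->();
    close $in;

    # The sorted chunk files can then be merged sequentially -- the same
    # merge loop as above, generalised to N inputs -- into one sorted file.

Each chunk gets sorted comfortably inside real RAM; only the final merge touches the whole data set, and it does so strictly sequentially.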

The key here is ... “without swapping.”   If you are doing high-volume processing “in memory” to avoid I/O, but push the limit so far that you start to swap, then not only is “I/O going on,” it is I/O of a particularly murderous kind.