http://www.perlmonks.org?node_id=904463


in reply to Re: Working on huge (GB sized) files
in thread Working on huge (GB sized) files

... yes, and if the number of records is so large that in-memory data structures become a memory problem (highly unlikely, these days...), you can also write the keys out to separate files, sort the two files identically using an external disk sort, and then process the two sorted streams in a single sequential pass ... a classic “merge” operation.   (What worked in the days of COBOL still works today.)
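A minimal sketch of that merge pass, assuming both inputs are already sorted and each line is a tab-separated "key\tpayload" record (file names and format are invented for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Merge-join two pre-sorted streams: advance whichever side has the
# smaller key; on equal keys, hand the paired records to a callback.
# One sequential pass, constant memory, no matter how big the files are.
sub merge_join {
    my ($fh_a, $fh_b, $on_match) = @_;
    my $a = <$fh_a>;
    my $b = <$fh_b>;
    while (defined $a && defined $b) {
        chomp(my $line_a = $a);
        chomp(my $line_b = $b);
        my ($ka) = split /\t/, $line_a, 2;
        my ($kb) = split /\t/, $line_b, 2;
        if    ($ka lt $kb) { $a = <$fh_a> }    # left key smaller: advance left
        elsif ($ka gt $kb) { $b = <$fh_b> }    # right key smaller: advance right
        else {                                 # keys match: joined record
            $on_match->($line_a, $line_b);
            $a = <$fh_a>;
            $b = <$fh_b>;
        }
    }
}

# Demo with in-memory "files" (open on scalar refs); on disk you would
# first run each file through an external sort, e.g. the system sort(1).
my $left  = "alpha\t1\nbeta\t2\ndelta\t3\n";
my $right = "beta\tX\ndelta\tY\nepsilon\tZ\n";
open my $fa, '<', \$left  or die $!;
open my $fb, '<', \$right or die $!;
my @pairs;
merge_join($fa, $fb, sub { push @pairs, "@_" });
# @pairs now holds the joined records for keys beta and delta
```

Because each stream only ever moves forward, memory use is one line per file; the sort step is where the disk work happens.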

If the “huge files” happen to be XML files, modules such as XML::Twig are designed to process even files of that size without overwhelming memory.
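A small sketch of that streaming style with XML::Twig: a handler fires for each element of interest, and purging after each one keeps memory flat regardless of file size. The &lt;record&gt;/&lt;name&gt; element names here are invented for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use XML::Twig;

my @names;
my $twig = XML::Twig->new(
    twig_handlers => {
        # Called once per <record>, as soon as it is fully parsed.
        record => sub {
            my ($t, $record) = @_;
            push @names, $record->first_child_text('name');
            $t->purge;    # free the subtree we have already handled
        },
    },
);

# With a real multi-GB file you would call $twig->parsefile($path) instead.
$twig->parse(<<'XML');
<records>
  <record><name>alpha</name></record>
  <record><name>beta</name></record>
</records>
XML
```

The purge call is the key point: without it, the twig keeps every parsed element and you are back to holding the whole document in memory.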