If your 12 million records average less than a couple of kilobytes each (i.e., if the record file is smaller than your available memory)
Is 12GB a normal amount of memory for a single process to use these days? My sense is that 4GB is standard on an entry-level desktop or a mid-level laptop. Even if you have a super-machine with 16GB, you may not want a single process to suck all of that up just to run an O(n^2) program. Loading the smaller file into a hash, or doing two on-disk sorts followed by a merge, would be a much better option, and not that hard to do.
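As a rough sketch of the hash approach: index the smaller dataset by its join key, then make a single pass over the larger one. With real files you would read the smaller file into the hash first and then stream the larger file line by line; arrays stand in for the files here, and the record layout (tab-separated, key in the first field) is an assumption for illustration.

```perl
use strict;
use warnings;

# Hash join: build an index of the smaller dataset (O(m)), then
# stream the larger dataset once (O(n)) -- O(n + m) total instead
# of the O(n*m) of comparing every record against every other.
sub hash_join {
    my ($small_recs, $large_recs) = @_;

    my %small;
    for my $line (@$small_recs) {
        my ($key, $rest) = split /\t/, $line, 2;
        $small{$key} = $rest;                 # only this must fit in memory
    }

    my @joined;
    for my $line (@$large_recs) {
        my ($key, $rest) = split /\t/, $line, 2;
        push @joined, "$key\t$small{$key}\t$rest" if exists $small{$key};
    }
    return @joined;
}

my @out = hash_join(
    [ "1\talpha",     "2\tbeta"        ],     # smaller file
    [ "2\tlarge-two", "3\tlarge-three" ],     # larger file
);
print "$_\n" for @out;    # prints: 2<TAB>beta<TAB>large-two
```

Only the smaller file has to fit in memory; if neither does, the sort-merge alternative (sort both files on the key with an external sort, then merge them in one parallel pass) needs almost no memory at all.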
Just another Perler interested in Algol Programming.