http://www.perlmonks.org?node_id=587395


in reply to 15 billion row text file and row deletes - Best Practice?

Hi, BrowserUK's reply looks quite good.

I just would like to chip in that if you are willing to chunk your data file into 100MB bites, then based on my timing of unix sort on a 100MB file of 9-digit numbers, it would cost you about 100 hours to sort all the data first. If you can ensure fixed-length columns at the same time, it will help.
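
A minimal sketch of that chunk-and-sort step might look like the following; the file names, chunk size and sort flags are my assumptions, not something from the original post:

    #!/usr/bin/perl
    # Sketch only: split a big file into ~100MB chunks and sort each
    # chunk with the external sort(1). Names and sizes are assumptions.
    use strict;
    use warnings;

    my $infile     = 'bigfile.txt';      # hypothetical input file
    my $chunk_size = 100 * 1024 * 1024;  # roughly 100MB per chunk

    open my $in, '<', $infile or die "open $infile: $!";

    my $n = 0;
    while ( not eof $in ) {
        my $chunk = sprintf 'chunk%04d.txt', $n++;
        open my $out, '>', $chunk or die "open $chunk: $!";
        my $written = 0;
        while ( $written < $chunk_size and defined( my $line = <$in> ) ) {
            print {$out} $line;
            $written += length $line;
        }
        close $out;
        # numeric external sort of the chunk into its own output file
        system( 'sort', '-n', '-o', "$chunk.sorted", $chunk ) == 0
            or die "sort $chunk failed: $?";
    }
    close $in;

With fixed-length rows you could instead read whole chunks with sysread and seek straight to record boundaries, which avoids the per-line overhead.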

You really do not want to do any more disk I/O than necessary, since on my machine it takes 7 hours just to read the file once. If you have a sorted kill list that is chunked according to the extents of each data chunk, you also only have about 1/40 of the kill list to worry about per chunk (if the distribution is even).
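
Because both the data chunk and its matching kill-list chunk are sorted, dropping the killed rows becomes a single streaming merge pass with tiny memory use. Something like this sketch, assuming one 9-digit key per line and hypothetical file names passed on the command line:

    #!/usr/bin/perl
    # Sketch: stream a sorted data chunk against its sorted kill-list
    # chunk, writing only the rows whose keys are not on the kill list.
    use strict;
    use warnings;

    my ( $data_file, $kill_file, $out_file ) = @ARGV;   # hypothetical names

    open my $data, '<', $data_file or die "open $data_file: $!";
    open my $kill, '<', $kill_file or die "open $kill_file: $!";
    open my $out,  '>', $out_file  or die "open $out_file: $!";

    my $next_kill = <$kill>;
    chomp $next_kill if defined $next_kill;

    while ( my $row = <$data> ) {
        chomp( my $key = $row );

        # advance the kill list until it catches up with the current key
        while ( defined $next_kill and $next_kill < $key ) {
            $next_kill = <$kill>;
            chomp $next_kill if defined $next_kill;
        }

        # keep the row unless its key is on the kill list
        print {$out} $row unless defined $next_kill and $next_kill == $key;
    }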