How to deal with Huge data
by Marsel (Sexton)
on Jan 23, 2007 at 12:06 UTC
Marsel has asked for the wisdom of the Perl Monks concerning the following question:
I'd like to merge several data matrices into one. The inputs are text files: the first line is a header with the names of the columns, and each following line holds an identifier and then the numerical data for that row.
The columns differ from file to file (but not always ...), and the rows are almost the same in all the files (but not always ...). The total size is about 10,000 columns and 55,000 rows.
The simplest way to explain what I want to do is the script below:
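(The original script was not preserved with this post; what follows is a hypothetical sketch of the hash-of-hashes approach described above, assuming tab-separated files whose header's first field is empty or an ID label. Identifier names and field separators are assumptions, not the author's code.)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch: merge tab-separated matrices into one structure keyed by
# row identifier, then by column name (assumed layout, not original code).
my (%data, %cols);
for my $file (@ARGV) {
    open my $fh, '<', $file or die "Cannot open $file: $!";
    chomp(my $header = <$fh>);
    my (undef, @names) = split /\t/, $header;   # skip the ID column's header field
    $cols{$_} = 1 for @names;
    while (my $line = <$fh>) {
        chomp $line;
        my ($id, @values) = split /\t/, $line;
        @{ $data{$id} }{@names} = @values;      # hash slice: one cell per column
    }
    close $fh;
}

# Print the merged matrix; empty string where a row lacks a column.
my @all_cols = sort keys %cols;
print join("\t", 'id', @all_cols), "\n";
for my $id (sort keys %data) {
    print join("\t", $id, map { $data{$id}{$_} // '' } @all_cols), "\n";
}
```

With ~55,000 rows by ~10,000 columns, this keeps roughly 550 million hash entries in memory at once, which is where the memory problem below comes from.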
But in fact, even with 8 GB of RAM and 8 GB of swap, it is not enough; I think this is due to the space needed to store everything in a hash. So do I have to leave Perl and go to C, or is there another solution?
Thanks a lot in advance for any clues,