PerlMonks
Re: Performance Question by talexb (Chancellor)
on May 08, 2002 at 13:49 UTC ( [id://165029] )
That's a tough question to answer without knowing a few more variables.
Jumping ahead to a solution: I would probably slice the monster file into pieces (there are lots of ways to do that), then process a couple of the pieces in parallel. The way I would test that would be to take a 1G slice of the file, pretend that's the big file, and try various piece counts.

Failing that, write a program in C (something I've done many times) to suck the file in, 64K chunks at a time (or whatever size chunks your system can manage), then process the lines individually. The processed lines go into a 64K output buffer, and when it gets full, you write it to the output file. Piece of cake. :) And you should get great performance doing it in C, better than Perl.

--t. alex
"Nyahhh (munch, munch) What's up, Doc?" --Bugs Bunny
In Section: Seekers of Perl Wisdom