Re: Multithreading a large file split to multiple files
by BrowserUk (Pope)
on May 14, 2018 at 22:07 UTC
Can I make it run on multiple cores so it runs faster?
Short answer: no.
The logic of your code dictates that the records in the input file be read in strict first-to-last sequence. Thus, any overhead from switching threads or sharing state is added to, not subtracted from, the time required for processing.
Even the code towards the end of the loop depends on state changes made earlier in that loop.
And with 15GB of input, there isn't even any mileage in accumulating output in memory to avoid disk thrash.
It's doubtful if even MCE can help you with this.
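To illustrate the sequential dependency described above, here is a minimal sketch (not your code; the header rule, the record layout, and the in-memory accumulation are assumptions for the demo) of a split loop whose routing decision for each line depends on state accumulated from every earlier line:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical split loop: which part a line belongs to depends on how
# many header lines have been seen so far. Line N cannot be routed until
# lines 1..N-1 have been processed, so the work cannot be handed to
# multiple threads without each one replaying that accumulated state.
my $part = 0;
my @out;    # lines routed to each part (kept in memory only for the demo)
while ( my $line = <DATA> ) {
    $part++ if $line =~ /^>/;    # assumed rule: a '>' header starts a new part
    push @{ $out[$part] }, $line;
}
printf "part %d: %d lines\n", $_, scalar @{ $out[$_] // [] } for 1 .. $#out;

__DATA__
>rec1
aaaa
>rec2
bbbb
cccc
```

Splitting the input into chunks and giving each chunk to a worker would require every worker to know the value of `$part` at its starting offset, which is exactly the information only a sequential pass can produce.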
With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority". The enemy of (IT) success is complexity.
In the absence of evidence, opinion is indistinguishable from prejudice.