http://www.perlmonks.org?node_id=994680


in reply to Parallel processing with ForkManager

You'd almost certainly be better off parsing the names from the file and bulk-loading them into a temporary table within the DB. Then issue a single join to select the information you need into another temporary table, and finally dump that to a CSV for further processing if needed.

The database is likely to make far more effective use of the threading available to it that way than it would by serialising thousands of concurrent (effectively identical) queries from 64 different clients.
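A minimal sketch of that approach using DBI, assuming an SQLite database via DBD::SQLite for demonstration (the table name `people`, its columns, and the sample names are all invented here; substitute your own schema and the names parsed from your file):

```perl
use strict;
use warnings;
use DBI;

# Demo setup only: an in-memory SQLite DB with a hypothetical 'people'
# table. In the real case you would connect to your existing database.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1, AutoCommit => 1 } );
$dbh->do('CREATE TABLE people ( name TEXT, email TEXT )');
my $seed = $dbh->prepare('INSERT INTO people VALUES (?, ?)');
$seed->execute( @$_ )
    for [ 'alice', 'alice@example.com' ],
        [ 'bob',   'bob@example.com'   ],
        [ 'carol', 'carol@example.com' ];

# 1. Bulk-load the parsed names into a temporary table, inside one
#    transaction rather than thousands of individual round trips.
$dbh->do('CREATE TEMPORARY TABLE tmp_names ( name TEXT )');
$dbh->begin_work;
my $ins = $dbh->prepare('INSERT INTO tmp_names (name) VALUES (?)');
$ins->execute($_) for qw( alice carol );   # names parsed from the file
$dbh->commit;

# 2. A single join fetches everything needed in one query, letting the
#    DB plan and parallelise the work itself.
my $rows = $dbh->selectall_arrayref(q{
    SELECT p.name, p.email
    FROM   people p
    JOIN   tmp_names t ON t.name = p.name
    ORDER  BY p.name
});

# 3. Dump the result as CSV for further processing. (Naive join here;
#    use Text::CSV if fields may contain commas or quotes.)
print join( ',', @$_ ), "\n" for @$rows;
```

The temporary table disappears when the connection closes, so there is nothing to clean up afterwards.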


With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.

RIP Neil Armstrong