note
DrHyde
<p>[cpan://Parallel::ForkManager] is certainly a good tool for managing a bunch of processes all under the control of a single "master" process which, in your case, would be the one that reads the 100MB file. However, you need to be careful.
<p>Things to consider include:
<ul><li>How many parallel clients can the database handle before it becomes a significant bottleneck?
<li>What is the overhead of forking? It's almost certainly too high to naively fork a new process for each line of the file; hand each child a batch of lines instead.
<li>What do you need to do with the data retrieved from the db? While Parallel::ForkManager <em>can</em> return a data structure from each forked process (via the <code>run_on_finish</code> callback), it does this by serialising the data to temporary files on disk in the child and reading them back in the parent. Will this turn into an I/O bottleneck?
<li>What is the overhead of connecting to the DB, and how can you reduce that? Note that each child needs its own connection, opened after the fork; a DBI handle can't safely be shared across processes.
</ul>
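<p>To make the fork-overhead point concrete, here's a minimal sketch using only core <code>fork</code>/<code>waitpid</code> (in real code, Parallel::ForkManager's <code>start</code>/<code>finish</code> would replace that bookkeeping and cap the number of live children for you). The batch size, line count, and the commented-out DBI connect are illustrative assumptions, not working DB code:

```perl
use strict;
use warnings;

# Sketch only: core fork()/waitpid() stand in for Parallel::ForkManager's
# $pm->start / $pm->finish.  The point is to amortise fork overhead by
# giving each child a batch of lines rather than forking once per line.

my @lines      = map { "line $_" } 1 .. 20;   # stand-in for the 100MB file
my $batch_size = 5;    # tune against how many clients the DB can handle
my @pids;

while (my @batch = splice @lines, 0, $batch_size) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: connect to the DB *here*, after the fork -- a DBI
        # handle must not be shared across processes.
        # my $dbh = DBI->connect($dsn, $user, $pass);  # hypothetical
        # ...process @batch, then $dbh->disconnect...
        exit 0;
    }
    push @pids, $pid;    # parent carries on forking remaining batches
}

# Reap every child so none are left as zombies.
my $reaped = 0;
for my $pid (@pids) {
    waitpid $pid, 0;
    $reaped++;
}
print "spawned ", scalar @pids, " children, reaped $reaped\n";
```

<p>Twenty lines in batches of five gives four children; with Parallel::ForkManager you'd pass the maximum child count to <code>new</code> and let it throttle for you.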
994602