ethrbunny has asked for the
wisdom of the Perl Monks concerning the following question:
For about 5 years I've been using a set of scripts to download data from 200-300 remote sites and copy it to a central DB. The main script forks 12 sub-scripts, each of which pulls down data from a specific location and uses DBI to write it locally. Once the data is copied, the sub-script exits. The main script cycles through the list, running 12 of these at a time. This worked fine with MySQL 5.1 and 5.5.
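A minimal sketch of the dispatch pattern described above, using Parallel::ForkManager to cap concurrency at 12. The site list, connection parameters, and the fetch step are hypothetical placeholders, not the actual scripts (it is not verifiable here since it needs a live MySQL server):

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use Parallel::ForkManager;
use DBI;

my @sites = map { "site$_" } 1 .. 250;      # placeholder for the 200-300 remote sites
my $pm    = Parallel::ForkManager->new(12); # run at most 12 children at once

for my $site (@sites) {
    $pm->start and next;    # parent: child forked, move on to the next site

    # Child: open a fresh DBI handle here -- sharing a handle
    # opened before the fork across processes is unsafe.
    my $dbh = DBI->connect(
        'dbi:mysql:database=central;host=localhost',  # hypothetical DSN
        'user', 'password',
        { RaiseError => 1, AutoCommit => 1 },
    ) or die "connect failed: $DBI::errstr";

    # ... fetch data from $site and INSERT it into the central DB ...

    $dbh->disconnect;
    $pm->finish;            # child exits once its site is copied
}
$pm->wait_all_children;
```

With this structure, no more than 12 connections to the central server should ever be open from the children at once (plus any held by the main script).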
Fast-forward to about a month ago, when I upgraded to v5.6. When running this same process, it gets through 20-30 of the sub-scripts and then every subsequent connection is rejected until the remaining sites have all been attempted. If I restart it the results are the same (20 or so succeed, then errors for the rest).
Looking at the processlist on MySQL (SHOW PROCESSLIST) shows no more than 25 connections at any given moment. There are no errors in the MySQL logs. If I set the fork limit to 5 jobs, it gets through all of them without failing.
Any suggestions on where to look for clues as to what is happening?