in reply to Fork Results in thousands of processes
Modules such as Parallel::ForkManager might be useful.
CPAN modules like these are “simple sugar,” to be sure, in that they do not do anything that you could not do for yourself, but they do make life very easy: they let you focus more on what you want to do, and less on exactly how to do it. The example in Parallel::ForkManager's documentation is virtually identical to what you are trying to do here.
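To see what that “sugar” is wrapping, here is a minimal core-Perl sketch of the same throttling, done by hand with fork and waitpid. The job list and the child's work are hypothetical placeholders; Parallel::ForkManager packages exactly this pattern behind its start/finish/wait_all_children calls.

```perl
use strict;
use warnings;

my $MAX_KIDS = 10;          # throttle: at most 10 children alive at once
my @jobs     = (1 .. 50);   # hypothetical work items; substitute your own
my $running  = 0;

for my $job (@jobs) {
    # At the limit?  Block until one child exits before forking another.
    if ($running >= $MAX_KIDS) {
        waitpid(-1, 0);
        $running--;
    }
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # --- child: do the work for this one job, then exit ---
        # (replace with real work, e.g. exec'ing a command on $job)
        exit 0;
    }
    $running++;             # parent: one more child in flight
}

# Reap the stragglers.
while ($running > 0) {
    waitpid(-1, 0);
    $running--;
}
print "all jobs done\n";
```

The point is that the throttle ($MAX_KIDS) is entirely separate from the size of @jobs, which is what makes the design tunable.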
You might also wish to search http://search.cpan.org for modules like Thread::Queue. (At the moment, that search turns up 33 such modules.) It may well be that you want to start a limited number of processes (say, ten) and have each of them consume an arbitrary number of work requests from a queue. In other words, by launching ten workers, you ensure that ten requests at a time are handled simultaneously; and by stuffing, say, 1,000 filenames or commands into the queue, you provide the total list of work to be carried out, cooperatively, by the workers in that pool. Stuff the queue full of filenames, followed by enough “please die now” instructions for every worker to eventually receive one. Then, simply wait for them all to die off on command.
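That queue-plus-poison-pill design can be sketched with threads and Thread::Queue as follows. It assumes your perl was built with ithreads support; the filenames and the per-file work are hypothetical stand-ins, and undef serves as the “please die now” instruction.

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;

my $N_WORKERS = 10;                 # the throttle: ten simultaneous consumers
my $queue     = Thread::Queue->new;

# Each worker pulls items until it dequeues the undef "please die now" pill.
my @workers = map {
    threads->create(sub {
        while (defined(my $file = $queue->dequeue)) {
            # process $file here (hypothetical work)
        }
    });
} 1 .. $N_WORKERS;

# Stuff the queue: the whole backlog, then one pill per worker.
$queue->enqueue("file$_.dat") for 1 .. 1000;
$queue->enqueue(undef)        for 1 .. $N_WORKERS;

# Wait for the workers to die off on command.
$_->join for @workers;
print "all workers exited\n";
```

Note that the backlog size (1,000 here) and the worker count (10) are independent knobs, which is exactly the throttling property described below.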
The nice thing about a design like this is that it has a very convenient throttle: you can set the number of worker threads up or down, and that number has nothing to do with the amount of work to be accomplished. Even if the backlog grows very large indeed, the completion rate remains steady and predictable, and can be effectively “tuned” to suit the hardware.