First, definitely look at what error messages are being reported ... and, more generally, reconsider your design. It appears that you are attempting to launch one thread per file. I guarantee you will not succeed in launching 25,000 children, and your code does not anticipate that the fork can fail.
Without further ado, I refer you to BrowserUK's excellent first reply in the following recent thread: Proper undefine queue with multithreads. Your situation is essentially the same. His design is different from yours, but it is easy to implement, and it works.
Incidentally ... in a Un*x/Linux environment ... this whole thing just might already be done for you! If the xargs command on your system (man xargs) supports the -P maxprocs parameter, then you can bypass all of this entirely. Simply write a Perl command that expects to receive one filename as a parameter: it converts that one file, processes it, deletes it, then exits. Then pipe the output of a find command into xargs with that parameter. Now find is posting the filenames to xargs, which farms out the work to at most maxprocs identical, single-purpose (Perl ...) process instances. The same business requirement has been solved, in a "very Un*x-ish way," and your Perl script has been greatly simplified. (Perhaps you decide this alternate approach is right for you, perhaps not; either choice is up to you. But you will see that it is functionally equivalent in its general approach to the problem.)
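A minimal sketch of that pipeline, under stated assumptions: the directory name and file pattern are placeholders, and the real per-file converter would be your Perl script (here a stand-in command, `wc -c`, plays that role so the pipeline can be run as-is):

```shell
#!/bin/sh
# Build a small demo directory of files to "convert" (assumption: in the
# real setup these are your 25,000 input files).
dir=$(mktemp -d)
for i in 1 2 3 4 5; do
    printf 'payload %d' "$i" > "$dir/f$i.dat"
done

# find emits NUL-separated filenames (-print0), which xargs -0 reads safely
# even if names contain spaces. -n 1 passes one filename per invocation;
# -P 4 caps concurrency at four worker processes at a time.
# In the real design, replace 'wc -c' with: perl your_converter.pl
find "$dir" -type f -name '*.dat' -print0 \
    | xargs -0 -n 1 -P 4 wc -c

rm -r "$dir"
```

Note the key property: no matter how many files find produces, only maxprocs (here 4) converter processes exist at any moment, so you never hit the process limit that one-child-per-file designs run into.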