in reply to
randomising file order returned by File::Find
How about having a single script dig through the directory tree and link the items it finds into 100 output directories? Then each of your 100 processing scripts can trawl through its own input directory for items to process, which avoids most of the locking problems. The script that assigns items to queues could even watch the output directories and create each image link in whichever directory is currently smallest, balancing the load in case some workers process more quickly than others.
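The smallest-directory dispatch could look something like this. A sketch in Python rather than Perl, purely to illustrate the idea; `assign_to_queues` and the queue-directory layout are my own invented names, not anything from your setup:

```python
import os

def assign_to_queues(files, queue_dirs):
    # Hard-link each file into whichever queue directory currently
    # holds the fewest entries, so workers that drain their queue
    # faster end up receiving more of the new work.
    for src in files:
        counts = {d: len(os.listdir(d)) for d in queue_dirs}
        target = min(counts, key=counts.get)
        os.link(src, os.path.join(target, os.path.basename(src)))
        # use os.symlink instead if the queues live on another filesystem
```

Each worker then just loops over its own directory and unlinks items as it finishes them; no shared lock is ever needed.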
When your only tool is a hammer, all problems look like your thumb.