It took me a few days off and some rethinking to guess at your real bottleneck. Did you forget to tell us about the low-bandwidth remote Windows network shares you are scanning? 300 local disks is too much to believe. If so, collect the data locally into files on each machine and transfer those files to the server. Scanning in parallel should take about a minute for file systems just above desktop size.
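A minimal sketch of that "scan each file system in parallel, write local files" idea, using plain fork (on Windows, Perl emulates fork with threads). The share names and the temp-dir layout are made up for the demo; a real script would point @shares at the actual mount points:

```perl
#!/usr/bin/perl
# Sketch: one child process per share, each writing its own list file.
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempdir);

my $root = tempdir(CLEANUP => 1);          # stand-in for real mount points
for my $n (1 .. 3) {
    mkdir "$root/share$n" or die $!;
    open my $fh, '>', "$root/share$n/file$n.txt" or die $!;
    close $fh;
}
my @shares = map { "$root/share$_" } 1 .. 3;

my @pids;
for my $share (@shares) {
    defined(my $pid = fork()) or die "fork failed: $!";
    if ($pid == 0) {                        # child: scan exactly one share
        my ($name) = $share =~ m{([^/]+)\z};
        open my $out, '>', "$root/$name.lst" or die $!;
        find(sub { print {$out} "$File::Find::name\n" if -f }, $share);
        close $out;
        exit 0;
    }
    push @pids, $pid;                       # parent: remember the child
}
waitpid($_, 0) for @pids;                   # collect all scanners

print "wrote ", scalar(@shares), " list files\n";
```

The per-share .lst files are then cheap to transfer and to concatenate on the server.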
in reply to Re^2: Multithreaded Script CPU Usage
But after solving all the scanning performance issues, you are still bound by the insert rate of your database. 80 million records is a lot: at my estimate of 1000 inserts per second for mysql on a desktop, that is about 22 hours in your case. Do your own benchmark for more and faster CPUs and disks.
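The 22-hour figure is just this back-of-envelope arithmetic (the 1000 inserts/second rate is my assumption, not a measurement):

```perl
#!/usr/bin/perl
# 80 million rows at an assumed 1000 single-row INSERTs per second.
use strict;
use warnings;

my $rows    = 80_000_000;
my $rate    = 1_000;                    # inserts/second, rough assumption
my $seconds = $rows / $rate;
printf "%.1f hours\n", $seconds / 3600; # prints "22.2 hours"
```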
So back to the drawing board: you need to eliminate the overhead of the database. We don't know anything about the use cases for your database output. If it is just finding a file once per week, I'd cat the local output files together and grep the resulting 800 MB text file at disk speed, in under 10 seconds. In more advanced cases, Perl regexps should have lower query overhead than SQL and no insert-rate bottleneck.
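A sketch of such a query against the concatenated scan output with a Perl regexp instead of SQL. The whitespace-separated "path size" record layout is an assumption, and the __DATA__ section stands in for the 800 MB text file:

```perl
#!/usr/bin/perl
# Scan a flat file of records line by line, matching paths with a regexp.
use strict;
use warnings;

my $pattern = qr/\.log\z/;               # example query: find all *.log files
while (my $line = <DATA>) {
    chomp $line;
    my ($path, $size) = split /\s+/, $line, 2;
    print "$path ($size bytes)\n" if $path =~ $pattern;
}

__DATA__
/srv/app/error.log 1024
/srv/app/readme.txt 200
/srv/app/access.log 4096
```

In a real run you'd read the concatenated list files instead of __DATA__, and the loop streams the file, so memory stays flat no matter how big the dump grows.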