I regularly run scripts to rebuild MySQL tables from scratch and reload them with data records. A quick line count tells me there are about 1.2 million text records in various combinations of insert/delete/update. The text records are parsed and submitted as MySQL requests by Perl, one by one through a DBI connection, without the benefit of prepared statements. The whole thing takes roughly 10 minutes to process, running on my desktop box, nothing fancy. So unless you have a seriously underpowered computer (I doubt it), a network storage limitation, or database locking issues, I doubt that MySQL is your bottleneck.
You need to find out where the code is spending its time. A few well-placed print statements should point you in the right direction fairly quickly.
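For example, `Time::HiRes` from the Perl core gives you sub-second timing around each stage of the loader. A rough sketch (the stage names and commented-out bodies are placeholders for your own code):

```perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

# Time each stage of the loop body separately so you can see
# whether parsing or the database round-trips dominate.
my $t0 = [gettimeofday];
# ... parse the text records here ...
printf "parse stage: %.3fs\n", tv_interval($t0);

$t0 = [gettimeofday];
# ... run the DBI insert/delete/update requests here ...
printf "db stage:    %.3fs\n", tv_interval($t0);
```

If the database stage dominates, switching to prepared statements or batching rows per transaction is the usual next step; if parsing dominates, MySQL was never the problem.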