I would cordially suggest that you are going about this the wrong way ... "parallel tasks/threads" will not make things go faster, but substantially slower. You should be using the bulk-loading tool that Oracle provides for exactly this purpose, since it is optimized for the job.
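For Oracle, that tool is SQL*Loader (`sqlldr`), driven by a control file. A minimal control file might look like the sketch below; the table name, file name, and columns are placeholders, not anything from your schema:

```
-- load.ctl (illustrative only; names are placeholders)
LOAD DATA
INFILE 'input.csv'
APPEND
INTO TABLE staging
FIELDS TERMINATED BY ','
(id, payload)
```

You would then run something like `sqlldr userid=... control=load.ctl log=load.log`; SQL*Loader also offers a direct-path mode (`DIRECT=TRUE`) that bypasses much of the conventional SQL engine for exactly this kind of work.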
If you find that for whatever reason you do need to load large amounts of data into an SQL database (of any sort) by conventional means, there are several things that I suggest you consider:
- Parallelism is probably not your friend: every extra session just ends up contending for the same locks, indexes, and redo log at the other end of the wire.
- Do the work within a transaction, and carefully choose that transaction's "isolation level." (This sort of work needs to push other concurrent activity aside as much as possible, through an aggressive choice of isolation level, which is why it is often done in the wee hours...)
- Do not commit after every row: post a few thousand records, say, then commit, then open another transaction. (This allows the database to do "lazy writing.")
- Prepare the statement handle once, with placeholders, and use it repeatedly.
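A minimal sketch of the batching-and-placeholders idea, using `sqlite3` as a stand-in for an Oracle driver such as python-oracledb (the table, columns, and batch size here are all made up for illustration):

```python
# Batched, parameterized inserts: one prepared statement, reused
# for every row; commit every few thousand rows, not every row.
import sqlite3

BATCH_SIZE = 5000  # commit every few thousand rows (tunable)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, payload TEXT)")

# One statement with placeholders, reused for every batch.
insert_sql = "INSERT INTO staging (id, payload) VALUES (?, ?)"

rows = ((i, f"record-{i}") for i in range(12_345))  # stand-in input data
batch = []
for row in rows:
    batch.append(row)
    if len(batch) >= BATCH_SIZE:
        conn.executemany(insert_sql, batch)
        conn.commit()          # end one transaction ...
        batch.clear()          # ... and start the next
if batch:                      # flush the final partial batch
    conn.executemany(insert_sql, batch)
    conn.commit()

count = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(count)  # -> 12345
```

With python-oracledb the shape is the same, except the placeholders are `:1, :2` rather than `?`.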
- Consider how the process can be made restartable. After each successful commit, record where you are in the input file so that you could potentially restart at that checkpoint if the program should unexpec... ;-)
But, once again, there are “bulk loader” programs that are specifically designed to do this.