
Re: Bulk data insertion into MySql

by Marshall (Abbot)
on Sep 03, 2011 at 00:02 UTC ( #923953=note )

in reply to Bulk data insertion into MySql

Your code looks like straight DBI. The easiest way to make this a LOT faster is to make the whole operation one single transaction.

    $dbh->do("SET autocommit = 0");
    ....
    $dbh->do("COMMIT");
I took the above "SET autocommit = 0" from some MySQL code that worked in my app. If you commit every quarter to half million records or so, the performance will be fine. There can be other ways to start the transaction, perhaps BEGIN or other keywords.
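A minimal DBI sketch of the same idea, assuming a hypothetical MySQL table `records(k, v)` and a hypothetical `@rows` array of key/value pairs (adjust the DSN and credentials for your own setup):

    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection; AutoCommit => 0 has the same effect as
    # the "SET autocommit = 0" above.
    my $dbh = DBI->connect('dbi:mysql:database=test', 'user', 'pass',
                           { RaiseError => 1, AutoCommit => 0 });

    my $sth = $dbh->prepare('INSERT INTO records (k, v) VALUES (?, ?)');

    my @rows;    # assumed to hold [key, value] pairs from your parsing step
    my $count = 0;
    for my $row (@rows) {
        $sth->execute(@$row);
        # commit every ~1/4 million rows so the transaction log stays bounded
        $dbh->commit if ++$count % 250_000 == 0;
    }
    $dbh->commit;    # commit the remainder
    $dbh->disconnect;

With AutoCommit off, DBI's commit() calls delimit the batches, so each INSERT no longer pays the cost of its own transaction.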

I do find this map pretty ugly:

    map {
        my ($keys, $value) = split /\s*:\s*/, $_;
        $keys =~ s/\s//g;
        $info{$keys} = $value;
    } split /\n/, $record;

    # Don't use map when you mean foreach (or for)
    foreach (split /\n/, $record) {
        ....
    }
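Spelled out as a runnable foreach, with a made-up sample record in the "key : value" shape the thread implies:

    use strict;
    use warnings;

    # Hypothetical sample data; the real $record comes from the poster's files.
    my $record = "Name : Alice\nAge : 30\nCity : Boston";

    my %info;
    foreach my $line (split /\n/, $record) {
        my ($key, $value) = split /\s*:\s*/, $line, 2;
        $key =~ s/\s//g;             # strip whitespace from the key
        $info{$key} = $value;
    }

    print "$_=$info{$_}\n" for sort keys %info;

Same effect as the map, but the intent (iterate for side effects, not build a list) is clear at a glance.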

Replies are listed 'Best First'.
Re^2: Bulk data insertion into MySql
by sowraaj (Novice) on Sep 03, 2011 at 12:31 UTC

    Dear friend,

    I tried the AutoCommit/COMMIT method in my code, but I noticed there is not a large difference compared to my previous code. If you know any method that would help speed up the insertion via bulk loading, please share it. (Currently I insert 1,000,000 rows in 2 minutes.) I would like to reduce that time to seconds.


      If you can already insert one million rows in two minutes, that seems like a very good speed to me.

      Your task looks like a one-time script to import ("bulk load") a number of files into the database. Why do you need it faster? How often do you need to run it?
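If it really does have to be faster than row-by-row inserts, MySQL's LOAD DATA INFILE is usually the quickest route for a one-time import: the server parses the file itself and skips per-statement overhead. A sketch, assuming the parsed records have already been flattened to a hypothetical tab-separated file /tmp/records.tsv and a hypothetical table records(k, v):

    use strict;
    use warnings;
    use DBI;

    # Hypothetical DSN/credentials; mysql_local_infile=1 lets the client
    # side send a LOCAL file to the server.
    my $dbh = DBI->connect(
        'dbi:mysql:database=test;mysql_local_infile=1', 'user', 'pass',
        { RaiseError => 1 });

    $dbh->do(q{
        LOAD DATA LOCAL INFILE '/tmp/records.tsv'
        INTO TABLE records
        FIELDS TERMINATED BY '\t'
        LINES  TERMINATED BY '\n'
        (k, v)
    });
    $dbh->disconnect;

The trade-off is a two-pass job (write the file, then load it), but for a million-row import that is normally still far faster than executing a million INSERTs.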
