Inserting large chunks of data into MySQL

by askgetanswer (Initiate)
on May 04, 2008 at 14:54 UTC [id://684434]

askgetanswer has asked for the wisdom of the Perl Monks concerning the following question:

When I try to split a string into an array of about 10,000 pieces and insert them into MySQL, I get a server error, but fewer than 1,000 works fine. Here is the code. Please help. Thanks.
my @txtquearr = split('~', $txtque);
###############################################################################
my $count = 0;
foreach $txtque (@txtquearr) {
    $count = $dbh->do("INSERT INTO ques (username,userque,userquedetail,userquecatgory,userresponsecount,quegood,quebad,queid,userquedate,quevideo,queaudio,quepic,email) values('$usernamenow','$txtque','$txtdetail','$cat','$userresponsecount','$quegood','$quebad','$aaqueid','$userquedate','$txtvideolink','$txtaudiolink','$txtpicturelink','$txtemail')",
        undef, $usernamenow, $txtque, $txtdetail, $cat, $userresponsecount,
        $quegood, $quebad, $aaqueid, $userquedate, $txtvideolink,
        $txtaudiolink, $txtpicturelink, $txtemail);
    print "Ques Inserted inoto Data\n";
    print $txtque;
}

May 05, 2008 at 12:54 UTC: Janitored by McDarren, original content restored.

Replies are listed 'Best First'.
Re: Inserting large chunks of data into MySQL
by moritz (Cardinal) on May 04, 2008 at 15:22 UTC
    You'll get an error if one of these variables contains a single quote '.

    To avoid that (and to boost performance) use placeholders instead:

    my $sth = $dbh->prepare("INSERT INTO ques(username, userque, ...) VALUES (?, ?, ...)");
    foreach my $t (split '~', $txtque) {
        $sth->execute($usernamenow, $t, ...);
    }
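    Filled out with the full column list from the original post, that might look like this (a sketch, assuming the surrounding variables are populated as in the OP's script):

        # A sketch expanding the placeholder approach with the thirteen
        # columns from the original post; $dbh, $txtque and the bound
        # variables are assumed to exist as in the OP's script.
        my $sth = $dbh->prepare(
            'INSERT INTO ques (username, userque, userquedetail, userquecatgory,
                               userresponsecount, quegood, quebad, queid,
                               userquedate, quevideo, queaudio, quepic, email)
             VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?)'
        );
        foreach my $t (split '~', $txtque) {
            $sth->execute($usernamenow, $t, $txtdetail, $cat, $userresponsecount,
                          $quegood, $quebad, $aaqueid, $userquedate, $txtvideolink,
                          $txtaudiolink, $txtpicturelink, $txtemail);
        }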
Re: Inserting large chunks of data into MySQL
by talexb (Chancellor) on May 04, 2008 at 15:09 UTC
      When I try to split a string into an array of about 10,000 pieces and insert them into MySQL, I get a server error, but ..

    OK, I give up -- what's the server error that you're getting? Knowing that will help us understand what the problem might be. And once we have an idea what the problem is, we can start to propose a solution.

    Update: It's really quite rude to update a root node without making it clear that you're doing so. If you're going to do that, mark it clearly as an UPDATE (as I've done). Don't just add stuff -- it makes the thread confusing for anyone who comes along later to read it.

    Alex / talexb / Toronto

    "Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds

Re: Inserting large chunks of data into MySQL
by pc88mxer (Vicar) on May 04, 2008 at 15:18 UTC
    First of all, are you sure this code works? perltidy reports that it parses this way:
    ...
    $count = $dbh->do("INSERT INTO ques (...) VALUES (...)",
        undef, $usernamenow, $txtque, ...
    );
    ...
    I think you want single quotes for the SQL statement and placeholders (?) like this:
    $count = $dbh->do('INSERT INTO ques (...) VALUES (?,?,?,?,...)',
        undef, $usernamenow, $txtque, ...
    );
    Your use of double quotes and string interpolation (i.e. using $usernamenow inside the double-quoted string) will definitely cause you problems.
Re: Inserting large chunks of data into MySQL
by Fletch (Bishop) on May 04, 2008 at 15:19 UTC

    Vagueness aside, if this is a recurring need also consider reformatting your data into a format which the mysql bulk loader can handle (for instance CSV). That's more than likely going to run faster than any ad hoc DBI you whip up.
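    A minimal sketch of that approach (not from the thread; the file name, connection details, and two-column subset are all assumptions, and mysql_local_infile must be enabled on both client and server):

        use strict;
        use warnings;
        use DBI;
        use Text::CSV;   # assumes Text::CSV is installed

        # Stand-in values for the demo; in the OP's script these come
        # from elsewhere.
        my $usernamenow = 'someuser';
        my $txtque      = "first question~second question";

        # Hypothetical connection details; mysql_local_infile=1 lets
        # DBD::mysql issue LOAD DATA LOCAL INFILE.
        my $dbh = DBI->connect('dbi:mysql:database=mydb;mysql_local_infile=1',
                               'user', 'password', { RaiseError => 1 });

        # Write the split pieces out as CSV (only two columns shown here).
        my $csv = Text::CSV->new({ binary => 1, eol => "\n" });
        open my $fh, '>', 'ques.csv' or die $!;
        $csv->print($fh, [ $usernamenow, $_ ]) for split '~', $txtque;
        close $fh or die $!;

        # Let MySQL bulk-load the whole file in one statement.
        $dbh->do(q{LOAD DATA LOCAL INFILE 'ques.csv' INTO TABLE ques
                   FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
                   (username, userque)});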

    The cake is a lie.
    The cake is a lie.
    The cake is a lie.

Re: Inserting large chunks of data into MySQL
by g0n (Priest) on May 04, 2008 at 21:28 UTC
    1) Please don't update your post by replacing the original content. It means that the answers people have already given no longer make sense. If you want to post an update, put 'update' at the bottom of your post and add more details below it, or something like that.

    2) Please give us an error message. The error you've posted tells us that you are having trouble with a CGI script, and that's about all. Run your script from the command line, not via a web browser, and report the error generated, and maybe someone can help.

    3) "here the problem wasn't from mysql command it was from perl it doesn't seems like to split string that is large over 20,000 arrays or about 5MB"

    Nope, sorry, you're making a false assumption about the problem. The Perl interpreter can handle split on much bigger strings than that. In fact, I ran the following very simple code to demonstrate that on my laptop PC:

    use strict;
    use warnings;

    my $string = "";
    my %charmap = ( 0 => "a", 1 => "~" );
    for (1..50000000) {
        $string .= $charmap{int(rand(2))};
    }
    open (my $testfile, ">", "testfile.txt") or die $!;
    print $testfile $string;
    my @arr = split "~", $string;
    for (@arr) {
        print "$_\n";
    }

    It took a while and slowed my laptop down, but it ran without a problem, and that was a string of approximately 50MB - ten times what you're working with.

    OK, so by commenting the SQL line out you're on the right lines - try to narrow the problem down to the simplest case that still generates an error. But first, find out what the error is - run it locally and see what it does.
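    For example (assuming the script is called ques.pl, a made-up name), something like:

        perl -wc ques.pl   # compile/syntax check only
        perl -w ques.pl    # run it and read the actual error output

    will show the real error instead of the web server's generic failure page.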

    Update: You could also get useful error messages by putting

        use CGI::Carp qw(fatalsToBrowser);

    at the beginning of your script.

    --------------------------------------------------------------

    "If there is such a phenomenon as absolute evil, it consists in treating another human being as a thing."
    John Brunner, "The Shockwave Rider".

Re: Inserting large chunks of data into MySQL
by chrism01 (Friar) on May 06, 2008 at 06:09 UTC
    In addition to the others' comments, if you've got AutoCommit => 0 (i.e. off), then the MySQL buffers may fill up if you try to put too much in.
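    If you do run with AutoCommit off, committing in batches keeps the amount of uncommitted data bounded. A minimal sketch (not from the thread; the batch size of 500 is an arbitrary assumption):

        # Assumes $dbh, @txtquearr and $usernamenow exist as in the OP's
        # script; commit every 500 rows so uncommitted inserts don't pile up.
        $dbh->{AutoCommit} = 0;
        my $sth = $dbh->prepare('INSERT INTO ques (username, userque) VALUES (?, ?)');
        my $n = 0;
        for my $t (@txtquearr) {
            $sth->execute($usernamenow, $t);
            $dbh->commit if ++$n % 500 == 0;
        }
        $dbh->commit;   # flush the final partial batch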