Re: Speeding up the DBI

by diotalevi (Canon)
on Jul 14, 2003 at 13:16 UTC ( [id://273966] )


in reply to Speeding up the DBI

I don't have time to comment just now, but the comparison between the different fetch methods and data-set sizes isn't all that appropriate for PostgreSQL. There you'd use ORDER BY/OFFSET/LIMIT instead of $max_rows. You'd also contrive to use a cursor, which means you'd probably eschew DBI and use Pg or Pg::Simple. That is, if you found yourself working with really large result sets.
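
A minimal sketch of the ORDER BY/LIMIT/OFFSET approach through plain DBI, paging the result set in fixed-size chunks; the table and column names here are purely illustrative:

    # Page through the result set rather than pulling every row at once.
    my $page_size = 1_000;
    my $offset    = 0;
    while (1) {
        my $rows = $dbh->selectall_arrayref(
            "SELECT * FROM items ORDER BY id LIMIT $page_size OFFSET $offset"
        );
        last unless @$rows;           # no more data
        # ... process @$rows here ...
        $offset += $page_size;
    }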

The DBD::Pg driver always has PostgreSQL complete the entire SQL command and then copies the entire result over to Perl. Trying to iterate with ->fetch doesn't help with that: you'd need a cursor (which isn't supported under DBI) or to reduce the data size.
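
To make the point concrete, here is the standard DBI fetch loop against a hypothetical table: with DBD::Pg, the execute() call has already transferred the whole result set, so the loop below only walks the copy that is already sitting in Perl's memory:

    my $sth = $dbh->prepare('SELECT * FROM big_table');  # hypothetical table
    $sth->execute;                   # DBD::Pg buffers every row client-side here
    while (my $row = $sth->fetch) {  # iterates the local buffer, row by row
        # ... work with @$row ...
    }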

Anyone else care to follow up with some concrete advice? I've got to head off to work just now.

Re: Re: Speeding up the DBI
by tantarbobus (Hermit) on Jul 14, 2003 at 19:32 UTC

    Yes, it is appropriate: DBD::mysql behaves in the same way that DBD::Pg does when it comes to pulling all of the rows from the database server.

    If you want DBD::mysql to act as if it were using cursors, you can specify mysql_use_rows on the statement handle, and then it will not pull over the whole data set on execute.

    For DBD::Pg, you can declare a cursor and then execute the fetch in a loop until you have received all of the data. Admittedly, this could be done in a cleaner fashion, where prepare could take an attribute and, if it sees that attribute, declare a cursor and rewrite your statement to use it (which is what Pg::Simple does, IIRC).

      I had been under the impression that cursors don't work through DBD::Pg. Am I mistaken?

        use Data::Dumper;

        $dbh->begin_work;    # a cursor (without HOLD) must live inside a transaction
        $dbh->do(q{DECLARE imprecator CURSOR FOR SELECT * FROM tablename});

        my $number = 10;     # rows per FETCH
        # qq{} so that $number is interpolated into the FETCH statement
        my $fetch = $dbh->prepare(qq{FETCH $number FROM imprecator});

        ## Fetch $number rows at a time until the cursor is exhausted.
        while ($fetch->execute()) {
            print Data::Dumper::Dumper($fetch->fetchall_arrayref());
            last if $fetch->rows() < $number;
        }

        $dbh->do(q{CLOSE imprecator});
        $dbh->commit;
        Note: The code above has not been tested; it is for illustrative purposes only.
      If you want DBD::mysql to act as if it were using cursors, you can specify mysql_use_rows on the statement handle, and then it will not pull over the whole data set on execute.
      I think you meant "mysql_use_result". Some grief and head-banging went into that typo, so hopefully this will help some wanderer around here who is also trying to speed up $sth->execute.
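
      For reference, a minimal sketch of setting mysql_use_result on a statement handle (table and column names are hypothetical); it streams rows from the server as you fetch them instead of buffering the whole result set on execute:

        my $sth = $dbh->prepare('SELECT id, payload FROM big_table');
        $sth->{mysql_use_result} = 1;    # stream rows rather than buffering them all
        $sth->execute;
        while (my @row = $sth->fetchrow_array) {
            # ... process @row as it arrives from the server ...
        }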
