
Re: Postgres batch read with DBI?

by Corion (Pope)
on Jan 27, 2021 at 07:39 UTC ( #11127509 )

in reply to Postgres batch read with DBI?

There is also Data::Stream::Bulk, which fetches the result set in chunks. Its advantage is that you don't need to rewrite your SQL, but you still need to write a loop that fetches and processes each chunk. Also, since you mention transactions, that module will not help you with transactions growing too large: it keeps the transaction open and only fetches a slice of the results at a time for processing.
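A minimal sketch of how that could look with the module's DBI backend, Data::Stream::Bulk::DBI. The attribute names (`sth`, `max_rows`) are from my reading of that module's docs, and the connection details, table, and columns are made-up examples:

```perl
use strict;
use warnings;
use DBI;
use Data::Stream::Bulk::DBI;

# Hypothetical connection details; adjust for your database
my $dbh = DBI->connect( 'dbi:Pg:dbname=mydb', 'me', 'secret',
    { RaiseError => 1 } );

# Prepare and execute the unmodified query as usual
my $sth = $dbh->prepare( 'select foo, bar, baz from my_table' );
$sth->execute;

# The stream pulls rows from the executed handle in blocks of max_rows
my $stream = Data::Stream::Bulk::DBI->new(
    sth      => $sth,
    max_rows => 1000,
);

while ( my $block = $stream->next ) {
    for my $row ( @$block ) {       # each $row is an arrayref of columns
        my ( $foo, $bar, $baz ) = @$row;
        # ... process one row ...
    }
}
```

Note that the statement handle (and any surrounding transaction) stays open for the whole loop, which is exactly the caveat above.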

Personally, when doing long-running SQL jobs, I prefer to keep no long-lived state on the database side and to limit the amount of data each query returns, like this:

select top 1000 foo, bar, baz from my_table where baz > ? order by foo, bar

For the sort criterion, I like to order by some timestamp, so that I update either the oldest or the newest rows first.

Depending on your flavour of SQL, SELECT TOP 1000 may not be available; PostgreSQL uses LIMIT 1000 instead, placed at the end of the query, after the ORDER BY clause.
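The stateless approach above can be sketched as a loop that re-runs the query, binding the last key seen as the new lower bound. This is a sketch using PostgreSQL's LIMIT syntax with made-up connection and table details; note that for the `where baz > ?` resume trick to be reliable, the bound column should lead the ORDER BY (and ideally be unique), so this sketch sorts by `baz` first:

```perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details; adjust for your database
my $dbh = DBI->connect( 'dbi:Pg:dbname=mydb', 'me', 'secret',
    { RaiseError => 1 } );

my $batch_size = 1000;
my $sth = $dbh->prepare( <<"SQL" );
select foo, bar, baz
from my_table
where baz > ?
order by baz, foo, bar
limit $batch_size
SQL

my $last_baz = 0;   # the only state kept between batches is this resume key
while ( 1 ) {
    $sth->execute( $last_baz );
    my $rows = $sth->fetchall_arrayref;
    last unless @$rows;
    for my $row ( @$rows ) {
        my ( $foo, $bar, $baz ) = @$row;
        # ... process one row ...
        $last_baz = $baz;           # remember where to resume
    }
    last if @$rows < $batch_size;   # a short batch means we reached the end
}
```

Because each batch is an independent query, nothing is lost if the job is killed halfway: restarting simply resumes from the last key processed, and no transaction stays open across batches.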
