My system has 4 GB RAM and a 250 GB HDD, running Debian Etch, PostgreSQL 8.1, Perl 5.8.8, and Perl DBI 1.52.
My query may return more than 3,000,000 records; it is built over three tables using multiple joins and conditions.
When I execute this query from my Perl program (which uses DBI), it takes a long time to return the records, and when it does, the Perl process's memory is completely consumed.
So I decided to fetch the records batch by batch, with a batch size of 1000.
I searched the net extensively and found the following approaches:
- using bind_columns (but this requires many more method calls);
- using OFFSET and LIMIT in the query itself (but internally the server may still read all the rows up to the offset before returning a batch to us);
- using PostgreSQL cursors (I don't know much about their efficiency).
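To make the cursor idea concrete, here is a sketch of what I have in mind. The cursor name `batch_cur`, the connection parameters, and the simplified query are placeholders for my real three-table join; cursors need a transaction, hence `AutoCommit => 0`. With `FETCH`, only 1000 rows at a time should ever reach the client.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details -- substitute real ones.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 0 });

# Declare a server-side cursor over the big join; the result set
# stays on the server until rows are explicitly fetched.
$dbh->do(q{
    DECLARE batch_cur CURSOR FOR
    SELECT t1.a, t2.b, t3.c
    FROM t1
    JOIN t2 ON t2.t1_id = t1.id
    JOIN t3 ON t3.t2_id = t2.id
});

while (1) {
    # Pull the next batch of 1000 rows from the cursor.
    my $sth = $dbh->prepare('FETCH 1000 FROM batch_cur');
    $sth->execute;
    last if $sth->rows == 0;    # cursor exhausted

    while (my @row = $sth->fetchrow_array) {
        # process one record here
    }
}

$dbh->do('CLOSE batch_cur');
$dbh->commit;
$dbh->disconnect;
```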
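For comparison, the OFFSET/LIMIT variant I am considering would look roughly like the sketch below (the `ORDER BY id` column is a placeholder; a stable ordering is needed so batches don't overlap). My worry from above still applies: as the offset grows, the server has to skip over more and more rows on each query.

```perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details -- substitute real ones.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                       { RaiseError => 1 });

my $batch  = 1000;
my $offset = 0;

# Placeholder query; a deterministic ORDER BY is required so that
# consecutive OFFSET windows don't skip or repeat rows.
my $sth = $dbh->prepare(q{
    SELECT t1.a, t2.b
    FROM t1 JOIN t2 ON t2.t1_id = t1.id
    ORDER BY t1.id
    LIMIT ? OFFSET ?
});

while (1) {
    $sth->execute($batch, $offset);
    last if $sth->rows == 0;    # no more batches

    while (my $row = $sth->fetchrow_arrayref) {
        # process one record here
    }
    $offset += $batch;          # each batch costs more than the last
}

$dbh->disconnect;
```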
I am confused now: which of the above choices should I use?
Are there better approaches? If so, please help me get out of this problem.