http://www.perlmonks.org?node_id=1064244


in reply to Is it possible to run SQL select in Oracle and SQL Server to get a large recordset and import it straight into an array, looking to avoid using rownext

Also ... what immediately pops into my head is this:

Gee, ‘6000 to 8000 rows’ isn’t that much data ... not at all. Especially not for Oracle. Therefore, why is it “taking forever”? Perhaps this is the true root cause of the problem. I smell a dead mouse somewhere ...

Sure, there are DBI routines that will “fetch an array,” but if there is some kind of bottleneck going on here, those would probably take forever, too. Consider, therefore, that something in your algorithm, or in the exact way that you are connecting to Oracle and issuing the query, might be causing an extreme but altogether unjustified slowdown.
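
For reference, here is a minimal sketch of that “fetch an array” style using DBI’s selectall_arrayref. The DSN, credentials, table, and column names are placeholders, and RowCacheSize is just one DBD::Oracle knob for batching fetches; adjust all of it to your own environment.

    use strict;
    use warnings;
    use DBI;

    # Hypothetical DSN, credentials, table, and columns -- substitute your own.
    my $dbh = DBI->connect( 'dbi:Oracle:MYDB', 'scott', 'tiger',
                            { RaiseError => 1, AutoCommit => 1 } );

    # Hint to DBD::Oracle to fetch rows from the server in larger batches.
    $dbh->{RowCacheSize} = 5_000;

    # One call returns the whole result set as a reference to an
    # array of array references -- no per-row "next" loop in your code.
    my $rows = $dbh->selectall_arrayref(
        'SELECT col_a, col_b FROM some_table WHERE some_flag = ?',
        undef,   # no extra statement attributes
        'Y',     # bind value for the placeholder
    );

    printf "Fetched %d rows\n", scalar @$rows;
    $dbh->disconnect;

But again: 6000 to 8000 rows fetched this way should come back in the blink of an eye, so if it does not, the fetch call itself probably is not your problem.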

Great example of this (from another context): I was using SQLite basically for the first time, and it was dog slow. Knowing SQLite’s reputation for speed, I didn’t know what I was doing wrong ... until I saw that disk I/O was going through the roof for this process, especially writes. It turns out that, with this system, you basically must use explicit transactions, even for reads. So I made that one change, and whoosh!! No amount of “diddling with my code” would have made any difference at all. Perhaps a similar thing is going on here ... and, mind you, in this case it could very well have something to do with transactions, isolation levels, repeatable reads, and so forth. Oracle, like all complex software, “makes certain assumptions” that you might not even know are in play, but that can have a profound impact on your software’s performance.
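
To make the SQLite anecdote concrete, here is a minimal sketch, assuming DBD::SQLite and a throwaway table; the file name, table, and row count are made up for the example. The whole point is the begin_work / commit pair around the loop: without it, every statement becomes its own implicit transaction, and disk I/O goes through the roof exactly as described above.

    use strict;
    use warnings;
    use DBI;

    # Hypothetical database file and table -- purely illustrative.
    my $dbh = DBI->connect( 'dbi:SQLite:dbname=example.db', '', '',
                            { RaiseError => 1, AutoCommit => 1 } );

    $dbh->do('CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT)');

    my $sth = $dbh->prepare('INSERT INTO items (name) VALUES (?)');

    # Wrap the whole batch in ONE transaction.  Without this, each execute()
    # is its own implicit transaction and forces its own flush to disk.
    $dbh->begin_work;
    $sth->execute("item_$_") for 1 .. 10_000;
    $dbh->commit;

    $dbh->disconnect;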