http://www.perlmonks.org?node_id=967544

sophate has asked for the wisdom of the Perl Monks concerning the following question:

Dear Monks,

I have a huge database table, over 30GB. I would like to write a Perl script that reads the table row by row and processes the data after reading each row.

I've written the script below, but I run into a memory problem right away: the script eats up all the RAM on the server when it tries to load the whole table into an array. Here is the code.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

$ENV{'FREETDSCONF'} = '/home/test/config/freetds.conf';
$ENV{'SYBASE'}      = '/home/test/sql_server_15_0-64bit';

my $SERVER = 'TEST';
my $DB     = 'test';
my $USER   = 'admin';
my $PASSWD = 'admin';

my $DB_Ref = DBI->connect("DBI:Sybase:server=$SERVER;database=$DB",
    $USER, $PASSWD, { syb_err_handler => \&syb_err_h })
    or die "Couldn't connect to $DB: " . DBI->errstr;
$DB_Ref->{RaiseError} = 1;
$DB_Ref->{PrintError} = 1;
$DB_Ref->do("set dateformat ymd");

## The table testTable is over 30GB.
## The line below eats up all RAM on the server.
my $testArray = $DB_Ref->selectall_arrayref('select * from testTable');

## Do something on each row here

Can I fetch the results of the query one row at a time? Something like an input stream, so that I can process the results row by row with something like while (<QUERY_STREAM>) { ... } to save memory? Many thanks.
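What you're describing is DBI's standard prepare/execute/fetch idiom: instead of selectall_arrayref (which pulls the whole result set into memory), you prepare a statement handle and call fetchrow_arrayref in a loop, so only one row is held at a time. Below is a minimal sketch of that loop. It uses an in-memory SQLite database (DBD::SQLite) with a tiny stand-in testTable so it runs on its own; with the DBI::Sybase handle from the script above, the prepare/execute/while loop is identical.

```perl
#!/usr/bin/perl
# Sketch: process a query result row by row instead of slurping it all.
# Assumes DBD::SQLite is installed; the two-row testTable here is only
# a stand-in for the real 30GB table.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1, AutoCommit => 1 });

# Hypothetical stand-in for the real testTable.
$dbh->do('CREATE TABLE testTable (id INTEGER, name TEXT)');
$dbh->do(q{INSERT INTO testTable VALUES (1, 'alpha'), (2, 'beta')});

my $sth = $dbh->prepare('SELECT * FROM testTable');
$sth->execute;

# Only the current row is in memory at any point. Note that DBI may
# reuse the same arrayref on each call, so copy the values out if you
# need to keep them past this iteration.
my $count = 0;
while (my $row = $sth->fetchrow_arrayref) {
    my ($id, $name) = @$row;    # do your per-row processing here
    print "$id\t$name\n";
    $count++;
}
print "Processed $count rows\n";

$sth->finish;
```

If you prefer named columns, fetchrow_hashref works the same way (while (my $r = $sth->fetchrow_hashref) { ... $r->{name} ... }), at a small speed cost per row.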