PerlMonks
DBI Process.

by santhosh_89 (Scribe)
on Sep 05, 2009 at 06:01 UTC ( #793658=perlquestion )
santhosh_89 has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks,

I have written a Perl program to load data from a PostgreSQL database. My table has more than one crore (ten million) rows, and I want to load the data into an array reference. What will it do? Will it cause any memory-related problem if it has a large amount of data? I have pasted the source code below. I have only tried it with a small amount of data; I want to know what will happen when it fetches more than one crore rows.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Data::Dumper;

my $dbh = DBI->connect( 'dbi:Pg:database=san', 'san', 'psql' )
    or die "Can't connect: $DBI::errstr";
my $stmt = "SELECT * FROM public.log_table";
my $ref  = $dbh->selectall_arrayref($stmt);
print Dumper($ref);

Re: DBI Process.
by ikegami (Pope) on Sep 05, 2009 at 06:08 UTC

    will it cause any memory-related problem if it has a large amount of data

    I guess that depends on your definition of large.

Re: DBI Process.
by Sewi (Friar) on Sep 05, 2009 at 10:13 UTC
    Sorry, but I don't see any unanswered questions in your post.

    What will it do?
    Your answer: print Dumper($ref)
    Assuming that you wrote at least two test records to your table, Dumper will show you a more exact answer than anybody could write here.
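
    For two hypothetical test records, selectall_arrayref returns a reference to an array of row arrayrefs, so the Dumper output would look along these lines (column values invented purely for illustration):

```
$VAR1 = [
          [ 1, 'first log entry' ],
          [ 2, 'second log entry' ]
        ];
```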

    will it cause any memory-related problem if it has a large amount of data
    Oh, sorry, this sounds like a question, but the answer is (a little bit hidden) also in your post:
    selectall_arrayref
    This fetches all the data and puts it into arrays. Perl stores arrays in RAM, so the answer is yes: if your data, including Perl's array overhead, is larger than your RAM, you'll be in trouble.
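
    To put rough numbers on that, here is a back-of-the-envelope sketch. The per-row overhead figure is an assumption for illustration only; the real cost depends on column count and types:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# one crore = 10 million rows
my $rows = 10_000_000;

# Hypothetical guess: ~200 bytes of Perl structure overhead per row
# (arrayref, SV headers, etc.), before counting the data itself.
my $overhead_per_row = 200;

my $gb = $rows * $overhead_per_row / 2**30;
printf "roughly %.1f GB of overhead alone\n", $gb;
```

    Even before the actual column data, the structure overhead alone runs into gigabytes at that row count.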

    Look at the DBI manpage (POD or CPAN page); there is a pretty good example of fetching a huge amount of data with little memory using prepare/execute/fetch*. Whenever you're wondering "will it cause any memory-related problem if it has a large amount of data", this approach will be your friend.
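
    The prepare/execute/fetch pattern looks roughly like this. This sketch uses an in-memory SQLite database (via DBD::SQLite) so it is self-contained and runnable; against PostgreSQL you would keep the OP's 'dbi:Pg:database=san' DSN, and the table/column names here are invented for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# In-memory SQLite database so the sketch is self-contained;
# with PostgreSQL, substitute the OP's dbi:Pg DSN and credentials.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
                        { RaiseError => 1 } );

$dbh->do('CREATE TABLE log_table (id INTEGER, msg TEXT)');
$dbh->do(q{INSERT INTO log_table VALUES (1, 'first'), (2, 'second')});

# prepare/execute/fetch: only one row is held in memory at a time,
# unlike selectall_arrayref, which loads everything at once.
my $sth = $dbh->prepare('SELECT id, msg FROM log_table');
$sth->execute;

my $seen = 0;
while ( my $row = $sth->fetchrow_arrayref ) {
    $seen++;
    # process @$row here; note fetchrow_arrayref reuses
    # the same arrayref on each call, so copy if you keep it
}
print "processed $seen rows\n";
$dbh->disconnect;
```

    Peak memory stays constant no matter how many rows the table holds, since each row is processed and then discarded.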
    If you really, really need the data as ONE Perl structure (string, array, hash), consider tie'ing it to a temporary file. This makes things slower, but the limit becomes your free disk space rather than your free memory.
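
    A minimal sketch of the tie approach, using the core Tie::File module (the pushed rows are invented for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Tie::File;                 # core module
use File::Temp qw(tempfile);

# Tie an array to a temp file: the elements live on disk,
# and only a small cache stays in RAM.
my ( undef, $tmpname ) = tempfile();
tie my @rows, 'Tie::File', $tmpname
    or die "Cannot tie $tmpname: $!";

push @rows, join( "\t", 1, 'first log line' );
push @rows, join( "\t", 2, 'second log line' );

my $count = scalar @rows;
print "$count rows stored on disk\n";

untie @rows;
unlink $tmpname;
```

    You still pay disk I/O on every access, so this only makes sense when the structure genuinely cannot fit in RAM.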

Node Type: perlquestion [id://793658]
Approved by ikegami