 
PerlMonks  

Storable: where is my memory?

by ph0enix (Friar)
on Dec 12, 2002 at 17:31 UTC ( [id://219382] )

ph0enix has asked for the wisdom of the Perl Monks concerning the following question:

Working on a project, I ran into a problem with memory requirements. Using tied hashes, I discovered that my code eats more and more memory: each iteration through all the data in the hash costs additional memory. After some experiments I arrived at the following code, which demonstrates the behaviour. It looks like a problem with Storable.

What goes on? Where is my memory?

#!/usr/bin/perl -w
use strict;
use Set::Scalar;
use Storable qw(nfreeze thaw);

my $iterations = 20;
my $set_count  = 100;
my $data       = 'abcdefgh';

srand();

for (my $cycle = 0; $cycle < $iterations; $cycle++) {
    {
        my $tmp = $data;
        my $r_arr;
        $tmp .= chr(int(rand(26)) + 65);
        for (my $set = 0; $set < $set_count; $set++) {
            my $scalar_set = Set::Scalar->new(split(//, $tmp));
            push @$r_arr, thaw( nfreeze($scalar_set) );
        }
    }
    &procinfo;
}

exit 0;

sub procinfo {
    my @stat;
    open( STAT, '<', "/proc/$$/stat" ) or die "Unable to open stat file";
    @stat = split /\s+/, <STAT>;
    close( STAT );
    printf "Vsize: %3.2f MB\t", $stat[22] / 1048576;
    print "RSS : $stat[23] pages\n";
}

Replies are listed 'Best First'.
Re: Storable: where is my memory?
by djantzen (Priest) on Dec 12, 2002 at 17:59 UTC

    Why are you doing an nfreeze and immediate thaw on your data? From the docs: "Note that freezing an object structure and immediately thawing it actually achieves a deep cloning of that structure."

    I believe you're seeing greater memory consumption because you are creating totally distinct copies of each instance of Set::Scalar.
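A minimal sketch of the point above (using a plain nested hash rather than Set::Scalar, for illustration): the thawed result shares no memory with the original, all the way down.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Storable qw(nfreeze thaw);

# freeze-then-thaw is documented as a deep clone: every level of the
# structure is rebuilt in freshly allocated memory.
my $orig  = { sets => [ [ 'a' .. 'e' ] ] };
my $clone = thaw( nfreeze($orig) );

print "distinct top-level refs\n" if $clone        != $orig;          # new hashref
print "distinct inner refs\n"     if $clone->{sets} != $orig->{sets}; # inner array copied too
print "same contents: $clone->{sets}[0][0]\n";                        # data is identical
```

So 2000 freeze/thaw round-trips mean 2000 full deep copies, each with its own allocation.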

Re: Storable: where is my memory?
by iburrell (Chaplain) on Dec 12, 2002 at 21:52 UTC
    You are making lots (2000) of Set::Scalar objects, and then duplicating them needlessly with Storable. Storable probably uses lots of temporary structures, not to mention the overhead of all the little scalars.

    Why are you deep-copying the Set::Scalar object? Why are you using Storable when Set::Scalar includes copy() and clone() methods?
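A sketch of that suggestion, assuming Set::Scalar's documented copy() method: the class can clone itself without a Storable round-trip.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Set::Scalar;

# Clone a set with the class's own API instead of nfreeze/thaw.
my $set  = Set::Scalar->new( split //, 'abcdefgh' );
my $copy = $set->copy;      # independent deep copy

$copy->insert('z');         # mutating the copy...
print $set->size,  "\n";    # ...leaves the original's 8 members untouched
print $copy->size, "\n";    # copy now has 9
```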

      Well, this piece of code is meant to simulate what happens when the tied hash is used. I have a hash tied to a PostgreSQL database, so the key-value pairs are stored in the database and do not reside in memory. Because the stored values are deep structures (in the simplest case a stored value is a list of Set::Scalar objects), the Storable module is used to convert them to and from scalars.

      nfreeze is called when a value is stored to the database, and thaw when it is retrieved. I expected all temporary variables to be freed when they go out of scope, but it looks like they are not. The procinfo function is called when all variables ($tmp, $r_arr, $set and $scalar_set) are already out of scope, so I expected the displayed values to stay the same...
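A hedged sketch for checking that expectation: a weak reference (Scalar::Util::weaken) becomes undef once its referent is really freed, so it can confirm the lexicals are destroyed at scope exit. Note, though, that perl returns freed memory to its own allocator rather than to the OS, so /proc Vsize is a high-water mark and can stay high even when everything was freed.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

my $weak;
{
    my $tmp = { big => [ 1 .. 1000 ] };   # stands in for $r_arr etc.
    $weak = $tmp;
    weaken($weak);                        # does not keep $tmp alive
}
# $tmp's refcount hit zero at the closing brace, so the weak ref is gone:
print defined($weak) ? "still alive\n" : "freed\n";
```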

      My big problem is memory consumption when I iterate through all the keys of this tied hash. For a relatively small set of test data the first iteration costs about 750 MB, and each further cycle makes perl require about 200 MB more, so after the third cycle more than 1 GB is allocated!

      Even a simple empty loop over this tied hash causes the memory consumption:

      while (my ($key, $value) = each %data) { ; }
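A minimal, hypothetical sketch of the pattern being described (the real backing store is PostgreSQL; here a plain in-memory hash of frozen strings stands in for it): because FETCH thaws on every access, each trip through the loop above materialises a brand-new deep copy of every value.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Storable qw(nfreeze thaw);
use Tie::Hash;

package TiedDemo;
our @ISA = ('Tie::StdHash');

# STORE keeps only the frozen string; FETCH thaws a fresh structure
# each time, so no two fetches ever return the same reference.
sub STORE { my ($self, $key, $val) = @_; $self->{$key} = nfreeze($val) }
sub FETCH { my ($self, $key) = @_; thaw( $self->{$key} ) }

package main;

tie my %data, 'TiedDemo';
$data{x} = { list => [ 1 .. 5 ] };

my $first  = $data{x};
my $second = $data{x};
print "fresh copy per FETCH\n" if $first != $second;
```

With 2000 sets per pass, that is 2000 new deep structures allocated on every iteration, which would match the growth being observed.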

Approved by valdez