PerlMonks  

Re: Loading large amount of data in a hash

by maverick (Curate)
on May 01, 2002 at 22:22 UTC ( #163413 )


in reply to Loading large amount of data in a hash

Right idea. You just have a few little details to sort out. You can't natively store references in DBM files, so you'll have to serialize the arrayrefs down to scalars before you can store them, using something like Storable. Which leads to the second issue: if memory serves, GDBM has a fixed size limit for those scalars... Berkeley DB BTrees do not. Try something like:

use DB_File;
use Fcntl;      # for the O_RDWR and O_CREAT flags
use Storable qw(freeze thaw);

tie %data_parsed, "DB_File", "$hashfil", O_RDWR|O_CREAT, 0666;

# inside read data loop
$data_parsed{$key} = freeze($array_ref);

# inside use data loop
$array_ref = thaw($data_parsed{$key});
DB_File is slow to create, but fast to read. That may or may not be an issue.

HTH

Update

Ya know... after taking another glance at this, 800 is a lot of data. Plus, you have a key-and-array combination. Perhaps it's time to move up to a full-blown database like MySQL or Postgres?
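If you go that route, a minimal sketch via DBI might look like the following. The table name, column names, DSN, and credentials here are all hypothetical placeholders, not anything from the original post; one row per array element also lets you query individual values instead of thawing a whole blob:

```perl
use DBI;

# hypothetical DSN/credentials -- adjust to your setup
my $dbh = DBI->connect("DBI:mysql:database=mydb", "user", "pass",
                       { RaiseError => 1 });

# assumed schema:
#   CREATE TABLE parsed_data (k VARCHAR(255), idx INT, val TEXT)
my $ins = $dbh->prepare(
    "INSERT INTO parsed_data (k, idx, val) VALUES (?, ?, ?)");

# inside read data loop: store each element as its own row
for my $i (0 .. $#$array_ref) {
    $ins->execute($key, $i, $array_ref->[$i]);
}

# inside use data loop: pull the array back in order
my $rows = $dbh->selectcol_arrayref(
    "SELECT val FROM parsed_data WHERE k = ? ORDER BY idx",
    undef, $key);
```

No freeze/thaw needed this way, and an index on `k` keeps the lookups fast.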

/\/\averick
OmG! They killed tilly! You *bleep*!!

