PerlMonks  

Re: Hash of Hashes from file

by scorpio17 (Monsignor)
on Apr 03, 2012 at 13:04 UTC ( #963238=note )


in reply to Hash of Hashes from file

You need a hash-of-arrays-of-hashes:

push( @{ $hoh{$user} }, { Website => $site, Category => $cat, });
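A minimal sketch of that structure in use. The input field layout (comma-separated `user,website,category`) is an assumption here, since the original file format isn't shown:

```perl
use strict;
use warnings;

# Sample input lines; in the real script these would come from the file.
my @lines = (
    "alice,example.com,news",
    "alice,example.org,blog",
    "bob,example.net,shop",
);

# Build the hash of arrays of hashes: each user key holds an array ref,
# and each element of that array is one record (a hash ref).
my %hoh;
for my $line (@lines) {
    my ($user, $site, $cat) = split /,/, $line;
    push @{ $hoh{$user} }, { Website => $site, Category => $cat };
}

# Walk the structure: every record for a user stays paired together.
for my $user (sort keys %hoh) {
    for my $rec (@{ $hoh{$user} }) {
        print "$user: $rec->{Website} ($rec->{Category})\n";
    }
}
```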


Re^2: Hash of Hashes from file
by cipher (Acolyte) on Apr 03, 2012 at 13:09 UTC
    Thanks for the quick reply. I was avoiding arrays because the file is very large (sometimes 4-5 GB), and with arrays the script ends up running out of memory. I will try this on my file and post the results.

      Ultimately if you want to allow more than one website per user, you're going to want an array layer in there: a hash of arrays of hashes. (You could replace the array layer with something functionally equivalent, like another hash layer, or Set::Object, but I see little point in doing that.)

      If you can't hold it all in memory, you're going to have to rethink your technique. Might it be possible to sort (or split) the file per-user, and then process the data one user at a time?

      perl -E'sub Monkey::do{say$_,for@_,do{($monkey=[caller(0)]->[3])=~s{::}{ }and$monkey}}"Monkey say"->Monkey::do'
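A sketch of that streaming idea: if the file is pre-sorted by user (e.g. with the Unix `sort` command), only one user's records need to be in memory at a time. The `user,website,category` field layout and the `process_user` helper are illustrative assumptions:

```perl
use strict;
use warnings;

# Sample pre-sorted input; in practice this would be a line-by-line
# read of the sorted multi-gigabyte file.
my @lines = (
    "alice,example.com,news",
    "alice,example.org,blog",
    "bob,example.net,shop",
);

my %count;    # per-user record counts, for demonstration

# Placeholder for whatever per-user work the real script does.
sub process_user {
    my ($user, $records) = @_;
    return unless @$records;    # skip the empty initial batch
    $count{$user} = scalar @$records;
    print "$user: $count{$user} record(s)\n";
}

# Accumulate one user's records, flush when the user changes.
my @batch;
my $current = '';
for my $line (@lines) {
    my ($user, $site, $cat) = split /,/, $line;
    if ($user ne $current) {
        process_user($current, \@batch);
        $current = $user;
        @batch   = ();
    }
    push @batch, { Website => $site, Category => $cat };
}
process_user($current, \@batch);    # flush the final batch
```

Memory use is now bounded by the largest single user, not the whole file.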
        Before trying hashes, I was using arrays: I mapped out the unique usernames, then grepped each username's lines into a new array, processed the data for user1, reinitialized the array, then grepped and processed the data for user2, and so on. The script ended up with too many foreach loops.

        I was trying to use a hash as a database, and it looks like it doesn't work that way.
      Well, the problem is that with a hash, you can only have one value for each key. If you need to associate multiple values with a key, the solution is to store the values in an array and save the array reference in the hash. If you get to the point where you have more data than you can fit into memory at one time, then you need to look at using a database, like MySQL. Instead of pushing the data from each line of the file into your hash, you would insert it into the database; once all the data is loaded, you can query the database.
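A sketch of that database approach using DBI with SQLite (assuming DBD::SQLite is installed; the table and column names are illustrative). Each input line becomes a row, and queries replace the big in-memory hash:

```perl
use strict;
use warnings;
use DBI;

# An in-memory SQLite database for demonstration; a real script would
# use a file path (dbname=visits.db) so the data survives the load step.
my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do("CREATE TABLE visits (user TEXT, website TEXT, category TEXT)");

# Load phase: one INSERT per input line instead of one push per line.
my $ins = $dbh->prepare("INSERT INTO visits VALUES (?, ?, ?)");
my @lines = (
    "alice,example.com,news",
    "alice,example.org,blog",
    "bob,example.net,shop",
);
$ins->execute( split /,/, $_ ) for @lines;

# Query phase: fetch one user's records on demand instead of holding
# every user's records in RAM at once.
my $rows = $dbh->selectall_arrayref(
    "SELECT website, category FROM visits WHERE user = ?",
    undef, 'alice');
print "alice: $_->[0] ($_->[1])\n" for @$rows;
```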
        Yes, hashes have unique keys. Is it possible to generate a hash like this?
        %hoh = ( $user => { 'Website' => [ website1, website2, website3 ], 'type' => [ type1, type2, type3 ] } );
        As I said, I'm not familiar with hashes; just checking.
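That parallel-arrays layout is legal Perl; a minimal sketch (the site and type names are made up for illustration). The catch is that the two arrays are only related by index, so the array-of-hashes form upthread, which keeps each website paired with its category in one record, is usually less fragile:

```perl
use strict;
use warnings;

my $user = 'alice';

# Parallel arrays: element i of Website and element i of type
# describe the same record.
my %hoh = (
    $user => {
        Website => [ 'site1.com', 'site2.com', 'site3.com' ],
        type    => [ 'news',      'blog',      'shop'      ],
    },
);

# Walk both arrays in lockstep by index.
for my $i ( 0 .. $#{ $hoh{$user}{Website} } ) {
    print "$hoh{$user}{Website}[$i] is $hoh{$user}{type}[$i]\n";
}
```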
