jens has asked for the wisdom of the Perl Monks concerning the following question:
I'm doing a bit of data munging for a client and
I want to suck around 6,000 records from a flat file
into a large hash of hashes. I've RTFM'd and I'm still
struggling--your help would be much appreciated.
Here's what I have so far:
Please help!
--
Microsoft delendum est.
I want to suck around 6,000 records from a flat file
into a large hash of hashes. I've RTFM'd and I'm still
struggling--your help would be much appreciated.
Here's what I have so far:
I've also considered using an array of hashes, but that's also confused me.my %hash_of_hashes = ( my $record_no => my %unitfiles_hash ); #suck all the unit files into a big hash to make searching easier while (<UNITFILES>) { my @unitfiles_field = split /,/; #this is just an autogen number $record = $unitfiles_field[0]; %hash_of_hashes{$record_no} = ( %unitfiles_hash = ( lastname => $unitfiles_field[1], firstname => $unitfiles_field[2], DOB => $unitfiles_field[7], funding => $unitfiles_field[18], URNo => $unitfiles_field[15], Photo_permission => $unitfiles_field[14], ); ); } #end while
Please help!
--
Microsoft delendum est.
Back to
Seekers of Perl Wisdom