http://www.perlmonks.org?node_id=209044

Tanalis has asked for the wisdom of the Perl Monks concerning the following question:

Monks,

I'm in the middle of writing a script to handle a large volume of data, and populate a series of database tables with subsets of that data.

I've been using Data::Dumper to output the data at intervals so that I could validate its consistency against what I expect to have been pulled in.

Now I'm reading some 16,000 data items and hashing them on a variety of keys, so that I end up with a 3-level-deep hash of hashes: %hash -> $key1 -> $key2 -> $key3 -> values.
The final data values are largely scalars, except one, which is a reference to an array of 50 data values per record, covering a series of potential situations that we have to store data on (see the sketch below).
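For reference, the structure I'm building looks roughly like this - the keys and values here are made up for illustration, not the real ones:

    use strict;
    use warnings;

    # a minimal sketch of the structure, assuming hypothetical keys
    my %hash;
    my ($key1, $key2, $key3) = ('INSTR01', '2002-10-30', 'situations');
    my @array = map { $_ * 0.5 } 1 .. 50;    # the 50-value array per record

    $hash{$key1}{$key2}{'price'}  = 10.5;    # most leaves are plain scalars
    $hash{$key1}{$key2}{'volume'} = 200;
    $hash{$key1}{$key2}{$key3}    = \@array; # one leaf is an array reference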

I have attempted to output, using print Dumper \%hash, both the entire hash and subsets of the hash (even down to an individual record), but it appears that Data::Dumper simply can't cope with the structure of the data. The script hangs, waiting for Dumper to return, and basically just sits there eating system resources until the box locks up.
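The calls I've been trying look more or less like this (again with placeholder keys):

    use Data::Dumper;

    # the whole structure
    print Dumper \%hash;

    # progressively smaller subsets, down to a single record
    print Dumper $hash{$key1};
    print Dumper $hash{$key1}{$key2};
    print Dumper $hash{$key1}{$key2}{$key3};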

Data::Dumper works fine if I prevent the final array of values from being added to the hash. I have gone ahead and added the array data to the database, and validated it that way, so I am 100% certain that the array is being assigned correctly to the hash ($hash{$key1}{$key2}{$key3} = \@array;).

Basically, I'm interested to know if anyone else has experienced a similar issue with Data::Dumper and very deep HoHs and HoHoAs (hashes of hashes of arrays). If anyone's aware of a workaround, that would be very useful too - it's vital that I have some convenient method of data validation.

I'd like to post some code regarding the population of the hash, but it's huge (750+ lines) and I don't see the necessity. I'm certain the hash is populated correctly, as verified via the database; I'd just welcome a way of validating that data before it gets to the DB, which is what Data::Dumper was previously being used for.

Cheers ..
-- Foxcub