http://www.perlmonks.org?node_id=433573


in reply to In need of a Dumper that has no pretentions to being anything else.

Remember, many people do like Data::Dumper because it does presume to be a serialization standard, albeit in text form. It is mission critical for a lot of small glue applications out there.
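
For the glue-application case, the workflow is usually nothing fancier than dumping a structure to text and eval'ing it back later. A minimal sketch of that round trip (the config structure here is made up for illustration):

    use strict;
    use warnings;
    use Data::Dumper;

    my %config = ( host => 'localhost', port => 8080, retries => [ 1, 2, 5 ] );

    # Purity guarantees the text can be eval'd back into an equivalent structure.
    local $Data::Dumper::Purity = 1;
    my $text = Dumper( \%config );      # "$VAR1 = { ... };" in plain text

    # Later, or in another process entirely, reconstitute it.
    my $VAR1;
    my $copy = eval $text;
    die "could not reload dump: $@" if $@;
    print "port is $copy->{port}\n";    # port is 8080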

If I read fergal's patch right, then you'd be removing the circular reference protection for everyone who turns off Deepcopy. Just because BrowserUk doesn't break up his data for analysis doesn't mean everyone should have a new behavior here. DD should still protect the iterator from cycling around forever, even if it doesn't produce reference syntax in its output.
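
For anyone who hasn't watched it happen, this is what that protection buys you; the tiny structure below is only a demonstration:

    use strict;
    use warnings;
    use Data::Dumper;

    my $node = { name => 'root' };
    $node->{self} = $node;              # a circular reference

    local $Data::Dumper::Purity = 1;
    print Dumper($node);
    # DD notices the cycle instead of recursing forever; with Purity on,
    # the output is roughly:
    #   $VAR1 = { 'name' => 'root', 'self' => {} };
    #   $VAR1->{'self'} = $VAR1;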

I once added a fourth format style to Data::Dumper, which tried to fit whole arrays on one line where possible, or to word-wrap the elements onto a minimum number of lines when they were non-references.

But two bad things happened: (1) I lost that patch somewhere along the way, and (2) the stock DD now defaults to a native (XS) implementation rather than the pure-Perl code, so patching the Perl version is largely wasted effort.
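
For reference, and not to be confused with that lost patch, the stock Indent settings are the closest knobs available today; the array below is only there to show the difference:

    use strict;
    use warnings;
    use Data::Dumper;

    my @mixed = ( [ 1 .. 8 ], { a => 1 }, 'plain' );

    {
        local $Data::Dumper::Indent = 0;    # everything on a single line
        print Dumper( \@mixed ), "\n";
    }
    {
        local $Data::Dumper::Indent = 1;    # fixed indent per nesting level
        print Dumper( \@mixed );
    }
    # The default, Indent = 2, lines elements up under their opening
    # bracket, which is what makes large flat arrays so tall.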

I think the answer here is not to wade through 500_000 elements written in Perl syntax, looking for a programmatic error on your part. Instead, formulate some theories as to the fault and analyze the structure with a couple of lines of Perl, or find a smaller dataset which exhibits the fault.
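
In other words, test a theory directly rather than reading the dump. The structure and the planted fault below are invented purely to show the shape of that kind of check:

    use strict;
    use warnings;

    my @records = map { { id => $_, value => $_ * 2 } } 1 .. 500_000;
    $records[123_456]{value} = undef;   # plant a fault so there is something to find

    # Theory: some records have an undefined value.
    my @suspect = grep { !defined $records[$_]{value} } 0 .. $#records;
    print "suspect indices: @suspect\n";   # suspect indices: 123456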

Also, turn on Terse, turn off Purity, and consider overloading the stringification operator for certain objects so you don't get so much bless(do{...},'Math::Pari') noise.
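
A hedged sketch of those three suggestions together. DemoNum below merely stands in for something like Math::Pari, and since Data::Dumper will not call string overloading for you, the helper flattens the objects to their overloaded string form before dumping:

    use strict;
    use warnings;
    use Data::Dumper;

    {
        package DemoNum;
        use overload '""' => sub { 'DemoNum(' . ${ $_[0] } . ')' }, fallback => 1;
        sub new { my ( $class, $n ) = @_; bless \$n, $class }
    }

    local $Data::Dumper::Terse  = 1;    # drop the "$VAR1 = " wrapper
    local $Data::Dumper::Purity = 0;    # skip the eval-safe fix-up statements

    my $data = { x => DemoNum->new(42), y => [ DemoNum->new(7), 'plain' ] };

    # Replace overloaded objects with their string form, recursively.
    sub flatten {
        my ($thing) = @_;
        return "$thing" if ref $thing && overload::Overloaded($thing);
        return { map { $_ => flatten( $thing->{$_} ) } keys %$thing } if ref $thing eq 'HASH';
        return [ map { flatten($_) } @$thing ] if ref $thing eq 'ARRAY';
        return $thing;
    }

    print Dumper( flatten($data) );
    # prints something like { 'x' => 'DemoNum(42)', 'y' => [ 'DemoNum(7)', 'plain' ] }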

--
[ e d @ h a l l e y . c c ]