Re: Out Of Memory error at 950MB with 14GB free RAM
by mattr (Curate)
on Feb 10, 2004 at 17:05 UTC ( [id://327966] )
Does your program give correct results on a smaller dataset? (i.e., have you used unit tests in development?) I think this is the most important question. Assuming the algorithm works and is not reducible to a flatter, less pathological data structure...
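For instance, a minimal Test::More sketch along those lines might look like this; build_structure() is just a hypothetical stand-in for whatever routine assembles your big structure:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Test::More tests => 2;

# Hypothetical stand-in for the routine that builds the big structure;
# swap in the real one and feed it a cut-down slice of the real input.
sub build_structure {
    my @records = @_;
    my %index;
    push @{ $index{ $_->[0] } }, $_->[1] for @records;
    return \%index;
}

my $small = build_structure( [ a => 1 ], [ a => 2 ], [ b => 3 ] );
is_deeply( $small->{a}, [ 1, 2 ], 'key "a" collects both values' );
is_deeply( $small->{b}, [ 3 ],    'key "b" collects one value' );
```

If that passes on a small slice of the real data, you at least know the blowup is a matter of scale, not correctness.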
It also looks like you are only maxing out a single CPU, so you could put another CPU or two to work by virtualizing your data structure into a database, or onto a ramdisk / RAM cache that could be shared more easily. As others are saying, Perl can chew up two or three times the size of the raw data, but you are reporting more than what people generally get, I think.

And if there's a good reason for the memory snarfing, like, I dunno, storing everything as ASCII, or a relatively small number of substructures that are repeated often, then you can probably trade cycles for space using packing, serializing, and indexing shortcuts. I'm thinking of a certain professor I once listened to, who apparently came up with an immensely fast genome pattern matcher, partly due to the algorithm and partly due to having guys who could use lots of intelligent programming tricks.

So you might put some more time into thinking about the implementation, and before that about how the problem could be mathematically reduced. You might consider sharing a little of the problem with us, or at least consider how it might translate to a database, or whether you could spend a long time compiling a data structure that can be searched quickly afterwards.
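To make the packing and database ideas concrete, here is a rough sketch. The NNf record layout, the fetch() helper, and the /tmp/bigdata.db path are all hypothetical stand-ins, not anything from your problem, and the second half assumes DB_File is installed:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl;      # O_CREAT, O_RDWR for the DB_File example below
use DB_File;    # ships with most Perl builds

# --- Trading cycles for space with pack() ------------------------
# Two unsigned 32-bit ints and a native float per record, packed
# into one flat string instead of 100,000 tiny arrayrefs (each of
# which costs well over 100 bytes of Perl overhead).
my $fmt   = 'NNf';
my $recsz = length pack( $fmt, 0, 0, 0 );    # bytes per record

my $buffer = '';
$buffer .= pack( $fmt, $_, $_ * 2, $_ / 3 ) for 0 .. 99_999;

# Random access stays cheap: unpack only the record you need.
sub fetch {
    my ($i) = @_;
    return unpack( $fmt, substr( $buffer, $i * $recsz, $recsz ) );
}
my ( $id, $count, $score ) = fetch(42);
print "record 42: id=$id count=$count score=$score\n";

# --- Virtualizing to an on-disk (or ramdisk) database -------------
# A tied hash keeps the data off the Perl heap entirely; point the
# file at a ramdisk mount if lookups need to stay fast.
tie my %big, 'DB_File', '/tmp/bigdata.db', O_CREAT | O_RDWR, 0644, $DB_BTREE
    or die "tie failed: $!";
$big{some_key} = pack( $fmt, 1, 2, 3.0 );    # packed values work here too
untie %big;
```

The flat string costs an unpack on every read but shrinks per-record overhead to nothing, while the tied hash trades a disk (or ramdisk) fetch per lookup for a nearly flat Perl heap.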
By the way! I am also interested in: