PerlMonks
Re: OUT OF MEMORY!
by reyjrar (Hermit) on Aug 23, 2001 at 01:52 UTC ( [id://107162] )
I've experienced out-of-memory problems when code I've written has gotten stuck in an infinite loop, or when I accidentally pushed the contents of the same array onto itself multiple times.
A few issues I have with this code. You can't ultimately guarantee the size of any of these files, so it's best not to read them all in at the same time. Instead of slurping a whole file at once, try processing it chunk by chunk using the scalar <>, or read(), or even sysopen/sysread if you want to get complicated.
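The before/after code the post refers to didn't survive, but a minimal sketch of the two approaches might look like this (the file name and demo data are illustrative, not from the original post):

```perl
use strict;
use warnings;

# Illustrative setup: create a small demo file to read back.
my $filename = 'demo.txt';
open my $out, '>', $filename or die "can't write $filename: $!";
print $out "line $_\n" for 1 .. 3;
close $out;

# Memory-hungry: @lines holds every line of the file at once.
open my $fh, '<', $filename or die "can't open $filename: $!";
my @lines = <$fh>;
close $fh;
print scalar(@lines), " lines slurped\n";    # prints "3 lines slurped"

# Frugal: the scalar readline (<$fh>) hands back one line per loop
# iteration, so only the current line is ever held in memory.
open $fh, '<', $filename or die "can't open $filename: $!";
while ( my $line = <$fh> ) {
    chomp $line;    # ... process $line here ...
}
close $fh;
unlink $filename;   # clean up the demo file
```

For very large records you'd swap the line loop for read() with a fixed buffer size, but the shape is the same: hold one chunk at a time, not the whole file.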
Another thing that bothers me about this code is that you're not cleaning everything up, so you have multiple copies of those files in memory at any given time, and that could be adding up. Say @filenames takes up 128KB; if you join it into $filename without destroying the array, you've now used 256KB. If you only want to operate on the scalar, empty the array (@filenames = ()) to get some memory back. Also, if you're just going to join things anyway, why not use the string concatenation operator instead, which I believe shortens the code even more.
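The concatenation examples were lost from the post; a small sketch of both points (the @chunks data is a stand-in, not the original @filenames contents):

```perl
use strict;
use warnings;

my @chunks = ( 'foo', 'bar', 'baz' );    # stand-in data

# Two copies in memory at once: the array AND the joined scalar.
my $joined = join '', @chunks;
@chunks = ();    # empty the array to let Perl reclaim that memory

# Shorter still: skip the intermediate array entirely and
# concatenate with .= as each piece arrives.
my $joined2 = '';
$joined2 .= $_ for ( 'foo', 'bar', 'baz' );
print "$joined2\n";    # prints "foobarbaz"
```

Emptying the array marks its storage as reusable within the process; the point is simply not to keep both the pieces and the assembled string alive when you only need one of them.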
Anyways, play with it some more, and use offline mode to do some debugging: put 'print' statements all over your loops. Your data set may contain something you haven't thought of, and it sounds to me like the program is looping infinitely because of it. Anyways, good luck with it, and let us know how it turns out. -brad..
In Section: Seekers of Perl Wisdom