Re: Avoiding Out of Memory Problem

by FitTrend (Pilgrim)
on Mar 17, 2005 at 21:54 UTC ( [id://440558] )


in reply to Avoiding Out of Memory Problem

Good point. However, if disk IO and/or time to complete the task is the issue, you may want to consider a hybrid system that processes larger files differently. You would need to find the sweet spot that separates a large file from a small one. Then it's simply a matter of stat'ing the file size to decide which method to use (see the sketch below).
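
A minimal sketch of that idea (the 10 MB cutoff and the two handler subs are just assumed names for illustration, not anything from the original post):

    use strict;
    use warnings;

    # Hypothetical cutoff; tune it to wherever your own sweet spot falls.
    my $SWEET_SPOT = 10 * 1024 * 1024;    # 10 MB

    sub process_file {
        my ($path) = @_;
        my $size = -s $path;              # stat the file size
        defined $size or die "Cannot stat $path: $!";

        if ($size > $SWEET_SPOT) {
            process_large($path);         # e.g. read line by line, constant memory
        }
        else {
            process_small($path);         # e.g. slurp it and build the hash in one go
        }
    }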

I too have a program that collects statistics and stores them in a hash. However, I routinely write the data to disk at a set interval to avoid running out of memory, roughly as sketched below. Based on what I've learned from my experiments, your hashes must be quite large to suck the memory from a machine.
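
Something along these lines (the flush interval, the stats.log file name, and the per-line counting are assumptions for the sake of a runnable example):

    use strict;
    use warnings;

    my %stats;
    my $FLUSH_EVERY = 60;                 # seconds; pick whatever interval suits you
    my $last_flush  = time;

    while (my $line = <STDIN>) {
        chomp $line;
        $stats{$line}++;                  # whatever counting your program actually does

        if (time - $last_flush >= $FLUSH_EVERY) {
            flush_stats(\%stats);
            %stats      = ();             # free the memory we just wrote out
            $last_flush = time;
        }
    }
    flush_stats(\%stats);                 # write whatever is left at the end

    sub flush_stats {
        my ($stats) = @_;
        open my $fh, '>>', 'stats.log' or die "Cannot open stats.log: $!";
        print {$fh} "$_\t$stats->{$_}\n" for sort keys %$stats;
        close $fh;
    }

Emptying the hash after each flush is what keeps the memory footprint bounded, at the cost of having to merge the periodic snapshots later if you need running totals.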

This of course depends on your ultimate goal for your application.
