
Re: Avoiding Out of Memory Problem

by FitTrend (Pilgrim)
on Mar 17, 2005 at 21:54 UTC ( #440558 )

in reply to Avoiding Out of Memory Problem

Good point. However, if disk I/O and/or time to complete the task is the issue, you may want to consider a hybrid system that processes larger files differently. You would need to find the sweet spot that distinguishes a large file from a small one. Then it's simply a matter of checking the file size (stat-ing the file) to decide which method should process it.
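The hybrid dispatch above might look something like this sketch. The threshold value and the two handler subs are hypothetical placeholders, not anything from the original post; the idea is just to stat the file and pick a strategy:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical "sweet spot" -- tune this cutoff for your own workload.
my $THRESHOLD = 10 * 1024 * 1024;    # 10 MB (assumed)

# Dispatch on file size: slurp small files, stream large ones.
sub process_file {
    my ($path) = @_;
    my $size = -s $path;             # file size in bytes, via stat
    die "cannot stat $path: $!" unless defined $size;

    return $size < $THRESHOLD
        ? process_small($path)       # whole-file, in-memory method
        : process_large($path);      # streaming, low-memory method
}

# Hypothetical small-file handler: read the whole file at once.
sub process_small {
    my ($path) = @_;
    open my $fh, '<', $path or die "open $path: $!";
    local $/;                        # slurp mode
    my $data = <$fh>;
    close $fh;
    return length $data;             # placeholder "work"
}

# Hypothetical large-file handler: one line at a time.
sub process_large {
    my ($path) = @_;
    open my $fh, '<', $path or die "open $path: $!";
    my $bytes = 0;
    $bytes += length $_ while <$fh>; # placeholder "work"
    close $fh;
    return $bytes;
}
```

Both handlers do the same dummy work here, so the only difference is peak memory use; in a real program the small-file path could build its whole hash in RAM while the large-file path aggregates incrementally.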

I too have a program that collects statistics and stores them in a hash. However, I routinely write the data to disk at a set interval to avoid running out of memory. Based on what I've learned from my experiment, your hashes must be quite large to suck the memory from a machine.
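A minimal sketch of that interval-based flushing: the `record`/`flush_stats` names, the tab-separated output format, and the flush interval are all my own assumptions, not the poster's actual code. The point is that emptying the hash after each flush is what caps memory use:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical "set interval": flush after this many recorded updates.
my $FLUSH_EVERY = 1000;

my %stats;        # in-memory statistics, key => count
my $pending = 0;  # updates recorded since the last flush

# Accumulate one statistic; spill the hash to disk when the interval is hit.
sub record {
    my ( $key, $count, $out_path ) = @_;
    $stats{$key} += $count;
    flush_stats($out_path) if ++$pending >= $FLUSH_EVERY;
}

# Append the current counts to disk, then empty the hash to reclaim memory.
sub flush_stats {
    my ($out_path) = @_;
    open my $out, '>>', $out_path or die "open $out_path: $!";
    print {$out} "$_\t$stats{$_}\n" for sort keys %stats;
    close $out;
    %stats   = ();    # release the memory held by the hash
    $pending = 0;
}
```

Because the same key can appear in several flushes, a later pass (or a small merge script) would need to sum the per-key lines to get final totals.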

This of course depends on your ultimate goal for your application.
