Re: Avoiding Out of Memory Problem

by FitTrend (Pilgrim)
on Mar 17, 2005 at 21:54 UTC


in reply to Avoiding Out of Memory Problem

Good point. However, if disk I/O and/or the time to complete the task is the issue, you may want to consider a hybrid system that processes larger files differently. You would need to find the sweet spot that separates a large file from a small one. Then it's simply a matter of checking the file size to decide which method to use.
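Something like this, say (the 50 MB cut-off and the process_small/process_large subs are only placeholders for whatever your two code paths turn out to be):

#!/usr/bin/perl
use strict;
use warnings;

my $THRESHOLD = 50 * 1024 * 1024;    # assumed 50 MB sweet spot; tune for your data

for my $file (@ARGV) {
    my $size = -s $file;             # file size in bytes
    next unless defined $size;       # skip files we cannot stat
    if ( $size > $THRESHOLD ) {
        process_large($file);        # e.g. read line by line, flush results often
    }
    else {
        process_small($file);        # e.g. build the whole hash in memory
    }
}

sub process_small { my ($file) = @_; }    # placeholder
sub process_large { my ($file) = @_; }    # placeholder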

I too have a program that collects statistics and stores them in a hash. However, I routinely write the data out to disk at a set interval to avoid running out of memory. Based on what I've learned from my own experiments, your hashes must be quite large to exhaust a machine's memory.
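A stripped-down version of that interval-flush pattern looks something like this (the 60-second interval and the stats.log filename here are just illustrative, not the values from my program):

#!/usr/bin/perl
use strict;
use warnings;

my %stats;
my $INTERVAL   = 60;      # assumed flush interval in seconds
my $last_flush = time;

while ( my $line = <STDIN> ) {
    chomp $line;
    $stats{$line}++;                      # collect whatever statistic you track

    if ( time - $last_flush >= $INTERVAL ) {
        flush_stats( \%stats );
        %stats      = ();                 # release the memory
        $last_flush = time;
    }
}
flush_stats( \%stats );                   # write whatever is left at EOF

sub flush_stats {
    my ($stats) = @_;
    return unless %$stats;
    open my $fh, '>>', 'stats.log' or die "Cannot append to stats.log: $!";
    print {$fh} "$_\t$stats->{$_}\n" for sort keys %$stats;
    close $fh;
}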

This of course depends on your ultimate goal for your application.

