Re: Gargantuan memory consumption issues

by Corion (Pope)
on Dec 22, 2008 at 23:12 UTC (#732216)


in reply to Gargantuan memory consumption issues

You might be unaware that WWW::Mechanize keeps a history of visited pages. If your script never quits and only ever keeps one WWW::Mechanize object around, that object will accumulate an ever-growing history, consuming more and more memory over time.
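
For illustration, a minimal sketch of the kind of long-running pattern that triggers this (the URL and polling interval are placeholders, not taken from the original post):

    use strict;
    use warnings;
    use WWW::Mechanize;

    my $mech = WWW::Mechanize->new;   # default: effectively unlimited page history

    # Each request pushes the previous page onto the object's
    # history stack (this is what makes back() work), so a loop
    # like this consumes more memory with every iteration.
    while (1) {
        $mech->get('http://example.com/poll');   # placeholder URL
        # ... process $mech->content here ...
        sleep 60;
    }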

Of course, without seeing code, it's hard to tell.


Re^2: Gargantuan memory consumption issues
by runrig (Abbot) on Dec 22, 2008 at 23:53 UTC
    This is the likely culprit. Either set stack_depth() on the Mech object, or periodically destroy the object and create a new one (a sketch of both follows below).
      Thank you, this was indeed the problem.
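
    A minimal sketch of both remedies runrig mentions (the request-count threshold of 1000 is an arbitrary choice for illustration):

        use strict;
        use warnings;
        use WWW::Mechanize;

        # Remedy 1: cap or disable the page history up front.
        # A stack_depth of 0 keeps no history at all; a small
        # positive value keeps only the last N pages for back().
        my $mech = WWW::Mechanize->new( stack_depth => 0 );

        # ...or on an already-created object:
        $mech->stack_depth(0);

        # Remedy 2: periodically discard the object and start fresh.
        my $count = 0;

        sub fetch {
            my ($url) = @_;
            $mech = WWW::Mechanize->new if ++$count % 1000 == 0;
            return $mech->get($url);
        }
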
Re^2: Gargantuan memory consumption issues
by p2409 (Initiate) on Dec 23, 2008 at 07:04 UTC
    +1 on that one. Mechanize is great, but a memory hog.
