
Re: Gargantuan memory consumption issues

by Corion (Pope) on Dec 22, 2008 at 23:12 UTC

in reply to Gargantuan memory consumption issues

You might be unaware that WWW::Mechanize keeps a history of visited pages. If your script never quits and only ever keeps a single WWW::Mechanize object around, that object will accumulate a vast history over time and consume more and more memory.

Of course, without seeing code, it's hard to tell.
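
To make the failure mode concrete, here is a minimal sketch of a long-running crawl that keeps one WWW::Mechanize object for its whole lifetime; the URL loop is an assumption for illustration, not code from the original post:

    use strict;
    use warnings;
    use WWW::Mechanize;

    my $mech = WWW::Mechanize->new();

    # URLs come from the command line here just to keep the sketch
    # self-contained; a real crawler would loop indefinitely.
    for my $url (@ARGV) {
        $mech->get($url);
        # Every successful request is pushed onto the object's page
        # history stack (unlimited by default), so memory keeps
        # growing for as long as the script runs.
    }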


Re^2: Gargantuan memory consumption issues
by runrig (Abbot) on Dec 22, 2008 at 23:53 UTC
    This is the likely culprit. Either set stack_depth() on the Mech object, or periodically destroy the object and create a new one (both approaches are sketched below).
      Thank you, this was indeed the problem.
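
A rough sketch of both suggested fixes; the crawl loop, the use of @ARGV, and the reset interval of 100 requests are assumptions for illustration, not code from this thread:

    use strict;
    use warnings;
    use WWW::Mechanize;

    # Fix 1: limit the page history.  stack_depth(0) keeps no
    # back-history at all; it can also be passed to the constructor
    # as stack_depth => 0.
    my $mech = WWW::Mechanize->new();
    $mech->stack_depth(0);

    my $count = 0;
    for my $url (@ARGV) {
        $mech->get($url);

        # Fix 2 (alternative): discard the object every so often and
        # start fresh so the accumulated pages can be freed.
        if ( ++$count % 100 == 0 ) {
            $mech = WWW::Mechanize->new();
            $mech->stack_depth(0);
        }
    }

With stack_depth(0) in place the periodic reset is not strictly needed; it is shown only because both options were mentioned above.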
Re^2: Gargantuan memory consumption issues
by p2409 (Initiate) on Dec 23, 2008 at 07:04 UTC
    +1 on that one. Mechanize is great, but a hog.
