PerlMonks
Re: Gargantuan memory consumption issues

by Corion (Pope)
on Dec 22, 2008 at 23:12 UTC ( #732216=note )


in reply to Gargantuan memory consumption issues

You might be unaware that WWW::Mechanize keeps a history of visited pages. If your script never quits and only ever keeps one WWW::Mechanize object around, that history will grow without bound and consume more and more memory.

Of course, without seeing code, it's hard to tell.


Replies are listed 'Best First'.
Re^2: Gargantuan memory consumption issues
by runrig (Abbot) on Dec 22, 2008 at 23:53 UTC
    This is the likely culprit. Either set stack_depth() on the Mech object, or periodically destroy the object and create a new one.
      Thank you, this was indeed the problem.
Re^2: Gargantuan memory consumption issues
by p2409 (Initiate) on Dec 23, 2008 at 07:04 UTC
    +1 on that one. Mechanize is great, but a hog.
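
The two fixes runrig describes can be sketched as follows. This is an illustrative sketch of a long-running polling script, not code from the original poster; the URL and the recycle interval of 1000 requests are placeholders.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# Fix 1: disable the page history entirely.
# A stack_depth of 0 means "no history at all", so memory stays
# flat no matter how many pages the object fetches.
my $mech = WWW::Mechanize->new( stack_depth => 0 );

# Fix 2: periodically destroy the object and create a fresh one.
# Useful if you want to keep some history (e.g. for back())
# but still bound memory over a long run.
my $count = 0;
while (1) {
    $mech->get('http://example.com/status');   # placeholder URL
    # ... process $mech->content here ...

    if ( ++$count % 1000 == 0 ) {
        # Dropping the old object frees its accumulated history.
        $mech = WWW::Mechanize->new;
    }
    sleep 60;
}
```

You can also change the depth on an existing object with `$mech->stack_depth(0)`, which takes effect for subsequent requests.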
