
Re: LWP::UserAgent & memory problems

by talexb (Canon)
on Oct 23, 2012 at 19:58 UTC ( #1000518=note )

in reply to LWP::UserAgent & memory problems

Could you just use curl to download the file instead? You don't always have to use Perl for stuff.
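If the rest of the pipeline should stay in Perl, one option is to let the script drive curl so the response streams straight to disk instead of living in LWP's memory. A minimal sketch, assuming curl is on the PATH; the `fetch_with_curl` name is invented for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hand the download to curl so the body streams to disk and the
# Perl process's memory stays flat regardless of file size.
sub fetch_with_curl {
    my ($url, $outfile) = @_;
    # -s: silent, -S: still show errors, -f: fail on HTTP >= 400,
    # -o: write the body to a file instead of stdout
    my @cmd = ('curl', '-sSf', '-o', $outfile, $url);
    system(@cmd) == 0
        or die "curl failed for $url (exit " . ($? >> 8) . ")\n";
    return $outfile;
}
```

Using the list form of system() avoids shell quoting problems with odd characters in URLs.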

Alex / talexb / Toronto

"Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds

Re^2: LWP::UserAgent & memory problems
by bulk88 (Priest) on Oct 23, 2012 at 23:40 UTC
    I agree. Perl is a glue language; there's nothing wrong with controlling a console app through its streams and letting it do the heavy lifting for your script, whether that's I/O, CPU, or memory. Some tasks are better done in a C/C++ program than in Perl, but there's nothing wrong with using Perl to control the C program. I've found that console HTTP downloaders always work better and faster for me than LWP unless very complicated forms or non-standard HTTP verbs are required.
Re^2: LWP::UserAgent & memory problems
by Uree (Acolyte) on Oct 24, 2012 at 09:22 UTC

    Oh, I probably should have mentioned that the files are later parsed with XML::LibXML, using a hybrid pull-parser / DOM-tree strategy.
    More concretely, I implemented a pull parser with XML::LibXML::Reader; then, for every node (these nodes are dramatically smaller than the whole XML DOM tree), I load it into memory and extract the data I'm interested in with XML::LibXML::XPathContext.
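    The hybrid strategy described above might look roughly like this; a sketch under assumptions, since the original parsing code isn't shown — the <item> element name, the titles_from_xml helper, and the ./title XPath are all invented for illustration:

```perl
use strict;
use warnings;
use XML::LibXML::Reader;    # also loads XML::LibXML / XPathContext

# Pull-parse the document, and for each small record materialise
# only that subtree as a DOM node so it can be queried with XPath.
sub titles_from_xml {
    my ($xml) = @_;
    my $reader = XML::LibXML::Reader->new(string => $xml);
    my @titles;
    while ($reader->read) {
        next unless $reader->nodeType == XML_READER_TYPE_ELEMENT
                 && $reader->name eq 'item';
        # Deep-copy just this <item> subtree -- tiny compared to
        # the full document tree -- then run XPath against it.
        my $node = $reader->copyCurrentNode(1);
        my $xpc  = XML::LibXML::XPathContext->new($node);
        push @titles, $xpc->findvalue('./title');
        $reader->next;    # skip past the subtree we already copied
    }
    return @titles;
}
```

    With this shape, peak memory tracks the size of one record rather than the whole file.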

    I apologize if omitting this turned out to be misleading.
    But the fact is, the part of the code in charge of the parsing does work well, and its memory usage matches what I expect.

    Now, the part that doesn't work as expected is the exact piece of code in the original post (which I isolated into this single script for testing purposes).
    The only code omitted there is an array containing the paths to the downloaded files, which the function returns, plus a few more URLs in the urls array.
