Re: LWP::UserAgent & memory problems

by talexb (Chancellor)
on Oct 23, 2012 at 19:58 UTC


in reply to LWP::UserAgent & memory problems

Could you just use curl to download the file instead? You don't always have to use Perl for stuff.
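As a rough sketch of what that could look like (the URL and output filename are placeholders, and curl is assumed to be on the PATH):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Placeholder URL and output file for illustration only.
    my $url  = 'http://www.example.com/big-file.xml';
    my $file = 'big-file.xml';

    # Let curl stream the response straight to disk, so the Perl process
    # never holds the whole document in memory.
    system( 'curl', '--silent', '--show-error', '--fail', '-o', $file, $url ) == 0
        or die "curl failed: exit status $?";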

Alex / talexb / Toronto

"Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds

Replies are listed 'Best First'.
Re^2: LWP::UserAgent & memory problems
by bulk88 (Priest) on Oct 23, 2012 at 23:40 UTC
    I agree. Perl is a glue language; there's nothing wrong with controlling a console app through its console streams and letting it do the heavy lifting for your script, whether that's I/O, CPU, or memory. Some tasks are better done in a C/C++ based program than in Perl, but there's nothing wrong with using Perl to control the C program. I've found that console HTTP downloaders always work better and faster for me than LWP, unless very complicated forms or non-standard HTTP verbs are required.
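    A minimal sketch of that glue pattern, with a placeholder URL (curl is again assumed to be on the PATH): open the downloader as a pipe and consume its stdout as a stream, so Perl never buffers the whole response.

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Placeholder URL for illustration only.
        my $url = 'http://www.example.com/big-file.xml';

        # Run curl as a child process and read its stdout line by line.
        open my $curl, '-|', 'curl', '--silent', '--fail', $url
            or die "Cannot run curl: $!";

        while ( my $line = <$curl> ) {
            # Handle each chunk as it arrives; only the current line is in memory.
            print length($line), " bytes\n";
        }

        close $curl or die "curl exited with status $?";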
Re^2: LWP::UserAgent & memory problems
by Uree (Acolyte) on Oct 24, 2012 at 09:22 UTC

    Oh, I probably should have mentioned that the files are later parsed with XML::LibXML, using a hybrid pull-parser / DOM-tree strategy.
    More concretely, I implemented a pull parser with XML::LibXML::Reader, and then for every node of interest (these nodes are dramatically smaller than the whole XML DOM tree) I load just that node into memory and extract the data I'm interested in with XML::LibXML::XPathContext.
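    A rough sketch of that hybrid approach, assuming a file named feed.xml containing repeated <record> elements, each with a <name> child (the file and element names are made up for illustration):

        #!/usr/bin/perl
        use strict;
        use warnings;
        use XML::LibXML::Reader;
        use XML::LibXML::XPathContext;

        my $reader = XML::LibXML::Reader->new( location => 'feed.xml' )
            or die "Cannot open feed.xml";

        # Pull-parse: jump from one <record> element to the next without
        # ever building a DOM tree for the whole document.
        while ( $reader->nextElement('record') ) {

            # Load only this small subtree into memory as a DOM fragment.
            my $node = $reader->copyCurrentNode(1);    # 1 = deep copy

            # Query the fragment with XPath.
            my $xpc  = XML::LibXML::XPathContext->new($node);
            my $name = $xpc->findvalue('./name');
            print "$name\n";

            # Skip the rest of this subtree before looking for the next record.
            $reader->next;
        }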

    I apologize if omitting this turned out to be misleading.
    But the fact is that the part of the code in charge of the parsing works well, and its memory usage is in line with what I expect.

    Now, the part that doesn't work as expected is the specific piece of code in the original post (which I isolated into this single script for testing purposes).
    The only code omitted there is an array containing the paths to the downloaded files, which is returned by the function, plus a few more URLs in the @urls array.
