PerlMonks
RE: to post, or not to post...

by merlyn (Sage)
on Oct 12, 2000 at 02:22 UTC ( [id://36351]=note: print w/replies, xml ) Need Help??



in reply to to post, or not to post...

Why not just use the "callback" parameter of request or simple_request, and grab the data as it comes back?

Seems like you've reinvented a pretty big wheel. {grin}

From perldoc LWP::UserAgent...

The subroutine variant requires a reference to a callback routine as the second argument to request(), and it can also take an optional chunk size as the third argument. This variant can be used to construct "pipe-lined" processing, where processing of received chunks can begin before the complete data has arrived. The callback function is called with 3 arguments: the data received this time, a reference to the response object and a reference to the protocol object. The response object returned from request() will have empty content. If the request fails, then the callback routine is not called, and the response->content might not be empty.

The request can be aborted by calling die() in the callback routine. The die message will be available as the "X-Died" special response header field.

The library also allows you to use a subroutine reference as content in the request object. This subroutine should return the content (possibly in pieces) when called. It should return an empty string when there is no more content.
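A minimal sketch of the callback variant described above. The URL and the 1 MB abort limit are made up for illustration; the shape of the call is request($request, \&callback, $chunk_size) as documented:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;

my $ua = LWP::UserAgent->new;

# Accumulate the body chunk by chunk as it arrives.
my $body     = '';
my $callback = sub {
    my ($chunk, $response, $protocol) = @_;
    $body .= $chunk;    # or write to a file, feed a parser, etc.

    # Aborting via die() sets the "X-Died" response header.
    die "too big\n" if length($body) > 1_000_000;
};

# Hypothetical URL; 4096 is the optional chunk-size hint.
my $request  = HTTP::Request->new( GET => 'http://www.example.com/' );
my $response = $ua->request( $request, $callback, 4096 );

print $response->is_success ? $body : $response->status_line, "\n";
```

Note that $response->content stays empty here; the data only ever reaches your callback, which is the point of the pipelined form.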

-- Randal L. Schwartz, Perl hacker
