Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:
I do not have a lot of experience with Perl I/O, LWP, WWW::Curl, or POE::Component::Client::HTTP, but I am looking for a way to stream a file from one server to another.
I can accomplish it with curl, but I will be copying millions of files and am trying to avoid the overhead of all the extra forks. I will also be threading the whole thing using a boss/worker model.
Basically, I'm trying to duplicate the following entirely in Perl:
curl http://server1/path/file.dat | curl --upload-file - http://server2/upload/file.dat
The trick is the streaming part. I can get LWP and WWW::Curl::Easy to GET/PUT files from disk, but I really need to stream them. I will be moving some large files and can't afford to GET and then PUT each one; the disk I/O may also reduce my overall throughput. Holding the content in memory is out too. I really do need to stream... That's where I'm stuck.
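One way to sketch this (an assumption on my part, not a tested solution): LWP::UserAgent's `:content_cb` handler delivers the GET response body chunk by chunk, and each chunk can be forwarded immediately over a raw socket running a chunked PUT, so nothing is buffered on disk or held whole in memory. The host names and paths below are just the placeholders from the curl example; error handling is minimal.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use IO::Socket::INET;

# Frame one piece of data using HTTP/1.1 chunked transfer encoding:
# hex length, CRLF, data, CRLF.
sub frame_chunk {
    my ($data) = @_;
    return sprintf "%x\r\n%s\r\n", length($data), $data;
}

# Stream $src_url into a chunked PUT at http://$dst_host$dst_path.
sub stream_http {
    my ($src_url, $dst_host, $dst_path) = @_;

    # Open a raw connection to the destination and start the PUT
    # before the GET begins, so chunks can be forwarded as they arrive.
    my $sock = IO::Socket::INET->new(
        PeerAddr => $dst_host,
        PeerPort => 80,
        Proto    => 'tcp',
    ) or die "connect to $dst_host: $!";

    print $sock "PUT $dst_path HTTP/1.1\r\n",
                "Host: $dst_host\r\n",
                "Transfer-Encoding: chunked\r\n",
                "\r\n";

    my $ua  = LWP::UserAgent->new;
    my $res = $ua->get(
        $src_url,
        # :content_cb is called for each piece of the response body
        # instead of accumulating it in memory.
        ':content_cb' => sub { print $sock frame_chunk($_[0]) },
    );
    die "GET failed: ", $res->status_line unless $res->is_success;

    print $sock "0\r\n\r\n";    # terminate the chunked body
    my $status = <$sock>;       # destination server's status line
    close $sock;
    return $status;
}

# Example invocation (placeholder hosts):
# stream_http('http://server1/path/file.dat', 'server2', '/upload/file.dat');
```

This keeps only one chunk in memory at a time, at the cost of speaking raw HTTP to the destination; whether the upload server accepts chunked PUTs is another assumption to verify.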
I could probably patch something together with WWW::Curl::Multi, but while looking around I saw a suggestion to use POE::Component::Client::HTTP instead of WWW::Curl::Multi for someone else's similar task.
Regards,
-Alan
Replies are listed 'Best First'.
Re: Stream file from one HTTP server to another with HTTP GET and PUT requests
by thewebsi (Scribe) on Nov 16, 2012 at 06:00 UTC
Re: Stream file from one HTTP server to another with HTTP GET and PUT requests (can't use AnyEvent::HTTP)
by Anonymous Monk on Nov 16, 2012 at 11:18 UTC
by Anonymous Monk on Nov 16, 2012 at 11:25 UTC