in reply to Predictive HTTP caching in Perl
For RSS feeds, there will be little or no content to cache, so I'd see this approach as a lot of work for an uncertain benefit.
Something that will work is parallelizing the retrieval of the pages/feeds. Create an application, say with Parallel::ForkManager, that forks multiple processes, each one fetching one site and processing it. Then assemble the results from all the children into your composite feed. The total time will be only a little longer than that of the slowest website/feed.
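A minimal sketch of that approach, assuming Parallel::ForkManager and LWP::UserAgent are installed. The feed URLs and the parse_feed() helper are hypothetical placeholders for your own list and parsing code:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;
use LWP::UserAgent;

my @urls = (
    'http://example.com/feed1.rss',   # hypothetical feed URLs
    'http://example.com/feed2.rss',
);

my @items;
my $pm = Parallel::ForkManager->new(10);   # up to 10 concurrent children

# Collect each child's result as it exits; the sixth callback
# argument is the data reference passed to finish().
$pm->run_on_finish(sub {
    my (undef, undef, undef, undef, undef, $data) = @_;
    push @items, @$data if $data;
});

for my $url (@urls) {
    $pm->start and next;               # parent continues the loop
    my $resp = LWP::UserAgent->new(timeout => 30)->get($url);
    my @fetched = $resp->is_success
        ? parse_feed($resp->decoded_content)   # hypothetical parser
        : ();
    $pm->finish(0, \@fetched);         # ship the items back to the parent
}
$pm->wait_all_children;

# @items now holds entries from every feed, ready to sort and
# merge into the composite feed.
```

Since the children run concurrently, the wall-clock time is roughly that of the slowest fetch rather than the sum of all of them.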
-Mark
Replies are listed 'Best First'.
Re^2: Predictive HTTP caching in Perl
by ryantate (Friar) on May 03, 2006 at 05:57 UTC
In Section: Seekers of Perl Wisdom