For RSS feeds, there will be little or no cacheable content, so I'd see this approach as a lot of work for uncertain benefit.
in reply to Predictive HTTP caching in Perl
Something that will work is parallelizing the retrieval of the pages/feeds. Create an application, say with Parallel::ForkManager, that spawns multiple processes, each one fetching and processing a single site. Then assemble the results from all the children into your composite feed. The total time will be only a little longer than that of the slowest website/feed.
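A minimal sketch of that approach, using Parallel::ForkManager's run_on_finish hook to pass each child's result back to the parent (the URLs, the timeout, and the 10-process limit are illustrative assumptions, not part of the original post):

```perl
use strict;
use warnings;
use Parallel::ForkManager;
use LWP::UserAgent;

# Hypothetical list of feeds to aggregate.
my @urls = (
    'http://example.com/feed1.rss',
    'http://example.org/feed2.rss',
);

my %results;
my $pm = Parallel::ForkManager->new(10);    # up to 10 concurrent children

# Runs in the parent as each child exits; collects the child's data.
$pm->run_on_finish(sub {
    my ($pid, $exit_code, $ident, $signal, $core, $data) = @_;
    $results{$ident} = $$data if defined $data;
});

for my $url (@urls) {
    $pm->start($url) and next;              # parent continues the loop

    # Child process: fetch one feed and hand the body back to the parent.
    my $ua   = LWP::UserAgent->new(timeout => 30);
    my $res  = $ua->get($url);
    my $body = $res->is_success ? $res->decoded_content : undef;

    $pm->finish(0, \$body);                 # serialized back to run_on_finish
}
$pm->wait_all_children;

# %results now maps each URL to its feed content, ready to merge
# into the composite feed.
```

Because the children run concurrently, the wall-clock time is roughly that of the slowest fetch plus a small forking overhead, rather than the sum of all fetch times.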