PerlMonks
Come for the quick hacks, stay for the epiphanies.
Get 10,000 web pages fast, by Mad_Mac (Beadle)
on Jun 17, 2010 at 11:52 UTC ( [id://845186] )
Mad_Mac has asked for the wisdom of the Perl Monks concerning the following question:

I have a list of ~10,000 URLs, stored in a hash keyed by a user-friendly name, that I need to retrieve from a web server for local parsing and analysis. My code seems to have a memory leak, and eventually it cannot fork because it runs out of resources. Here's the relevant bit of my code:
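(The actual code did not survive in this copy of the post. As a stand-in, here is a minimal sketch of the usual fork-per-URL pattern, with the number of concurrent children capped and every child reaped, so the parent's memory stays bounded. The %urls contents, the child cap, and the output file names are assumptions for illustration, not the poster's real code.)

```perl
use strict;
use warnings;
use LWP::UserAgent;

my %urls = ( example => 'http://example.com/' );    # hypothetical stand-in data

my $max_children = 10;    # cap concurrent children so memory stays bounded
my %kids;

for my $name ( sort keys %urls ) {
    # Parent: wait for a free slot before forking another child
    while ( keys(%kids) >= $max_children ) {
        my $done = waitpid( -1, 0 );
        delete $kids{$done};
    }
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ( $pid == 0 ) {
        # Child: fetch one page, save it to disk, and exit immediately.
        my $ua  = LWP::UserAgent->new( timeout => 30 );
        my $res = $ua->get( $urls{$name} );
        if ( $res->is_success ) {
            open my $fh, '>', "$name.html" or die "open: $!";
            print {$fh} $res->decoded_content;
            close $fh;
        }
        exit 0;    # without this, the child would keep running the loop
    }
    $kids{$pid} = 1;    # parent records the outstanding child
}
1 while waitpid( -1, 0 ) > 0;    # reap any remaining children
```

Note that on Strawberry Perl, fork is emulated with interpreter threads rather than real processes, which is itself a common source of memory growth; capping the child count and reaping finished children matters even more there.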
I thought of trying LWP::Parallel, but it doesn't want to install on my system. If it matters, I am doing this with Strawberry Perl in a Win7 x32 VM on a Linux Mint x64 host. I'm not sure exactly which version of Perl: the MSI from Strawberry's site says 5.12.1, but perl -ver says 5.10.1. The host has 8 GB RAM, and I have allocated 4 GB to the Win7 VM. The Perl process starts out using ~600 MB and creeps up to ~2 GB before it crashes (sometimes sooner). So, my questions are:

Thanks
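Since LWP::Parallel won't install, a commonly suggested alternative (my suggestion, not something from the post) is Parallel::ForkManager, which packages the same cap-and-reap pattern behind a small API. A sketch, again with hypothetical data:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use Parallel::ForkManager;    # often installs cleanly where LWP::Parallel fails

my %urls = ( example => 'http://example.com/' );    # hypothetical stand-in data
my $pm   = Parallel::ForkManager->new(10);          # at most 10 children at once

for my $name ( keys %urls ) {
    $pm->start and next;    # parent gets the child's pid (true) and moves on

    # Child process from here down
    my $ua  = LWP::UserAgent->new( timeout => 30 );
    my $res = $ua->get( $urls{$name} );
    if ( $res->is_success ) {
        open my $fh, '>', "$name.html" or die "open: $!";
        print {$fh} $res->decoded_content;
        close $fh;
    }
    $pm->finish;    # child exits here, releasing its memory
}
$pm->wait_all_children;
```

The `$pm->start and next` idiom is what keeps the parent lightweight: the parent never loads a page into its own memory, so it should stay flat instead of creeping toward 2 GB.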
Back to Seekers of Perl Wisdom