http://www.perlmonks.org?node_id=1001078

Perlbeginner1 has asked for the wisdom of the Perl Monks concerning the following question:

G'day Monks,

I'm trying to get the script below to work.


I suspect my machine is a bit too slow, so I added Parallel::ForkManager. But if you do that, you need a blisteringly fast machine: it uses many megabytes of memory and keeps several cores busy. I'm running openSUSE 12.2 and Perl 5.14.2.

My machine is too slow. What if I want to run this against, say, 200 URLs (websites)? Do I need a 4-core processor?
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize::Firefox;
#use Parallel::ForkManager;

#my $fork = Parallel::ForkManager->new(25);

# sites
my @urls = qw(
    http://www.google.com
    http://www.yahoo.com
    http://www.cnn.com
    http://www.bing.com
    http://www.nbcnews.com/...
);    # ... and thousands of others

# temp base dir
my $temp = '/home/myspace/cgi-bin/';

for my $each (@urls) {
    #$fork->start and next;
    my $mech = WWW::Mechanize::Firefox->new(
        launch => 'firefox',
        create => 1,
    );
    $each =~ /www\.(\w+)\.com/;
    my $name = $1;
    print "creating $name.png\n";
    $mech->get($each);
    my $png = $mech->content_as_png(undef, undef, { width => 240, height => 240 });
    my $dir_name = "$temp/$name.png";
    open my $file, ">", $dir_name or die "couldn't create $dir_name: $!";
    binmode $file;
    print {$file} $png;
    close $file;
    # give it a little time to make sure things completed;
    # you'll need this more when using fork
    sleep 5;
    #$fork->finish;
}
print "Well All done!\n";
#$fork->wait_all_children;
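For reference, the worker-capping idea that Parallel::ForkManager provides can be sketched with core Perl alone (fork and waitpid), independent of the screenshot code. This is a minimal sketch, not the module's implementation: the URL list and the worker count of 4 are placeholders, and the per-URL work is just a print where the fetch-and-screenshot would go. Note the bottleneck for this kind of job is usually network and Firefox startup time, not CPU cores.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Placeholder inputs, not from the original post.
my @urls        = map { "http://www.example$_.com" } 1 .. 20;
my $max_workers = 4;    # cap on concurrent children

my $running = 0;        # children currently alive
my $reaped  = 0;        # children we have waited on

for my $url (@urls) {
    # At the cap: block until one child exits before forking another.
    if ($running >= $max_workers) {
        waitpid(-1, 0);
        $running--;
        $reaped++;
    }
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: do the per-URL work here (e.g. fetch + screenshot).
        print "worker $$ handling $url\n";
        exit 0;
    }
    $running++;    # parent keeps count
}

# Reap whatever is still running.
while (waitpid(-1, 0) > 0) {
    $reaped++;
}
print "all workers finished ($reaped children)\n";
```

With Parallel::ForkManager, the `waitpid` bookkeeping above is what `$fork->start`, `$fork->finish`, and `$fork->wait_all_children` handle for you; the `new(25)` argument in the original script plays the role of `$max_workers` here, and 25 concurrent Firefox instances is likely far too many for one machine.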