
wickedxter
User
Oct 25, 2012, 1:15 PM
Post #2 of 6
Re: [dilbert] issues with Mozrepl running with WWW::Mechanize::Firefox
[In reply to]
This works for me and is fast. I also added Parallel::ForkManager; if you need it, uncomment the relevant lines. This used about 18 MB of memory and 40% CPU over 4 cores on Ubuntu 12.04 with Perl 5.14.2.
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize::Firefox;
#use Parallel::ForkManager;

#my $fork = Parallel::ForkManager->new(2);

# sites
my @urls = qw(
    http://www.google.com
    http://www.yahoo.com
    http://www.cnn.com
    http://www.bing.com
    http://www.nbcnews.com/
);

# temp base dir
my $temp = '/home/aaron/cgi-bin/';

for my $each (@urls) {
    #$fork->start and next;
    my $mech = WWW::Mechanize::Firefox->new( launch => 'firefox', create => 1 );

    $each =~ /www\.(\w+)\.com/;
    my $name = $1;
    print "creating $name.png\n";

    $mech->get($each);
    my $png = $mech->content_as_png( undef, undef, { width => 240, height => 240 } );

    my $dir_name = "$temp/$name.png";
    open my $file, ">", $dir_name or die "couldn't create $dir_name";
    binmode $file;
    print {$file} $png;
    close $file;

    # sleep a little to make sure everything completed -- you'll need this more when using fork
    sleep 5;
    #$fork->finish;
}

print "Well, all done!\n";
#$fork->wait_all_children;
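For reference, here is a minimal sketch of the same loop with the Parallel::ForkManager lines enabled, assuming 2 workers as in the commented-out code above (the URL list is shortened and the paths are the same placeholders); the per-URL screenshot code is unchanged.

#!/usr/bin/perl
# Sketch only: same screenshot loop, but with Parallel::ForkManager active.
use strict;
use warnings;
use WWW::Mechanize::Firefox;
use Parallel::ForkManager;

my $fork = Parallel::ForkManager->new(2);   # at most 2 child processes at once

my @urls = qw(http://www.google.com http://www.yahoo.com);
my $temp = '/home/aaron/cgi-bin/';          # same temp base dir as above

for my $each (@urls) {
    $fork->start and next;                  # parent forks a child and moves on to the next URL

    # each child opens its own Firefox tab through mozrepl
    my $mech = WWW::Mechanize::Firefox->new( launch => 'firefox', create => 1 );

    my ($name) = $each =~ /www\.(\w+)\.com/;
    $mech->get($each);
    my $png = $mech->content_as_png( undef, undef, { width => 240, height => 240 } );

    open my $file, ">", "$temp/$name.png" or die "couldn't create $temp/$name.png";
    binmode $file;
    print {$file} $png;
    close $file;

    sleep 5;                                # give the tab time to finish before the child exits
    $fork->finish;                          # child exits here
}

$fork->wait_all_children;                   # wait for all children before reporting done
print "Well, all done!\n";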
(This post was edited by wickedxter on Oct 25, 2012, 1:57 PM)