


smilebey
Novice

Oct 9, 2013, 4:11 AM


Views: 2754
Download a list of files from web... faster?

Hello Perl Community,

Currently I am trying to improve my code for downloading text files from a website. This is the code:

Code
foreach my $line (@file) {
    # Each line has the following structure: did, filename, blank
    # (blank is included because it captures the trailing newline)
    my ($did, $get_file, $blank) = split /,/, $line;
    $get_file = "http://www.arandomwebsite.com/Archives/" . $get_file;

    # character class fixed ([0-9-], not [0-9|-]) and the dot escaped
    if ( $get_file =~ /([0-9-]+)\.txt/ ) {
        my $filename = "$write_dir/$did.txt";
        open my $out, '>', $filename or die $!;
        print "file $did \n";

        my $response = $ua->get($get_file);
        if ( $response->is_success ) {
            print {$out} $response->content;
            close $out;
        } else {
            # error logging
            print LOG "error in $filename - $did \n";
        }
    }
}

My question is: is there any way to speed up the downloading process? The loop, the general structure, or even the request itself?
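One idea I was considering is fetching several files at once instead of one after another, since most of the time is spent waiting on the network. Below is a rough sketch using Parallel::ForkManager (assuming it is installed, and that @file, $write_dir, and the URL prefix are set up as in my code above). Would something like this be a sensible direction?

```perl
use strict;
use warnings;
use LWP::UserAgent;
use Parallel::ForkManager;

# Assumed setup, matching the code above:
my $write_dir = "downloads";
my @file      = read_list();              # hypothetical: returns "did,filename,\n" lines

my $pm = Parallel::ForkManager->new(10);  # up to 10 downloads in parallel

foreach my $line (@file) {
    my ($did, $get_file, $blank) = split /,/, $line;
    next unless $get_file =~ /([0-9-]+)\.txt/;

    $pm->start and next;                  # fork a child; parent moves on

    my $ua       = LWP::UserAgent->new;
    my $url      = "http://www.arandomwebsite.com/Archives/" . $get_file;
    my $response = $ua->get($url);

    if ( $response->is_success ) {
        open my $out, '>', "$write_dir/$did.txt" or die $!;
        print {$out} $response->content;
        close $out;
    } else {
        warn "error in $did: " . $response->status_line . "\n";
    }

    $pm->finish;                          # child exits here
}
$pm->wait_all_children;
```

Reusing one LWP::UserAgent with keep_alive => 1 inside each child might also help, since it avoids re-opening a TCP connection per file on the same host.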

I appreciate any comments and help. Thanks in advance.

smilebey





