
jeffersno1
Novice
Jul 23, 2012, 8:39 PM
Post #1 of 5
read files into array and collect from yesterday
Morning all. I'm starting to pull my hair out over a script that logs onto a production server, looks for yesterday's files, and copies them back into one file on the local machine. Sounds easy, huh? I want to store the file locally and then run a Perl script to obtain stats from it. I can do all the stats part; I just can't gather the file(s). Most of the time there is only one file, but when systems crash there are multiple. Here is my attempt:
#!/usr/bin/perl
use strict;
use warnings;
use POSIX 'strftime';

my $remo_main_dir = '/export/home/user';
my $loc_main_dir  = '/home/system_prod';
my $logs_dir      = "$remo_main_dir/prod/app_logs";

# Yesterday's date; note that time - 86400 can be off by an hour
# around DST changes if run close to midnight.
my $date = strftime("%Y%m%d", localtime(time - 86400));
print "$date\n";

# $logs_dir already starts with a slash, so don't prepend another one --
# the stray slash was what produced the // that s/\/\//\//g had to clean up.
my @results1 = command_to_array(
    "/usr/bin/ssh user\@prod1 '/bin/ls $logs_dir/event_log_$date??????.file'");

foreach (@results1) {
    print "$_\n";
}

# Run a command and return its output, one chomped line per element.
sub command_to_array {    ## given by a friend :)
    my $command = shift;
    my @outdata;
    open(my $cmd_fh, '-|', $command)
        or die "FAILED to run command '$command': $!";
    while (my $line = <$cmd_fh>) {
        chomp $line;
        push @outdata, $line;
    }
    close $cmd_fh;
    return @outdata;
}

The print works well and I can see yesterday's files, but I can't copy them to my local machine. Is there an easier way to do this? I think I'm making it much more complicated than it needs to be. Thanks in advance, Jeffers
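[Editor's note] One way to finish the copy step without fetching each file separately: since the goal is a single local file anyway, let the remote side concatenate the matches and redirect the stream locally. A minimal sketch, assuming the same user@prod1 host and working SSH keys (which the ls call above already relies on); the combined output filename is illustrative, not from the post:

#!/usr/bin/perl
use strict;
use warnings;
use POSIX 'strftime';

my $logs_dir  = '/export/home/user/prod/app_logs';
my $local_dir = '/home/system_prod';
my $date      = strftime("%Y%m%d", localtime(time - 86400));

# The remote shell expands the glob, cat concatenates every match
# (one file normally, several after a crash), and the local shell
# redirects the combined stream into one file.
my $remote_glob = "$logs_dir/event_log_${date}??????.file";
my $local_file  = "$local_dir/event_log_$date.combined";   # hypothetical name

system("/usr/bin/ssh user\@prod1 'cat $remote_glob' > $local_file") == 0
    or die "ssh/cat failed: $?";

If the per-file split matters, scp with the same quoted remote glob would fetch the files individually into $local_dir instead; the cat approach just saves the separate copy-then-merge step.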