
pnovice
New User
May 19, 2011, 2:48 PM
Post #5 of 6
Thanks Miller, I have verified that your method is about 5X faster. I had to eliminate several I/O operations to measure just this piece of code. Thanks. Here is my machine info:

OS: Linux 2.6.9-89.ELsmp x86_64
Model: Dual-Core AMD Opteron(tm) Processor 8220 SE
CPU: 8 @ 2813.195MHz
Memory: RAM 31GB, Swap 78GB, /tmp 175GB

The script will go out to several other users and it's hard to predict what they will run on, so I had to put the best code out. Can you help me see whether there is a speedup possibility in these other pieces of code? They all work correctly.

Section 1

    # reading input data (could be well over a couple of GB); memory is not an issue
    print "\tReading input file $fil ...\n" ;
    $cnt = 0 ;
    while (<INP>) {
        if ($_ =~ /(\S+)\s+(\S+)/) {
            $ref = sprintf("%10.6f", $1) ;
            $val = sprintf("%10.6f", $2) ;
            $err = $val - $ref ;
            $err = sprintf("%10.6f", $err) ;
            if ($ref != 0) {
                $pct = ($err/$ref)*100.0 ;
                $pct = sprintf("%.2f", $pct) ;
            }
            else {
                $pct = 0 ;
            }
            $data_array[$cnt++] = "$ref $val $err $pct" ;
        }
    }
    close(INP) ;

Section 2

    # cumulative plot; huge files being written out
    print "\tWriting data file for cumulative plot...\n" ;
    @sorted_p = sort { $a <=> $b } @p ;
    $i = 0 ;
    open (FIL, ">${o}data_cum") ;
    foreach (@sorted_p) { $i++ ; print FIL "$_ $i\n" ; }
    close(FIL) ;

Section 3

    # write a part of an array into a file
    print "\tWriting data file for scatter plot...\n" ;
    open (FIL, ">${o}data") ;
    for ($i=$remove; $i<=$#sorted_data_array; $i++) {
        if ($sorted_data_array[$i] =~ /(\S+)$/) {
            push(@p, $1) ;
            print FIL "$sorted_data_array[$i]\n" ;
        }
    }
    close(FIL) ;
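For Section 1, one direction I have been wondering about, but have not benchmarked, is to skip the intermediate sprintf calls while reading and only format the values when they are finally written out, storing array references instead of joined strings. Below is a self-contained sketch; $fil and the file name are placeholders, and note that deferring the rounding changes the $ref != 0 test slightly for values that would round to zero at six decimals.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # $fil and the input file name are placeholders for this sketch.
    my $fil = "input.dat";
    open my $inp, '<', $fil or die "Cannot open $fil: $!";

    print "\tReading input file $fil ...\n";

    my @data_array;
    while (<$inp>) {
        # split on whitespace is usually cheaper than a capturing regex here
        my ($ref, $val) = split ' ', $_;
        next unless defined $val;    # skip lines without two fields

        my $err = $val - $ref;
        my $pct = $ref != 0 ? ($err / $ref) * 100.0 : 0;

        # store raw numbers; defer sprintf("%10.6f") / sprintf("%.2f")
        # to the code that finally writes these values out
        push @data_array, [ $ref, $val, $err, $pct ];
    }
    close $inp;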
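For Sections 2 and 3 (if I have the order right, Section 3 fills @p before Section 2 uses it), the only idea I have so far is to cut per-line work: grab the last column with split instead of the anchored regex, and write the cumulative file with one big print since memory is not a problem here. This is just an untested sketch with stand-in values for $o, $remove, and @sorted_data_array; I have not measured whether it buys much.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Stand-ins for the real variables in the script.
    my $o      = "out_";
    my $remove = 0;
    my @sorted_data_array = ("1.000000 2.000000 1.000000 100.00",
                             "3.000000 3.500000 0.500000 16.67");

    # Section 3: one pass that writes the scatter file and collects the
    # last column; (split)[-1] replaces the /(\S+)$/ match.
    open my $scatter, '>', "${o}data" or die "Cannot open ${o}data: $!";
    my @p;
    for my $i ($remove .. $#sorted_data_array) {
        my $line = $sorted_data_array[$i];
        push @p, (split ' ', $line)[-1];
        print {$scatter} "$line\n";
    }
    close $scatter;

    # Section 2: build the whole cumulative file in memory and write it
    # with a single print to reduce per-line I/O calls.
    my @sorted_p = sort { $a <=> $b } @p;
    open my $cum, '>', "${o}data_cum" or die "Cannot open ${o}data_cum: $!";
    my $i = 0;
    print {$cum} join '', map { ++$i; "$_ $i\n" } @sorted_p;
    close $cum;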