Oct 17, 2008, 9:53 PM
Post #1 of 4
Hi, hopefully someone can help me out.
Script dies when calling another program, but only under certain circumstances
Short version: my script is a GUI wrapper which directly/indirectly calls other applications. It runs fine with small datasets, but dies when I test it with large datasets.
I'm using Perl/Tk to build the GUI. In the script, a subroutine A calls another Perl script, call_B.pl, which in turn calls a third-party application C.
I have lines like this:
my $call = system("call_B.pl -a option1 -b option2 -c option3");
my $res = $path . "$result_file_generated_by_application_C";
The lines work fine when my datasets are small (1 GB); with small datasets it takes about 2-3 minutes for C to complete. However, when my test datasets are big (10 GB), the 'my $res' line complains that $result_file_generated_by_application_C is uninitialized; with big datasets it usually takes 10-30 minutes for C to complete. In the failing case, I found that no result_file_generated_by_application_C file was generated at all. In other words, it looks like my script keeps going forward even though the call to call_B.pl (and in turn to application C) has not finished.
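One thing I realize I'm not doing is checking whether call_B.pl actually succeeded. Here is a sketch of the error checking I could wrap around the system() call (the call_B.pl command line is from my script; the $? decoding is standard Perl):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# system() returns -1 if the child could not be started;
# otherwise $? holds the wait status of call_B.pl.
my $call = system("call_B.pl -a option1 -b option2 -c option3");

if ($call == -1) {
    die "Failed to launch call_B.pl: $!\n";
}
elsif ($? & 127) {
    # Child was killed by a signal (e.g. out of memory on the big dataset?)
    die sprintf "call_B.pl died on signal %d\n", $? & 127;
}
elsif ($? >> 8) {
    # Child exited with a non-zero status
    die sprintf "call_B.pl exited with status %d\n", $? >> 8;
}
```

At least that way I would see whether B is failing (or being killed) on the large datasets instead of silently continuing.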
I wonder if the problem is that my script's call to B returns while C is still running. However, if my guess were right, it should fail even when my datasets are small. Since no source code is available for application C, I'm thinking maybe I should fold the call_B.pl code into my own script, which I'd hate to do because of legal issues.
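If B really does return before C finishes, one workaround I'm considering is simply waiting for C's output file to appear before reading it. A rough sketch ($res is from my script above; the 30-minute limit and 10-second poll interval are guesses on my part):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# $res is the full path to the file application C is supposed to write.
my $res = "/path/to/expected_result_file";   # placeholder for my real $res

# Poll for the file, giving C up to 30 minutes to finish.
my $deadline = time + 30 * 60;
while (!-e $res && time < $deadline) {
    sleep 10;
}
die "Timed out waiting for $res\n" unless -e $res;

# Safe to read $res from here on.
```

Of course this only papers over the problem if the real issue is C crashing on big inputs, which is why I'd like to understand the cause first.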
Does anyone know what might cause the problem when my datasets are big?