scripts die when calling other program, but only certain circumstances
by xxqtony (Initiate)
on Oct 18, 2008 at 05:15 UTC
xxqtony has asked for the wisdom of the Perl Monks concerning the following question:
Hi, hopefully someone can help me out.
Short version: my script is a GUI wrapper that directly and indirectly calls other applications. It runs fine with small datasets, but dies when I test it with large datasets.
Long version: I'm using Perl/Tk to build a GUI. In the script, a subroutine A calls another Perl script, call_B.pl, which in turn calls a third-party application C. I have lines like this:
my $call = system("call_B.pl -a option1 -b option2 -c option3");
my $res = $path . "$result_file_generated_by_application_C";

These lines work with no problem when my datasets are small (about 1 GB); with small datasets it takes about 2 or 3 minutes for C to complete. However, the 'my $res' line complains that $result_file_generated_by_application_C is uninitialized when my test datasets are big (about 10 GB); with big datasets it usually takes 10-30 minutes for C to complete. In the failing case, I found that no result_file_generated_by_application_C file had been generated. In other words, it looks like my script keeps going forward even though the call to call_B.pl (and then to application C) has not finished.
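The behavior described above can be reproduced with a small standalone demonstration (this is illustrative code, not the poster's script): system() blocks only until its immediate child exits, so if that child backgrounds further work, system() returns while the work is still running.

```perl
use strict;
use warnings;
use Time::HiRes qw(time);

my $t0 = time;
# The shell child backgrounds the sleep and exits immediately,
# so system() returns long before the 2-second sleep finishes.
system(q{sh -c 'sleep 2 &'});
my $elapsed = time - $t0;
printf "system() returned after %.1fs\n", $elapsed;
```

If call_B.pl (or application C) detaches in a similar way, the wrapper script would continue past the system() call before the result file exists.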
I wonder if the problem is that my script calls B, which returns as soon as it is submitted, even while C is still running. However, if my guess is right, then it should have the same problem even when my datasets are small. Since no source code is available for application C, I think maybe I should incorporate the call_B.pl code into my script, which I hate to do because of legal issues. Does anyone know what might cause the problem when my datasets are big? I don't think it's a memory issue, because the non-GUI version runs with no problem.
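One way to make the wrapper robust either way is to check system()'s exit status and then wait for the expected output file instead of assuming it exists as soon as system() returns. A minimal sketch, with the command line and file names as placeholders standing in for the real ones:

```perl
use strict;
use warnings;

# Run a command, check its exit status, then poll until the expected
# output file exists and is non-empty (or a timeout expires).
sub run_and_wait {
    my ($cmd, $outfile, $timeout) = @_;

    my $status = system($cmd);
    die "could not run '$cmd': $!\n" if $status == -1;  # exec itself failed
    die "'$cmd' exited with code " . ($status >> 8) . "\n" if $status != 0;

    # If the command detaches (e.g. submits C and returns early),
    # wait for the result file rather than racing ahead.
    my $waited = 0;
    until (-e $outfile && -s _) {   # exists and is non-empty
        die "gave up waiting for $outfile\n" if $waited >= $timeout;
        sleep 1;
        $waited++;
    }
    return $outfile;
}

# Hypothetical usage, matching the command line in the question:
# my $res = run_and_wait("call_B.pl -a option1 -b option2 -c option3",
#                        "$path/$result_file_generated_by_application_C",
#                        3600);   # wait up to an hour for large datasets
```

The polling loop is crude but makes the small-vs-large dataset behavior irrelevant: the script only proceeds once the file C produces is actually there.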