http://www.perlmonks.org?node_id=352274


in reply to Re: Re: Is (DOS-)Perl a Memory Hog?
in thread Is (DOS-)Perl a Memory Hog?

You've given me serious flashbacks to the days we spent hour upon hour tweaking memory loading arrangements and rebooting over and over until some critical DOS app could run... QEMM... MemMaker... *shudder*.

I'd suggest you stick with batch files as much as you can, as they will give you the maximum memory possible, then call your various Perl scripts at the appropriate points in the batch files. They should still be able to read/write your DOS apps' config files and input/output data files. I don't think you gain much by launching a DOS shell from Perl, since the host script can't do anything until the shell exits. If you need to maintain state, have your scripts write variables into your own data files to pass information to subsequent invocations in the batch chain (see the sketch below). There are lots of interesting batch tricks that might help, and if you are careful (and my memory serves), you can even rewrite the currently executing batch file, since DOS only reads and runs it line by line.
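A minimal sketch of the data-file approach, assuming a plain KEY=VALUE file (the state.txt filename and the keys are just for illustration): each Perl script in the batch chain reads the file at startup, updates whatever it needs, and writes it back for the next script to pick up. The two-argument open should keep it friendly to older DOS builds of Perl.

    # Load any state left by an earlier script in the batch chain.
    my %state;
    if (open(STATE, "<state.txt")) {
        while (my $line = <STATE>) {
            chomp $line;
            my ($key, $value) = split(/=/, $line, 2);
            $state{$key} = $value if defined $value;
        }
        close STATE;
    }

    # ... do this script's real work, updating %state as needed ...
    $state{last_step} = 'convert_input';   # example key, made up

    # Write the state back out for the next invocation.
    open(STATE, ">state.txt") or die "Can't write state.txt: $!";
    foreach my $key (sort keys %state) {
        print STATE "$key=$state{$key}\n";
    }
    close STATE;

Then your batch file just runs perl script1.pl, perl script2.pl, and so on, and the scripts share whatever they need through that one file.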

If you tell us more about your particular needs, some of us may be able to offer more specific help.

--
I'd like to be able to assign to an luser