http://www.perlmonks.org?node_id=1227965

kimya8 has asked for the wisdom of the Perl Monks concerning the following question:

Hello all, I am maintaining legacy Perl code where Template Toolkit is used to generate an XML file. We provide the template file, the variables, the template object, and the location of the output file (where I want the output to be dumped); this goes to Template Toolkit, which produces the output file. I believe the toolkit does the magic of parsing the variables and the template to create the output (correct me if I am wrong). It all works fine until the input data (the variables) grows too large, at which point Perl crashes ("Perl command line interpreter has stopped working"). Watching the process in Process Explorer, I have observed that Perl consumes around 1.3 GB of memory before crashing. I am providing the code below. If anything else is needed, kindly let me know, as this is my first post here.

# Get template object
my $tobj = $self->_getTemplateObject();

# Set template variables
my $vars = {
    Testrun     => $self,
    Environment => $environment,
    RunData     => $rundata,
    Filter      => $self->getFilterObject(),
};

# If file option was supplied, then output to a file
if ($file) {
    # _getTemplateFile - access the template file in correct format
    if ($tobj->process($self->_getTemplateFile(), $vars, $file)) {
        return;
    }
}
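For reference, this is the usual shape of a Template Toolkit `process()` call. A minimal self-contained sketch is below (the template text and variable names are made up for illustration); note that `process()` accepts scalar references for both the template source and the output, and that checking `$tt->error()` on failure is worth adding to the code above, since the posted snippet silently ignores a failed render:

```perl
use strict;
use warnings;
use Template;

# Construct the template engine; Template->error() explains a failed new().
my $tt = Template->new() or die Template->error();

# A tiny inline template and variable set (placeholders, not from the real code).
my $template = '<run name="[% name %]"/>';
my $vars     = { name => 'smoke-test' };

# process(input, vars, output): a scalar ref as input renders the text directly,
# a scalar ref as output captures the result in memory. Passing a filename as
# the third argument (as in the posted code) writes the result to that file.
my $output;
$tt->process(\$template, $vars, \$output)
    or die $tt->error();

print $output, "\n";   # <run name="smoke-test"/>
```

Here everything stays in memory because the output target is a scalar ref; with a large data set, passing a filename (or an open filehandle) as the third argument lets the toolkit write the result out instead of holding it all in a Perl scalar.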