perlquestion
slugger415
<p>Hello Monks, I have a CGI script that uses XML::Twig to parse &lt;row&gt; elements in an XML file. I've discovered through a lot of testing that I get a "500 Internal Server Error" in the browser (Firefox) if the number of rows is too big (more than 4689, to be exact). I know it's not a problem with row #4690 in that particular file, because I've tested different files and it crashes at the same number of rows every time.</p>
<p>I checked the HTTP server's error log and saw this:</p>
<code>Premature end of script headers: my-script.pl</code>
<p>I'm using both the CGI and XML::Twig Perl modules. Is it running out of memory?</p>
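<p>If memory is the suspect, one thing worth testing (this is a hedged sketch, not the poster's actual code) is whether purging helps: by default XML::Twig keeps every element it has parsed in memory, so a handler that never purges will grow with the file. The counter and the inline handler below are illustrative stand-ins for the real row_processing:</p>
<code>
use strict;
use warnings;
use XML::Twig;

# Sketch: purge inside the row handler so the twig does not
# accumulate the whole document in RAM.
my $count = 0;
my $twig = XML::Twig->new(
    twig_handlers => {
        row => sub {
            my ($t, $row) = @_;
            $count++;        # ...real per-row work would go here...
            $t->purge;       # free everything parsed so far
        },
    },
);

$twig->parse('<doc><row>a</row><row>b</row><row>c</row></doc>');
print "rows seen: $count\n";   # rows seen: 3
</code>
<p>If the script survives the big file with purging enabled but dies without it, memory was the problem.</p>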
<p>Is there a way to capture the error (other than what's in the server log) when Perl fails? The "500" error doesn't tell me much.</p>
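<p>One way to surface the real error, assuming the crash happens during parsing: print the header first, then wrap the risky call in eval so the failure reaches the browser instead of producing a bare 500 ("Premature end of script headers" usually means the script died before emitting any header). The parse_rows sub below is a hypothetical stand-in for the $twig-&gt;parsefile call:</p>
<code>
use strict;
use warnings;

# Stand-in for the real parse step, so the trap can be demonstrated:
sub parse_rows { die "simulated parse failure\n" }

# Emit the header up front -- once it is out, a later die can no
# longer cause "Premature end of script headers".
print "Content-type: text/html\n\n";

eval { parse_rows() };
if ($@) {
    print "<pre>Parse failed: $@</pre>\n";
}
</code>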
<p>Here's the little bit of code that parses the file, and prints HTML when done:</p>
<code>
use strict;
use warnings;
use CGI;
use CGI::Pretty;
use CGI::Carp qw( fatalsToBrowser );
use XML::Twig;

my $q = CGI->new;   # $outputDir, $TITLE and $stylesheet are set earlier in the script

# set up the parser:
my $twig = XML::Twig->new(
    twig_handlers => { row => \&row_processing },
);
$twig->parsefile("$outputDir/my-zos-shorter.xml");

print $q->header,
      $q->start_html( -title => $TITLE,
                      -style => { 'src' => $stylesheet } ),
      "<a name='top'></a>\n",
      $q->h2('Here are your results!'),
      $q->end_html;
</code>
<p>BTW, as a test I also had the row_processing handler print every row it finds to a file as it goes along, and it makes it all the way through to the last one; so the parse itself completes before the script barfs. I'm at a loss as to what's going on.</p>
<p>Any suggestions on how to troubleshoot this or about what the problem is would be most welcome.</p>
<p>thanks, Scott</p>