XML::Twig size problem
by slugger415 (Beadle) on Jun 23, 2012 at 00:21 UTC

slugger415 has asked for the wisdom of the Perl Monks concerning the following question:
Hello Monks, I have a CGI script that uses XML::Twig to parse the <row> elements in an XML file. Through a lot of testing I've discovered that I get a "500 Internal Server Error" in the browser (Firefox) if the number of rows is too big (more than 4689, to be exact). I know it's not a problem with row #4690 itself, because I've tested different files and it crashes at the same number of rows.
I checked the HTTP server's error log and saw this:

    Premature end of script headers: my-script.pl
I'm using both the CGI and XML::Twig Perl modules. Is it running out of memory?
Is there a way to capture the error (other than what's in the server log) when Perl fails? The "500" error doesn't tell me much.
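A "Premature end of script headers" error usually means the script died (or was killed) before it printed the HTTP header. One way to see the actual error message in the browser is to print the header first and wrap the real work in an eval. A minimal sketch (do_parse is a hypothetical stand-in for the actual parsing code, not from the original script):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Print the header *first*, so a later die() can no longer
# produce "Premature end of script headers".
print "Content-type: text/html\r\n\r\n";

# Hypothetical stand-in for the real XML::Twig parsing work.
sub do_parse { die "simulated parse failure\n" }

# eval traps the die; $ok is true only if do_parse completed.
my $ok  = eval { do_parse(); 1 };
my $err = $ok ? '' : ($@ || 'unknown error');

# Show the trapped error in the page instead of a bare 500.
print "<pre>Script error: $err</pre>\n" unless $ok;
```

(The CGI::Carp module's fatalsToBrowser import does much the same thing with less ceremony, though neither approach helps if the process is killed outright, e.g. by the OS or server for exceeding a memory limit.)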
Here's the little bit of code that parses the file, and prints HTML when done:
BTW, as a test I also had the row_processing subroutine/handler print every row it finds into a file as it goes along, and it makes it all the way through to the last one, so it's actually parsing everything before it barfs. I am clueless as to what's going on.
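One likely culprit, given that every row is handled before the crash: by default XML::Twig keeps every element it has parsed in memory, so the tree grows with each <row> even though the handlers fire as the file streams, and the process can be killed once it exceeds the server's memory limit. If that's the problem, calling purge (or flush) inside the handler releases the already-processed part of the tree. A sketch, assuming XML::Twig is installed (the element names and the summary logic are made up for illustration, not taken from the original script):

```perl
use strict;
use warnings;
use XML::Twig;

my @summaries;

my $twig = XML::Twig->new(
    twig_handlers => {
        row => sub {
            my ($t, $row) = @_;
            # Keep only the data you actually need from each row...
            push @summaries, $row->first_child_text;
            # ...then free everything parsed so far. Without this,
            # memory grows with every <row> until the process dies.
            $t->purge;
        },
    },
);

# Parse a synthetic 10,000-row document to demonstrate.
$twig->parse(
    '<table>'
    . join('', map { "<row><c>$_</c></row>" } 1 .. 10_000)
    . '</table>'
);

print scalar(@summaries), " rows processed\n";   # prints "10000 rows processed"
```

If the handler needs to copy finished output somewhere (e.g. print transformed XML), flush does the same memory release while also printing the tree parsed so far.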
Any suggestions on how to troubleshoot this or about what the problem is would be most welcome.