We found that a CGI script which calls 'warn' too many times hangs the Perl process running under Apache (as a separate process). When we kill the process, the Apache error log contains the first 648 calls to warn, but no more. This does not happen from the command line, nor when running in the Eclipse CGI debugger (which doesn't use Apache).
The same thing happens if we merely print to STDOUT 1000 times; the script hangs on call 649 (judging by the output that reaches the browser). At 100 bytes per call, 648 calls * 100 bytes = 64,800 bytes, so one more 100-byte write should still be under 64 * 1024 = 65,536 bytes.
Any ideas please?
Apache on Windows (httpd-2.2.22-win32-x86-openssl-0.9.8t)
Perl v5.18.2 ( MSWin32-x86-multi-thread-64int - ActiveState)
Apache is running locally, writing to C:
You can try this with:
use strict;
use warnings;

# Make both STDERR and STDOUT unbuffered.
select( STDERR );
$| = 1;
select( STDOUT );
$| = 1;

# Write to STDERR a lot of times...
for ( my $lineNumber = 1; $lineNumber <= 1000; $lineNumber++ ) {
    my $errMsg = substr( "Line $lineNumber....." x 10, 0, 99 ) . "\n";
    print STDERR $errMsg;   # blocks on call 649; 648 writes succeed
                            # (648 * 100 = 64,800 bytes; 64 * 1024 = 65,536)
    #warn( $errMsg );       # same: blocks on call 649, 648 succeed
}
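The symptom (writes succeed up to roughly 64 KB, then the writer hangs) is what a filled pipe looks like when nothing on the other end is reading it. As a general illustration only (this is Python on a POSIX system, not the Apache-on-Windows CGI setup above, so the exact capacity may differ), the following sketch counts how many 100-byte writes fit into an unread pipe before the writer would block:

```python
import os

# Create a pipe and deliberately never read from the read end,
# mimicking a parent process that is not draining the child's stderr.
r, w = os.pipe()
os.set_blocking(w, False)  # non-blocking, so we can count instead of hanging

total = 0
chunk = b"x" * 100  # same 100-byte writes as the CGI script
try:
    while True:
        total += os.write(w, chunk)
except BlockingIOError:
    # The next write would block: the pipe buffer is full.
    pass

print(total)  # total bytes accepted before the writer would block
os.close(r)
os.close(w)
```

On a typical Linux system the default pipe capacity is 65,536 bytes, so the last whole 100-byte write that fits lands just under that limit, much like the 648 * 100 = 64,800 bytes observed before the hang. Whether Apache's Windows mod_cgi has the same buffer size, or simply stops reading the script's stderr pipe mid-request, is exactly the open question here.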
Any help is appreciated.