Re: (tye)Re: Get your script warnings on the page it generates

by BooK (Curate)
on Jan 18, 2001 at 05:42 UTC ( [id://52666] )


in reply to (tye)Re: Get your script warnings on the page it generates
in thread Get your script warnings on the page it generates

You are absolutely right, it would! (and I didn't think of it... I just wanted to avoid checking that log file)

I guess this is why this kind of trick should be used while coding the script, and not in production...

But I wouldn't want to show such intimate stuff as warnings to the people who surf my site...
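For development use only, the whole thing could be hidden behind an environment flag so visitors never see it. A minimal sketch (assuming a hypothetical DEVEL_WARNINGS variable that is set only on the development machine, and collecting the warnings with a __WARN__ handler rather than playing with STDERR):

my @warnings;

BEGIN {
    # Capture warnings only when the (hypothetical) flag is set.
    $SIG{__WARN__} = sub { push @warnings, @_ } if $ENV{DEVEL_WARNINGS};
}

END {
    # On the development machine the warnings end up on the page;
    # on the live site nothing extra is printed.
    if ( $ENV{DEVEL_WARNINGS} and @warnings ) {
        print '<hr><b>Script warnings</b>',
              '<small><pre>', @warnings, '</pre></small>';
    }
}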

Update 2: When Apache has waited too long for a script that still hasn't produced any output, it seems to kill the subprocess... Then we enter the END block, and the warnings are printed (I guess).
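If that timeout really arrives as a TERM signal (I haven't verified that it does), trapping it would make the "then we enter the END block" part deterministic; a one-line sketch, under that assumption:

# Assumption: Apache's timeout shows up as a TERM signal. Turning it into a
# normal exit() guarantees the END block (and the warnings dump) gets to run.
$SIG{TERM} = sub { exit 1 };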

Update: If you are really concerned about this, you can always fork a child and use a pipe to pass it the warnings... The child will store them in an array or a string, and pass them back when your end of the pipe closes... Much heavier, but feasible.

BEGIN {
    pipe(READ, WRITE);
    *STDERR = *WRITE;               # warnings now go down the pipe
    if ( !open(DEBUG, "-|") ) {     # child: its STDOUT is the parent's DEBUG handle
        close WRITE;                # keep only the read end open here
        my @w = <READ>;             # collect warnings until the parent closes WRITE
        print @w;                   # send them back through DEBUG
        exit;
    }
}

END {
    close WRITE;                    # gives the child EOF on READ
    print '<hr><b>Script warnings</b>',
          '<small><pre>', <DEBUG>, '</pre></small>';
}
(This is untested)

Question: is the END block executed by the child process? If so, where does the probable warning about the closed DEBUG handle go?
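Whatever the answer, remembering the parent's PID and having the END block bail out in the child would make the intent explicit: only the parent prints the report. A minimal sketch of that guard (just as untested as the code above):

my $parent_pid;

BEGIN {
    $parent_pid = $$;               # remember who the parent is, before forking
    pipe(READ, WRITE);
    *STDERR = *WRITE;
    if ( !open(DEBUG, "-|") ) {     # child
        close WRITE;
        print <READ>;
        exit;
    }
}

END {
    # Only the parent prints the report; a child reaching this block skips it,
    # so it never touches the DEBUG handle it didn't open.
    if ( $$ == $parent_pid ) {
        close WRITE;
        print '<hr><b>Script warnings</b>',
              '<small><pre>', <DEBUG>, '</pre></small>';
    }
}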
