Seems like this would be good for people skipping quickly
through many dynamic pages, but I wonder at what user volume
or click speed you would start to feel it.
This is also interesting to me because I've been plagued by a
browser closing before a long log file (the running output of
a C++ process on Unix) finished displaying. I thought I had
set up keep-alive correctly, as a header in the HTML page, but
the only answer I could find was a meta refresh, which reloads
the page periodically.
So what I have is a Perl program called as CGI, which does
a lot of processing and calls various C/C++ programs. It and
those processes write a detailed log file, and simultaneously
the CGI program writes a much terser description of what is
going on to the user's browser. That output is so sparse that
the browser would time out, and the user would think the C
process had died and try to launch it again from CGI. The meta
refresh causes its own problems, of course: what if the page
takes longer to load than the refresh period? (It disappears
while you are trying to read it, is what happens.)
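One workaround I've seen (not tested against your exact setup) is to have the CGI script itself emit a tiny heartbeat, say an HTML comment, and flush it every few seconds while the long-running work proceeds, so the browser never sees a silent connection. A minimal sketch in Python, with the real C/C++ work replaced by a hypothetical do_work_step():

```python
import sys
import time

def do_work_step(i):
    # stand-in for one chunk of the real long-running C/C++ work
    time.sleep(0.01)
    return i

def run_with_heartbeat(steps, out=sys.stdout, interval=5.0):
    # CGI-style header; no Content-Length, so the connection stays
    # open until the script exits and the server closes it
    out.write("Content-Type: text/html\r\n\r\n")
    out.flush()
    last_beat = time.monotonic()
    for i in range(steps):
        do_work_step(i)
        now = time.monotonic()
        if now - last_beat >= interval:
            # an HTML comment is invisible to the user but keeps
            # bytes flowing so the browser doesn't time out
            out.write("<!-- still working -->\n")
            out.flush()
            last_beat = now
    out.write("<p>done</p>\n")
    out.flush()
```

The flush after every write matters: if the web server or stdio buffers the output, the heartbeat never actually reaches the browser.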
I wonder if writing a Content-Length header with a huge
number would keep the connection open indefinitely (i.e.
the browser logo would keep spinning forever)?
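If you want to experiment with that, the raw response would look something like the sketch below. Whether a given browser actually spins forever, gives up after its own idle timeout, or truncates the page at the claimed length is exactly the open question; this just shows the shape of the trick:

```python
def oversized_response_header(content_type="text/html",
                              claimed_length=2**31 - 1):
    # Deliberately overstates the body size; a browser that trusts
    # Content-Length should keep waiting (indicator spinning) for
    # bytes that never come, instead of treating the page as done.
    return (
        "HTTP/1.1 200 OK\r\n"
        f"Content-Type: {content_type}\r\n"
        f"Content-Length: {claimed_length}\r\n"
        "Connection: keep-alive\r\n"
        "\r\n"
    )
```

Note that from a plain CGI script you normally emit only the Content-Type line and let the server add the rest, so lying about Content-Length may require an nph-style script or server cooperation.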