I would definitely endorse this type of setup for a major project. There's little sense in building your own proprietary caching mechanism when HTTP already has one built into it.
Build your application so that it generates pages dynamically and uses HTTP headers (Cache-Control, Expires, and friends) to declare how long each resource may be cached, if at all. Then funnel all inbound traffic through a fast caching HTTP proxy server in front of it.
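As a minimal sketch of the application side of this, here is a tiny WSGI app that builds its body dynamically but lets a `Cache-Control` header tell whatever caching proxy sits in front of it (Varnish, nginx, Squid, etc.) how long the response may be reused. The app and the 300-second lifetime are illustrative, not from the original answer:

```python
def app(environ, start_response):
    """Generate the page dynamically; let HTTP headers drive caching."""
    body = b"<html><body>hello</body></html>"
    start_response("200 OK", [
        ("Content-Type", "text/html; charset=utf-8"),
        # Any compliant shared cache (the proxy) may store this
        # response and serve it for up to 5 minutes.
        ("Cache-Control", "public, max-age=300"),
    ])
    return [body]

# Quick in-process check, no real server needed:
captured = {}
def fake_start_response(status, headers):
    captured["status"] = status
    captured["headers"] = dict(headers)

result = b"".join(app({}, fake_start_response))
```

The point is that the app itself stays cache-agnostic: it just states a policy in the header, and the proxy does the actual caching.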
I have also seen approaches where the document root of the web server is actually an on-disk cache. A missing file triggers the 404 handler, which invokes the CGI or back-end process (perhaps a "real" web server, possibly behind another firewall, that generates the content) to build the page, optionally writing it into the document root so the web server can serve future requests for it straight from disk.
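That 404-handler pattern can be sketched roughly as follows. Everything here is hypothetical: `DOCROOT` stands in for the web server's document root, and `generate_page` stands in for whatever back-end really builds the content.

```python
import os
import tempfile

# Stand-in for the web server's document root (illustrative only).
DOCROOT = tempfile.mkdtemp()

def generate_page(path):
    """Stand-in for the back-end process that really builds the content."""
    return f"<html><body>content for {path}</body></html>"

def handle_404(path):
    """Invoked by the web server when the requested file is missing.

    Generates the page and writes it into the document root, so the
    next request for the same path is served directly from disk
    without touching the back-end at all.
    """
    cached = os.path.join(DOCROOT, path.lstrip("/")) + ".html"
    os.makedirs(os.path.dirname(cached), exist_ok=True)
    content = generate_page(path)
    with open(cached, "w") as f:
        f.write(content)
    return content

first = handle_404("/articles/42")
```

The nice property is that cache invalidation is just deleting files from the document root; the next hit regenerates them.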