So here's the problem: I have a CGI application for creating web
pages which currently stores a representation of the links and
data of a web site in a Perl
hash-of-hashes structure that I write to disk with Data::Dumper
(see http://chicodigital.com/webtool.html). This is fine for
a dozen medium-sized web pages, but at some point, loading the data
file for each CGI execution is going to make this app slow. Using
mod_perl would solve this (right?), but my target users are either
on shared servers (and thus don't have mod_perl) or don't want to
deal with setting up a database. I also know I can use Storable.pm
as a faster file storage solution, but that still writes to disk.
So I started thinking about how it might be possible to retrieve
persistent data through a Unix socket.
Here's the idea: Multiple CGI processes all talk to a persistent
process that just hands them the Perl hash (or accepts an updated
hash) through a Unix socket. The process would die after a timeout
period. It would be started by the initial CGI request, which would
check if the process existed and create one if it didn't. Another
way to think of this idea: it's just a home-made database connection,
but the data is cached in memory (it doesn't read from disk on every
request, only on the first one, or after a modification).
Is this possible? (I think it is). Is this just a bad idea?
Comments? I'm not avoiding MySQL, but I am catering to users who
either don't have access to a database or don't want to set one up.