Big data but need fast response time
by Anonymous Monk on May 21, 2012 at 13:36 UTC

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:
Dear Perl Monks,
I have a script which is fairly simple, but requires a huge amount of data from disk. I want to be able to run it from a web script, but the data load time is enormous (~30 seconds).
I see two solutions:
1. Use something like mmap to persist the data between calls to Perl. (I am not sure, but I think this may happen automatically thanks to the page cache; I am running Linux, btw.) I thought I *might* need a super-simple holder process that holds the data in memory (and does nothing more).
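A minimal sketch of option 1 using the File::Map module from CPAN. This assumes the data lives in a single flat tab-separated file; the file path, key format, and `lookup` helper are illustrative placeholders, not from the original post.

```perl
use strict;
use warnings;
use File::Map qw(map_file);

# mmap the data file instead of slurping it: pages are faulted in on
# demand and live in the OS page cache, so after the first (cold) run,
# later invocations of the script find the pages already warm and do
# not pay the full disk-read cost again.
sub lookup {
    my ($path, $key) = @_;
    map_file my $map, $path, '<';    # read-only map; no copy into the heap
    # Hypothetical record format: "key<TAB>value" per line.
    return $map =~ /^\Q$key\E\t([^\n]*)/m ? $1 : undef;
}
```

Note that nothing here keeps the pages resident forever; under memory pressure the kernel may evict them, and the next request pays the load cost again. That is the case where a trivial holder process that keeps the file mapped can help.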
2. Use a client-server scheme. I like this less because of possible issues like memory leaks. Ideally, it would be set up so that the "user" enters a line via telnet and the server writes back a one-line response. (Yes, I'll firewall the ports for safety and validate inputs.) I saw the Net::Server and Daemon modules on CPAN. Is either preferred?
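A minimal sketch of option 2 with Net::Server (CPAN): pay the ~30-second load once at startup, then answer one-line queries over a socket. The data file format, port number, and lookup logic are placeholder assumptions.

```perl
package DataServer;
use strict;
use warnings;
use base 'Net::Server';    # or Net::Server::PreFork for concurrent clients

my %data;                  # loaded once, before the accept loop starts

# Hypothetical loader: one "key<TAB>value" record per line.
sub load_data {
    my ($path) = @_;
    open my $fh, '<', $path or die "open $path: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($k, $v) = split /\t/, $line, 2;
        $data{$k} = $v;
    }
    close $fh;
}

sub lookup { my ($key) = @_; return $data{$key} }

# Net::Server calls this once per connection, with the client socket
# tied to STDIN/STDOUT: read a line, write a one-line answer.
sub process_request {
    my $self = shift;
    while (my $line = <STDIN>) {
        $line =~ s/[\r\n]+\z//;    # telnet sends CRLF
        my $v = lookup($line);
        print defined $v ? "$v\n" : "NOT FOUND\n";
    }
}

# In production (paths and port are placeholders):
#   DataServer::load_data('data.txt');   # the 30s cost, paid once
#   DataServer->run(port => 9000);       # then each query is fast
1;
```

With this shape the memory-leak worry is contained: the parent holds only the read-only data, and a preforking personality can recycle worker processes after N requests.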
Ideally, I would still like to run a fresh process on each request (less likely to accumulate memory leaks, etc.).
Any wisdom on which way to go would be appreciated,