http://www.perlmonks.org?node_id=186147


in reply to Perl/CGI Performance for a Shopping Cart

So I was wondering...

You ask a number of questions to which the proper answer is some variation on "it depends". For high volumes of traffic, you don't want to have to start up a new process external to the web server for each hit. That rules out plain CGI, though mod_perl is certainly a candidate. Or, you could arrange to run a separate shopping cart process that a thin web front end (ASP, PHP, mod_perl, whatever) could delegate to via a socket. I've seen setups like that handle high request rates, though they're more complicated to construct.
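To make the socket-delegation idea concrete, here's a minimal sketch. The port, the one-line protocol, and the stand-in daemon are all invented for illustration; a real cart daemon would hold the catalogue and session state, and the web-facing script would do nothing but connect, send, and relay the answer.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::INET;

# Stand-in for the long-lived cart daemon: answers one request
# line per connection. (A real one would price the cart, etc.)
sub run_demo_daemon {
    my ($port) = @_;
    my $server = IO::Socket::INET->new(
        LocalPort => $port, Listen => 5, Reuse => 1,
    ) or die "listen: $!";
    while (my $client = $server->accept) {
        chomp(my $req = <$client>);
        print $client "CART OK: $req\n";
        close $client;
    }
}

# What the thin web-facing script does per hit: one connect, one
# request line, one reply line -- no interpreter startup per hit.
sub ask_cart {
    my ($port, $request) = @_;
    my $sock = IO::Socket::INET->new(
        PeerAddr => 'localhost', PeerPort => $port,
    ) or die "connect: $!";
    print $sock "$request\n";
    chomp(my $reply = <$sock>);
    close $sock;
    return $reply;
}

# Demo: fork the daemon, then query it as the front end would.
my $port = 5000;                 # assumed port for the demo
my $pid  = fork;
die "fork: $!" unless defined $pid;
if ($pid == 0) { run_demo_daemon($port); exit }
sleep 1;                         # crude wait for the listener
print ask_cart($port, "ADD sku=42 qty=1"), "\n";
kill 'KILL', $pid;
```

The win is that the daemon pays its startup and data-loading costs once, while each hit costs only a socket round trip.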

If your database is 500K, you could benefit from either keeping it in RAM or loading it into MySQL. (Loading the data into MySQL is a simple exercise.)
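For the exercise, one way to do it with DBI might look like the sketch below. The DSN, the `products` table, and the three-column CSV layout are all assumptions; set `CART_DSN` in the environment to actually run the load against a server.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Naive splitter -- fine for unquoted fields; use Text::CSV if
# your data has quoted commas.
sub split_row { return split /,/, $_[0], -1 }

sub load_csv {
    my ($dsn, $user, $pass, $csv) = @_;
    require DBI;
    my $dbh = DBI->connect($dsn, $user, $pass, { RaiseError => 1 });
    $dbh->do(q{
        CREATE TABLE IF NOT EXISTS products (
            sku   VARCHAR(32) PRIMARY KEY,
            name  VARCHAR(255),
            price DECIMAL(8,2)
        )
    });
    my $sth = $dbh->prepare(
        'INSERT INTO products (sku, name, price) VALUES (?, ?, ?)');
    open my $fh, '<', $csv or die "$csv: $!";
    while (my $line = <$fh>) {
        chomp $line;
        $sth->execute(split_row($line));
    }
    close $fh;
    $dbh->disconnect;
}

# Only touch a real server when told to.
load_csv($ENV{CART_DSN}, $ENV{CART_USER}, $ENV{CART_PASS},
         'products.csv') if $ENV{CART_DSN};
```

MySQL's `LOAD DATA INFILE` will do the same job faster, but the prepared-statement loop is portable and easy to adapt.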

If you've already designed the shopping cart as a CGI, you have a good basis for experimentation. See how it scales against a 500K CSV on disk, then try moving it to RAM, then load it into MySQL. With a bit of stress testing, you should be able to predict how the cart will scale.
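The disk-versus-RAM comparison is easy to measure with the core Benchmark module. The file name, record layout, and 5000-row stand-in catalogue below are invented for the demo; the real test would use your 500K file and your actual lookup code.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# Build a small stand-in catalogue (yours is ~500K on disk).
my $file = 'demo-products.csv';
open my $out, '>', $file or die "$file: $!";
printf $out "sku%04d,Widget %d,%.2f\n", $_, $_, $_ / 100 for 1 .. 5000;
close $out;

# Strategy 1: scan the file on every request (a naive CGI).
sub scan_file {
    my ($sku) = @_;
    open my $fh, '<', $file or die "$file: $!";
    while (<$fh>) {
        return $_ if /^\Q$sku\E,/;
    }
    return;
}

# Strategy 2: load once into RAM, then do hash lookups.
my %in_ram;
open my $fh, '<', $file or die "$file: $!";
while (<$fh>) {
    my ($sku) = split /,/, $_, 2;
    $in_ram{$sku} = $_;
}
close $fh;

# Worst case for the scan: the SKU near the end of the file.
cmpthese(100, {
    scan_csv => sub { scan_file('sku4999') },
    ram_hash => sub { my $row = $in_ram{'sku4999'} },
});
```

Point the same kind of harness at the full CGI (or hit it with ApacheBench) and you'll have real numbers to extrapolate from instead of guesses.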

If you're going to be renting your cart out, you'll also need to consider availability, which leads to load balancing, which leads to more hardware and a bit more complexity in the software.

perrin has a good writeup on how eToys handled huge loads. See Building a Large-scale E-commerce Site with Apache and mod_perl.