Re: Question on design for FastCGI

by Anonymous Monk
on Oct 15, 2012 at 15:54 UTC


in reply to Question on design for FastCGI

Which of these is the better approach with respect to FastCGI?

It really doesn't matter all that much; you can even have independent FCGI scripts that use modules!

Take PerlMonks, for example: index.pl does everything, but it's really N copies of index.pl running on N different machines -- it's the same code and the same modules running from one script, for maximum benefit from persistence (responsiveness)

The reason we use modules is that they make code reuse and testing individual functions easy

So which parts you choose to keep persistent depends on usage rates/patterns -- keep the most popular stuff persistent, keep response times low ...
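
For instance, a bare-bones FastCGI loop might look something like this (just a minimal sketch using CGI::Fast; My::App::Popular is a made-up placeholder for whatever module holds your most-used code):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Loaded and compiled once, before the request loop, so every
    # request that follows reuses the already-compiled code
    use CGI::Fast;
    use My::App::Popular;   # hypothetical module with your most-used features

    # Each pass through the loop handles one HTTP request
    while ( my $q = CGI::Fast->new ) {
        print $q->header('text/html');
        print My::App::Popular::render_page($q);   # hypothetical handler
    }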


Re^2: Question on design for FastCGI
by Anonymous Monk on Oct 16, 2012 at 13:18 UTC

    As I have only a beginner's grasp of FastCGI, there are some concepts that I'm still grappling with.

    Let's say I go with Approach 1, with the modules imported before the start of the loop. Then later in the loop, one of the subroutines in, say, Module2, is called. Am I right to say that because the code in Module2 persists between responses, calling the subroutine is faster than if it were plain CGI?

      Am I right to say that because the code in Module2 persists between responses, calling the subroutine is faster than if it were plain CGI?

      Yes, it's faster because the module is already loaded and ready to run; no time needs to be spent loading it

      Plain CGI starts a new process for each HTTP request: the program and modules are loaded, the program runs and exits, and the memory is freed

      This takes time. Finding the modules, loading them, compiling, allocating memory ... and freeing it all again -- every step takes time.

      So for 1000 requests under plain CGI, if it takes 1 second for the program/modules to load/unload and 1 second to run, that is 2000 seconds total -- half the time spent just loading and unloading the program

      To a user of your website, this translates to a slow website. It doesn't matter if you and your users have super-fast internet connections, if you're running a load-balancing proxy, if your HTML is optimized to render quickly, or if your program itself is super fast -- it's always going to take at least 1 extra second for every single request

      With FastCGI, it's 1001 seconds: 1000 seconds for the 1000 HTTP requests, plus 1 second to load the program/modules once
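
      Spelled out with those same (made-up) one-second numbers:

          my $requests  = 1000;
          my $load_time = 1;    # seconds to load/compile/unload program + modules
          my $run_time  = 1;    # seconds of actual work per request

          my $cgi_total     = $requests * ( $load_time + $run_time );   # 2000 seconds
          my $fastcgi_total = $load_time + $requests * $run_time;       # 1001 seconds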

      To put it another way: boy/girl scouts/pioneers are taught not to walk around with their knives or leave them lying about, but to stow their knives (back into the sheath) whenever they're not cutting -- that's basically CGI -- great for safety, acceptable when responsiveness isn't a priority

      But this wouldn't work for a chef; it would be way too slow. That's why a chef has a workspace with a chopping block where the knife can stay out -- the knife comes out of the sheath at the start of the day and only goes back in at the end of the day -- that's FastCGI

        Thanks for the clear explanation.

        FastCGI is harder than I had imagined at this point. I suppose I'm too accustomed to how non-persistent code works. Now, when I try to recode parts of the site to use FastCGI, I keep running into problems. For instance, with plain CGI + sessions, after a user logs out and tries to access a logged-in page, my old plain CGI code handles that correctly, so he's prevented from being logged in.

        When I tried to change that bit to use FastCGI, I discovered that by accessing a previously logged-in page after having logged out, a user can become logged in again -- I found that this was probably due to the session being restored or something.

        I have this nagging question and hope you can help enlighten me. Suppose I have some variables that hold different values depending on who's logged in, e.g. the number of points in a game. Am I right to say I should not set variables like this before the FastCGI loop, but should instead get their values for each response within the loop? Because if I set them before the loop (which is what I think is happening), their values are retained due to the persistence, and so different users may see the same values -- or something to that effect. How do I prevent bugs like this from happening?
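
        Here is roughly the kind of thing I mean (My::Session and its methods are just made-up names to illustrate the question):

            use strict;
            use warnings;
            use CGI::Fast;
            use My::Session;   # made-up session module, for illustration only

            # What I think I was doing wrong: computing this once, before the
            # loop, so every later request keeps seeing the first user's value
            # my $points = load_points_somehow();

            while ( my $q = CGI::Fast->new ) {
                # What I think is correct: look the value up fresh per request
                my $session = My::Session->load_from_cookie($q);   # made-up call
                my $points  = $session ? $session->points : 0;

                print $q->header('text/html');
                print "You have $points points\n";
            }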
