Re: Perl and Apache 1.3

by trwww (Priest)
on Oct 11, 2009 at 00:25 UTC ( #800493 )


in reply to Perl and Apache 1.3

We're running into the problem where we've got 20 apache children processes running which keep getting larger and larger as perl outputs our dynamic pages...

Then you've got a memory leak. This usually happens in Perl when there is a circular reference to a variable, so it doesn't get properly refcounted and is never freed. It can also happen when there is a global variable that keeps accumulating data (imagine an array that gets data pushed onto it on every request).
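To make that concrete, here is a minimal sketch of a reference cycle (the variable names are made up); a module like Devel::Cycle can help find these in a real application:

    use strict;
    use warnings;
    use Scalar::Util qw(weaken);

    {
        my $parent = { name => 'parent' };
        my $child  = { name => 'child', parent => $parent };
        $parent->{child} = $child;   # parent -> child -> parent: refcounts never hit zero
    }
    # Both hashes are still alive here, even though the lexicals went out of scope.

    {
        my $parent = { name => 'parent' };
        my $child  = { name => 'child', parent => $parent };
        $parent->{child} = $child;
        weaken($child->{parent});    # break the cycle; everything is freed at scope exit
    }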

As a quick fix, lower the MaxRequestsPerChild to a low number. Maybe 100... or even 10. Otherwise, you're going to have to find the memory leak.

In general, if all of a program's variables are properly scoped, then the memory footprint of the program will not continually grow. It will quickly grow to the amount of memory it needs to perform its task, and then level off.
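As an illustration of the "growing global" case in a persistent environment like mod_perl (the handler and variable names below are invented):

    package My::Handler;    # hypothetical mod_perl 1.x handler, for illustration only
    use strict;
    use warnings;

    our @seen;    # package global: lives as long as the child process does

    sub handler {
        my ($r) = @_;

        push @seen, time();    # one more element on every request -- this child only grows

        my @work = map { $_ * 2 } 1 .. 1000;    # lexical: released when handler() returns

        return 0;    # 0 == OK; use Apache::Constants qw(OK) in real code
    }

    1;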

all of the code is stored in RAM which is allocated to each apache process

If you load code and data into mod_perl before Apache forks its children, then that memory will be shared. How big are the files that make up the application? I can't imagine this part being too much of an issue.
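Under mod_perl 1.x the usual way to do that is a startup script pulled in with PerlRequire before the children are forked. A rough sketch (the path and module list are just examples):

    # startup.pl -- loaded once in the parent via "PerlRequire /path/to/startup.pl"
    use strict;
    use warnings;

    use CGI ();    # preload heavy modules in the parent...
    use DBI ();    # ...so their compiled code is shared copy-on-write by every child

    CGI->compile(':all');    # precompile CGI.pm's autoloaded methods as well

    1;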

Would the size of the file be stored in memory along with the compiled code?

How big are the .pdf files? If they are particularly large, you may want to figure out a way for the web app to hand off the .pdf generation to a separate, short-lived process that runs in its own memory space and returns its memory to the OS when it finishes. After it runs, do an HTTP redirect so Apache serves the file directly.
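One possible shape for that hand-off (the script names and paths here are invented): generate the PDF in its own process, then redirect so Apache serves the static result:

    #!/usr/bin/perl
    # Hypothetical CGI that delegates PDF generation to a short-lived process.
    use strict;
    use warnings;
    use CGI ();

    my $q  = CGI->new;
    my $id = $q->param('report') || 'example';
    $id =~ tr/a-zA-Z0-9_-//cd;    # don't pass raw user input to the filesystem

    # The heavy lifting happens in its own process, and its memory goes back
    # to the OS when it exits (generate_pdf.pl is a made-up helper script).
    system('/usr/local/bin/generate_pdf.pl', $id, "/var/www/html/pdf/$id.pdf") == 0
        or die "PDF generation failed: $?";

    # Redirect so Apache serves the finished file directly.
    print $q->redirect("/pdf/$id.pdf");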

But you definitely have a memory leak if the child httpds are continually growing and growing in memory size.

Re^2: Perl and Apache 1.3
by Heffstar (Acolyte) on Oct 11, 2009 at 17:23 UTC

    Well, I should be more specific: these processes only grow when they have CPU activity associated with them. That would mean that more compiled code is being loaded into the memory of the httpd process, right?

    Also, the associated PDFs are not huge (max 10MB) and the scripts that generate our application are usually around half a MB, but we have one that's about 10MB.

    Since we're talking a lot of Apache configuration, I'll pose a couple of related questions:

    MaxRequestsPerChild, when reached, causes the httpd process to terminate, and if the load is high enough another httpd process will start up, correct?

    Does anyone have any idea how long it would normally take to start up a new process? My boss seems to think that it's up to 20 seconds and that starting a new child is a VERY expensive operation.

    This server is a fairly decent machine though: 3GHz Xeon Dual Core, 4GB. It runs both Apache and MySQL for a few hundred users, but from what I've read, people who know configuration can get away with much, MUCH less...

      Basically wrapping this one up should anyone read this thread down the road...

      I've set the above settings to the following:
      MinSpareServers 5
      MaxSpareServers 10
      StartServers 8
      MaxRequestsPerChild 2500
      MaxClients 100

      The biggest change that I made was turning KeepAlive off. What I found was that with KeepAlive turned on, it would have to reach MaxKeepAliveRequests multiplied by MaxRequestsPerChild (2000 * 10000 = 20,000,000) before killing off the child process and releasing its memory. Just a tad high. With MaxRequestsPerChild set at 2500, my process size reaches ~100MB before the child dies off.

      Also, by turning KeepAlive off, it allowed the server process to immediately serve another client rather than sit around waiting for "KeepAliveTimeout" seconds.

      Thanks for the help monks!
