Samn has asked for the wisdom of the Perl Monks concerning the following question:
Re: General questions on optimizing Perl performance
by RMGir (Prior) on Sep 06, 2002 at 11:41 UTC
Is mod_perl available on your host?
If not, every time your page is accessed a new perl process is started, services the request, then dies. That's a pretty heavy hit for the poor webserver to take.
I think you'd have to adapt your approach to work under mod_perl, though I'm not sure. Check out the mod_perl site.
I'm sure a lot of the other monks can give you more detailed advice.
How much code are we talking about, though? If it's not too bad, don't be totally opposed to the php idea. It could be a good educational exercise, if nothing else.
--
Mike
by Samn (Monk) on Sep 06, 2002 at 11:51 UTC
You can do a lot of damage in 92 lines. For example, even a very small database query is usually heavier than anything else in a Perl script, and opening a database connection takes time. Using a persistent environment like mod_perl takes care of the connection problem, and some caching could help with general performance.
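If you do end up under mod_perl, the usual way to get persistent database connections is Apache::DBI; a minimal httpd.conf sketch (mod_perl 1.x era assumed):

```apache
# httpd.conf -- load Apache::DBI before any script uses DBI, so that
# DBI->connect calls are transparently cached per server child process
PerlModule Apache::DBI
```

With that loaded, your scripts keep calling DBI->connect as usual and get a cached handle back instead of opening a fresh connection each time.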
If you don't have access to mod_perl, you could try to convince your web host to let you run some other things. Options include FastCGI, CGI::SpeedyCGI, and PPerl. You should also learn to use Devel::DProf, which will tell you where your script is spending its time.
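A minimal profiling run looks roughly like this (a sketch; `myscript.pl` is a placeholder for your own script, and Devel::DProf shipped with Perls of this era):

```shell
perl -d:DProf myscript.pl    # writes raw profile data to ./tmon.out
dprofpp                      # reads tmon.out, reports routines by time spent
```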
Hi,
as Mike said, the main problem is the launch of the Perl binary itself. Moving over to mod_perl can save _lots_ of CPU ("system") time. Besides that, parsing the script only once will save CPU ("user") time too, of course.
regards,
daniel
Re: General questions on optimizing Perl performance
by ajt (Prior) on Sep 06, 2002 at 11:50 UTC
Samn
You don't say, but I'm assuming you are running your Perl application via CGI, in which case the server has to fork off a copy of Perl every time a page is viewed, then shut it down once the request is finished. Unless you have some form of caching, this happens for EVERY page view, too. This uses up loads of memory and CPU, and is a good way to make yourself unpopular with the system admin team.
Popular web sites that use Perl a lot use mod_perl, which loads the Perl interpreter once when the server starts and keeps it in memory. This means that a fresh copy of Perl is not started up for every page view.
Chances are you won't have access to mod_perl, so your sys admin has suggested PHP instead, as they probably already have PHP configured for use.
See also: Perl/CGI Performance for a Shopping Cart
--
ajt
by Samn (Monk) on Sep 06, 2002 at 12:01 UTC
Assuming you have mod_perl available, which alas is mostly not the case, well-written scripts can be run in "PerlRun" mode, which runs many CGI scripts more or less unmodified under mod_perl and delivers a measurable performance boost by avoiding all that forking.
Ideally you want to create Apache::Registry scripts, where you can write normal Perl and have Apache/mod_perl compile and cache it for you. You have to make sure you have written good code, as any problems will bring down the whole web server - and fear of this is an oft-quoted reason why most hosting companies won't offer mod_perl.
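For the record, the Registry setup is only a few lines of httpd.conf (a sketch assuming Apache 1.3 with mod_perl 1.x; the /perl path and directory are examples):

```apache
# Run CGI-style scripts under mod_perl: compiled once, then cached in memory
Alias /perl/ /var/www/perl/
<Location /perl>
    SetHandler  perl-script
    PerlHandler Apache::Registry   # or Apache::PerlRun for less tidy scripts
    Options     +ExecCGI
    PerlSendHeader On
</Location>
```

Swapping Apache::Registry for Apache::PerlRun in the PerlHandler line gives the more forgiving "PerlRun" mode mentioned above.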
To get the most out of mod_perl you'd skip the CGI emulation of the above approaches and write Perl handlers directly against the Apache API, which is very cool, and not for the faint of heart.
I don't claim to be a mod_Perl expert, so I'm sure some other more learned monks can offer additional advice. I'd have a look here for a start though:
Update: Extra links added.
--
ajt