opcode caching / speed-up app

by rpike (Scribe)
on Sep 14, 2013 at 11:11 UTC
rpike has asked for the wisdom of the Perl Monks concerning the following question:

Is there a way to speed up my Perl app? I've read a little about opcode caching; is there a way to bypass the compilation phase in Perl? The application was written with little consideration for scalability, because it started as a small intranet app, and now that it is public it has come to its knees as more and more users hit it. Any suggestions on how to speed up the application would be appreciated. Thanks.

Re: opcode caching / speed-up app
by moritz (Cardinal) on Sep 14, 2013 at 11:20 UTC

    The perlperf manual page has some advice.

    In my opinion, the most important part is to profile first, so that you know where the slow parts are. Devel::NYTProf is the best profiler I've ever used, and I can highly recommend it.

    If your application is backed by a database, it might also be worth investigating whether any indexes are missing. Most database servers have the capability to log slow queries, which you should use and act upon.
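    As a concrete starting point, a minimal sketch of running Devel::NYTProf from the command line (app.pl is only an illustrative script name):

        # profile one run of the script; results are written to ./nytprof.out
        perl -d:NYTProf app.pl

        # turn the raw profile into browsable HTML reports (nytprof/index.html)
        nytprofhtml

    The generated report breaks the time down per subroutine and per line, which makes it much easier to see whether parsing, I/O, or something else dominates.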

      Thanks moritz. I tried NYTProf for the first time 2 days ago and it seemed pretty good. The 'database' is actually text files: there are 3-4 XML documents used within the application, and XML::Simple opens and parses them. I appreciate the response.
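      (If the same XML files are parsed over and over, XML::Simple's own Cache option can keep a pre-parsed copy around. A minimal sketch, with config.xml as a purely illustrative file name:)

          use strict;
          use warnings;
          use XML::Simple;

          # Parse the document once, and let XML::Simple keep a Storable-serialised
          # copy of the parsed structure; later calls reload that copy instead of
          # re-parsing, as long as the XML file itself hasn't changed.
          my $data = XMLin(
              'config.xml',            # illustrative file name
              Cache => [ 'storable' ],
          );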
        So it sounds like you're running a CGI that gets compiled on every request and also loads and parses its entire data set every time. That may well be the problem, and something like SQLite could bring a big boost there. And/or, as others have mentioned, FastCGI.
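        A hedged sketch of what the SQLite route could look like via DBI; the app.db file and the items table/columns are hypothetical:

            use strict;
            use warnings;
            use DBI;

            # Open (or create) a local SQLite file; no separate server process needed.
            my $dbh = DBI->connect(
                'dbi:SQLite:dbname=app.db', '', '',
                { RaiseError => 1, AutoCommit => 1 },
            );

            # Fetch only the rows this request needs, instead of slurping and
            # walking a whole XML document on every hit.
            my $rows = $dbh->selectall_arrayref(
                'SELECT id, name FROM items WHERE category = ?',
                { Slice => {} },
                'widgets',               # hypothetical bind value
            );

            print "$_->{id}: $_->{name}\n" for @$rows;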
Re: opcode caching
by ww (Bishop) on Sep 14, 2013 at 11:30 UTC
    Profiling may help. But "... come to its knees" doesn't.
    If I've misconstrued your question or the logic needed to answer it, I offer my apologies to all those electrons which were inconvenienced by the creation of this post.
Re: opcode caching / speed-up app
by tobyink (Abbot) on Sep 14, 2013 at 14:28 UTC

    Given that you say it's an intranet app, I'll assume it's web-based. There are essentially two ways Perl can power a web-based app: via CGI, or via some sort of persistent mechanism (FastCGI, mod_perl, Perl-native web servers).

    If you're using a persistent mechanism, a single Perl process (and thus a single compilation phase) serves multiple (potentially millions of) requests. So speeding up the compilation phase serves no purpose at all.

    If you're not using a persistent mechanism, then a new Perl process is spawned for every request, which has to parse and compile your script, along with all the modules you're using. In this case, an opcode cache would give you some extra performance; however, switching to a persistent mechanism would give you a far bigger boost.
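    For the persistent route, a minimal FastCGI sketch using CGI::Fast; the HTML body is illustrative, and the web server still needs FastCGI support (e.g. mod_fcgid) configured:

        use strict;
        use warnings;
        use CGI::Fast;

        # Compilation happens once, when the FastCGI process starts; this loop
        # then serves one request per iteration for the life of the process.
        while (my $q = CGI::Fast->new) {
            print $q->header('text/html');
            print "<p>Hello from a persistent Perl process (pid $$)</p>\n";
        }

    Run outside a FastCGI environment, the loop simply executes once, so the same script can still be tested as a plain CGI.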

Re: opcode caching / speed-up app
by TJPride (Pilgrim) on Sep 14, 2013 at 18:53 UTC
    Sounds to me like the major issue is probably the XML parsing. Depending on what you're doing, you may want to switch to a relational database. Alternatively, you can just add more servers and distribute the load.

    Is this a secret or can you link us to the actual app so we can get a better idea of where the hang-up may be and how you can improve things?

      Sorry, I won't be able to provide a link to it. I appreciate the advice. I'll look more into profiling, but I'm sure some sort of FastCGI implementation will be needed at some point. Would you happen to have a good (simplified) link on how to use FastCGI? The apps are tested and used on Windows and later installed on both Windows and Linux. Thanks again.
        The work I do is more low-volume, high-processor utility runs, and in the instances where I have to deal with high-volume traffic, I use PHP instead (yes, this is a Perl site, but both languages have their strengths and weaknesses, imho). Someone who does more pure-Perl web work will have to supply you with info on FastCGI.
