Profiling a large program
by keymon (Beadle) on Oct 27, 2006 at 15:01 UTC
keymon has asked for the wisdom of the Perl Monks concerning the following question:
I come here seeking some tips on how to profile a rather large piece of Perl code (hundreds of classes) that runs under Apache with mod_perl. This is an internal application that has grown over the years and now threatens to eat everyone alive (ok, not really :) ). It is slow, and there are severe memory leaks. I would like to figure out which methods are the most time-consuming, and where the memory is being leaked.
If it were a standalone script, I could use Devel::DProf and slog my way through its output. But are there any better ways?
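For context, the standalone workflow I mean is the usual one, and under mod_perl there is Apache::DProf (from the Apache-DB distribution), which runs each Apache child under Devel::DProf. A sketch of the config side, assuming that distribution is installed (exact output paths vary by setup):

```
# httpd.conf sketch (assumes the Apache-DB distribution is installed)
PerlModule Apache::DProf

# Each child then writes a Devel::DProf tmon.out under the server's
# log area, one directory per child PID. Summarize afterwards with:
#   dprofpp path/to/tmon.out
```

This at least gets per-child profiles without touching the application code, though it slows the server noticeably while enabled.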
Consider the memory leaks, for example. Is there a way to dump the allocated objects periodically and see which ones are never freed?
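To frame what I mean by "dump the allocated objects": one rough census technique is to override bless globally and tally constructions per class. This is only a sketch, and the package name ObjectCensus is made up for illustration; it must be compiled before the code under observation (e.g. loaded first via PerlModule), and it counts objects created, not objects still live, unless you also hook DESTROY:

```perl
package ObjectCensus;    # illustrative name, not a CPAN module
use strict;
use warnings;

my %created;

# Must be compiled before the code you want to watch, because
# CORE::GLOBAL overrides only affect code compiled afterwards.
BEGIN {
    *CORE::GLOBAL::bless = sub {
        my ($ref, $class) = @_;
        $class = caller() unless defined $class;   # bless's default
        $created{$class}++;
        return CORE::bless($ref, $class);
    };
}

# Dump the tally, biggest first; call periodically, e.g. from a
# $SIG{USR1} handler or a mod_perl cleanup phase.
sub dump_counts {
    for my $class (sort { $created{$b} <=> $created{$a} } keys %created) {
        print STDERR "$class: $created{$class}\n";
    }
}

1;
```

Comparing two dumps taken a few thousand requests apart should make the ever-growing classes stand out, even without exact live counts.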
As for runtime: is there a way to add instrumentation at a very low level (instead of modifying each method in each class, one by one) to get a list of the top resource hogs?
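By "low level" I mean something like walking the symbol table and wrapping every sub in a package with timing code, so no class needs hand-editing. A rough sketch (SubTimer is an illustrative name; note this also wraps imported subs, so the numbers need a grain of salt):

```perl
package SubTimer;    # illustrative name, not a CPAN module
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

my %elapsed;

# Replace every named sub in $pkg with a wrapper that accumulates
# wall-clock time under the sub's fully qualified name.
sub instrument_package {
    my ($pkg) = @_;
    no strict 'refs';
    no warnings 'redefine';
    for my $name (keys %{"${pkg}::"}) {
        my $full = "${pkg}::${name}";
        next unless defined &{$full};     # skip non-subs
        my $orig = \&{$full};
        *{$full} = sub {
            my $t0  = [gettimeofday];
            my @ret = wantarray ? $orig->(@_) : scalar $orig->(@_);
            $elapsed{$full} += tv_interval($t0);
            return wantarray ? @ret : $ret[0];
        };
    }
}

# Print accumulated time per sub, worst offenders first.
sub report {
    printf STDERR "%-50s %.3fs\n", $_, $elapsed{$_}
        for sort { $elapsed{$b} <=> $elapsed{$a} } keys %elapsed;
}

1;
```

Calling SubTimer::instrument_package for each class at server startup, then SubTimer::report at child exit, would give a crude top-N list without touching the classes themselves.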
Your help, advice and perls(!) of wisdom would be greatly appreciated.