Well, there are lots of non-free tools to do what you want as
well as loads of "theories" (and some of them are loads).
One of the more reasonable approaches is the McCabe Complexity
Metric. Basically, for each function (method, procedure, etc.):

- Start with 1 for the "straight" path through the function
- Add 1 for each for, if, and while, and for each and/or operator
- Add 1 for each case in a case statement
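The counting rules above can be sketched in a few lines. This is a minimal illustration in Python rather than Perl (the thread's language), using the stdlib ast module; the function name and sample source are my own, and it only covers the if/for/while/and/or rules from the list (Python has no case statement in older versions):

```python
import ast

def cyclomatic_complexity(func_source):
    """McCabe complexity of a snippet: 1 for the straight path,
    +1 per if/for/while, +1 per and/or operator."""
    tree = ast.parse(func_source)
    complexity = 1  # the "straight" path
    for node in ast.walk(tree):
        if isinstance(node, (ast.If, ast.For, ast.While)):
            complexity += 1
        elif isinstance(node, ast.BoolOp):
            # 'a and b and c' is one BoolOp with 3 values -> 2 operators
            complexity += len(node.values) - 1
    return complexity

src = """
def classify(x):
    if x > 0 and x < 10:
        return 'small'
    for i in range(x):
        if i % 2:
            return 'odd'
    return 'other'
"""
print(cyclomatic_complexity(src))  # prints 5
```

Here the score is 5: the straight path, two ifs, one for, and one 'and'.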

The number you come up with is the "complexity" of the
function. Average the complexity of all the functions and
you have the complexity of the codebase. The idea is that
the lower the number, the less complex the code, and the
less complex the code, the better your chances of high
quality. There's a lot more to it than that (for example,
if a function has a complexity of 1, does it really need
to be its own function?).
I don't know of a tool to do this for Perl, but there are
tools on freshmeat for other languages.

While strict adherence to complexity metrics can drive you
crazy, they actually fit nicely into the programming mantra of
high cohesion, low coupling. There's pretty strong evidence
that if a function does a lot of conditionals, it probably
has low cohesion.

-derby
