PerlMonks  

Re: Programs/Methods for analyzing an existing Perl based system

by derby (Abbot)
on May 30, 2002 at 02:14 UTC (#170280=note)


in reply to Programs/Methods for analyzing an existing Perl based system

Well, there are lots of non-free tools to do what you want, as well as loads of "theories" (and some of them are loads). One of the more reasonable approaches is the McCabe complexity metric. Basically, looking at each function (method, procedure, etc.):

  • Start with 1 for the "straight" path through the function.
  • Add 1 for each for, if, while, and, and or.
  • Add 1 for each case in a case statement.
The number you come up with is the "complexity" of the function. Average the complexity of all the functions and you have the complexity of the codebase. The idea here is that the lower the number, the less complex the code, and the less complex the code, the better the chances for higher quality. There's more to it than that (for instance, if a function has a complexity of 1, does it really need to be its own function?). I don't know of a tool that does this for Perl, but there's one on freshmeat for other languages.
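The counting rules above are simple enough to sketch in a few lines. Here's a minimal illustration of the idea (in Python rather than Perl, using a naive regex token count instead of a real parser, so it will miscount keywords that appear inside strings, comments, or regexes; the function name is my own, not from any existing tool):

```python
import re

def mccabe_complexity(perl_source):
    """Naive McCabe count for one Perl function: 1 for the straight
    path, plus 1 per branching keyword or short-circuit operator."""
    complexity = 1
    # for/foreach, if/elsif/unless, while/until each add a path
    complexity += len(re.findall(
        r'\b(?:if|elsif|unless|while|until|for|foreach)\b', perl_source))
    # and/or/&&/|| short-circuit, so each adds a path too
    complexity += len(re.findall(r'&&|\|\||\band\b|\bor\b', perl_source))
    return complexity

clamp = '''
sub clamp {
    my ($n, $lo, $hi) = @_;
    return $lo if $n < $lo;
    return $hi if $n > $hi;
    return $n;
}
'''
print(mccabe_complexity(clamp))  # 1 straight path + 2 ifs = 3
```

A real tool would walk the parse tree instead of grepping for keywords, but the arithmetic is exactly this simple.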

While strict adherence to complexity metrics can drive you crazy, they actually fit nicely with the programming mantra of high cohesion, low coupling. There's pretty strong evidence that a function doing a lot of conditionals probably has low cohesion.

-derby


Re: Programs/Methods for analyzing an existing Perl based system
by Abigail-II (Bishop) on May 30, 2002 at 12:10 UTC
    There's something fundamentally wrong with measuring complexity through low-level analysis of code and using the outcome to judge the quality of that code.

    Most people will agree that the grammar of Shakespeare's musings is much more complex than that of Dr. Seuss's books. Does that mean the children's books are of higher quality than the plays?

    There are other problems as well. Such an analysis can only focus on a particular implementation; it casts no judgement on the choice of algorithm. It will favour a linear search of a sorted array over a binary search, because the linear search requires fewer conditions to implement.
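    The linear-vs-binary point is easy to make concrete. Scoring two ordinary search routines by the counting rules from the parent node (a sketch of my own, in Python for brevity, not anything from the thread):

```python
def linear_search(xs, target):
    # 1 for the straight path...
    for x in xs:            # +1
        if x == target:     # +1
            return True
    return False            # McCabe complexity: 3

def binary_search(xs, target):
    # 1 for the straight path...
    lo, hi = 0, len(xs) - 1
    while lo <= hi:             # +1
        mid = (lo + hi) // 2
        if xs[mid] == target:   # +1
            return True
        elif xs[mid] < target:  # +1
            lo = mid + 1
        else:
            hi = mid - 1
    return False                # McCabe complexity: 4

# The "simpler" function by this metric is the worse algorithm on a
# sorted array: linear search is O(n), binary search is O(log n).
```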

    It doesn't mean you shouldn't use such a tool. It just means that you have to be very careful with what you do with its results.

    Abigail

      A2,

      There's something fundamentally wrong with measuring complexity based on low level analyses of code and using the outcome to judge the quality of code

      Didn't think I was. Everything else++. Except the part about Shakespeare and Seuss - that's just silly.

      -derby
