Re^3: RFC: Exploring Technical Debt
by whakka (Hermit) on Sep 25, 2009 at 02:27 UTC
To address the issue of terminology: the "debt" is simply all future productivity losses from "hacking" a short-term solution to a problem. You (the company or project owner) are "borrowing" from your developers' future productivity to increase their productivity today. Productivity (output/time) can be measured in value.
In terms of a loan, the "principal" is the short-term productivity gain from hacking solutions. The "interest rate" is the percentage loss in future productivity, or the added expense of maintenance in percentage terms. It may be fixed over time, if fixing bugs and adding features becomes more expensive by some constant factor. More likely it compounds over time, though, as hacks are built upon hacks (and is therefore a variable, increasing rate).
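A back-of-the-envelope sketch of the fixed-versus-compounding difference (all numbers here are made up for illustration - a 10% "interest rate" on a feature whose maintenance would otherwise cost 100 per period):

```python
# Toy model: total *extra* maintenance cost over N periods for one hacked
# feature, comparing a fixed surcharge with a compounding one (hacks on hacks).
# The base cost, rate, and horizon are hypothetical.

def extra_maintenance(base_cost, rate, periods, compounding):
    """Sum of added maintenance cost over `periods` at `rate` per period."""
    total = 0.0
    for t in range(1, periods + 1):
        if compounding:
            # Each period's surcharge grows as hacks accrete on earlier hacks.
            total += base_cost * ((1 + rate) ** t - 1)
        else:
            # A constant surcharge per period.
            total += base_cost * rate

    return total

fixed = extra_maintenance(100, 0.10, 10, compounding=False)
compound = extra_maintenance(100, 0.10, 10, compounding=True)
print(f"fixed surcharge:      {fixed:.0f}")
print(f"compounding surcharge: {compound:.0f}")
```

With these (invented) numbers the compounding case costs several times the fixed case over ten periods, which is the whole force of the "hacks built upon hacks" point.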
The interest matters more for medium- to long-term projects. It also explains why I read so often that salaried employees hate consultants setting up a system: the consultants are in essence borrowing from the future productivity of another company's employees, and yet they get to reap the short-term productivity gains in the form of getting paid for delivering on time - what a deal for them!
As a framework for pricing the loan, I suggested thinking about it in reverse: rather than acquiring technical "debt" you are implicitly forgoing technical "investments" - spending time today to realize greater output per unit time in the future. This is opportunity cost, and it has value. The choice becomes whether or not to make the investment.
Anti-regressive measures like refactoring have, in a sense, zero short-term productivity but positive long-term productivity. If short-term output is more important to a company, then it has a higher discount rate, and losses in short-term productivity must therefore be compensated by greater gains in long-term productivity.
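To make the discount-rate point concrete, here is a toy net-present-value calculation for the refactoring decision. Every figure is hypothetical: a refactoring that costs 100 today and returns a productivity gain of 20 per period for ten periods.

```python
# Sketch of the investment framing: pay `cost_today` now for a stream of
# future productivity gains, discounted at rate `rate` per period.

def npv(cost_today, gain_per_period, periods, rate):
    """Net present value of the refactoring 'investment'."""
    return -cost_today + sum(
        gain_per_period / (1 + rate) ** t for t in range(1, periods + 1)
    )

# A patient company (low discount rate) vs. a cash-strapped start-up (high rate).
print(npv(100, 20, 10, 0.05))  # positive NPV: the refactoring pays off
print(npv(100, 20, 10, 0.30))  # negative NPV: hack now, pay later
```

The same investment flips from worthwhile to not worthwhile purely as a function of the discount rate, which is why a start-up facing bankruptcy risk can rationally take on debt that a stable company should refuse.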
There are two types of risk - one affects the estimates of the cost of the loan (return on investment) and the other the discount rate (time value of money). For the former, a big risk is staff turnover, not just in terms of employment but at the project level. If the developers of a system leave after it ships, future productivity losses from such loans are likely to be far greater than if people with intimate knowledge of the system had been retained. For the latter, start-ups have a higher bankruptcy risk and therefore a higher discount rate. There are business and economic risks as well, such as lost market share and shocks to the labor market.
I don't think there can ever be a formal model since it's an impure science. My post was merely an outline of the required steps to perform a traditional financial calculus of the decision to take on a single debt item. It implied a lot of intermediate estimation and guesswork and for some loans it may not produce reasonable values.
And it can be argued, no matter the method used, that it's impossible to capture the positive externalities realized by committing to best practices. Developers are people, after all, and people are arguably more motivated by being in a culture that cares about quality work and low-level input.