For many people, the average number of lines of code (LOC) produced per day seems like a silly metric, and I wonder how it actually gets measured. For example, if one programmer produces twice as many LOC as another but five times as many bugs, is the first programmer more productive? Possibly, but only if the second programmer's bug rate is around one fifth of the average.
For my current project, I determined that I was producing about 150 lines of code a day, more or less. I actually felt pretty happy with that, given that the requirements were very sketchy and I had to repeatedly rewrite large portions of the code. I also think the code is reasonably bug-free thanks to a fairly decent set of tests, but what's a success today may prove a failure tomorrow as the client decides they need something done a little differently.
Unfortunately, I don't know whether 150 is really productive, and some days I found myself with a negative count as I pulled out code that wasn't needed. Perhaps function points would be a better metric? I'm not sure. The end customer was playing a shell game with requirements, and several times I made sweeping changes to the code to implement a minor but very annoying change. How do you really measure personal productivity and, more importantly, how do you justify it to a client? My current client has been fairly understanding about the requirements problem, but I do have concerns about issues like this recurring in the future.
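To see how a "negative day" arises, think of net LOC as additions minus deletions summed over a day's changes. A minimal sketch (the per-change numbers are made up for illustration):

```python
def net_loc(changes):
    """Sum net lines over a day's edits, given (added, removed) pairs."""
    return sum(added - removed for added, removed in changes)

# Hypothetical day: two small additions, then a large unneeded block pulled out.
day = [(120, 10), (30, 5), (0, 250)]
print(net_loc(day))  # -115
```

A day like this ends up well below zero even though real work was done, which is exactly why raw LOC counts mislead.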