|Perl Monk, Perl Meditation|
This meditation is inspired by two recent things that I learned about. One involved lizards; the other relates to one of the common barriers that people have to becoming more competent.
As I say, the first involves lizards. While I was visiting a sister, she taught me how to catch them. If you don't know the trick, this is pretty hard: lizards are very alert, and when you get near they hide very quickly. But the trick makes it easy. It turns out that lizards are incapable of perceiving a blade of grass as a threat. So you pick a long blade of grass, tie a noose, and hook the lizard. Which still isn't easy, but this time the lizard cooperates. If the noose is too small, you can bonk it a fair amount and it will just sit there. If the noose is too large, it cooperatively steps through it and then waits while you adjust the noose and try again. I invite anyone who lives in an area with grass and lizards to try this.
It doesn't matter how good the lizard's reflexes are. It literally can't comprehend that this grass is different. It is nasty! Vicious! The grass is attacking! The lizard cannot see the threat, and so is easily trapped.
Of course it is trivial to say the same about people. What people refuse to see can still harm them, and can do so repeatedly. It is tempting to fill in a rant about organizational blindness, but I just ran across something which I think is a lot more interesting. (This might be because it is new to me, so I am only now noticing the things that relate to it.)
Something that I noticed a long time ago is that people who think they are good at something usually aren't. For a random link on this, see Why the ignorant are blissful. Conversely, people who actually are good at something generally are painfully aware of their own shortcomings. This is often observed; just read through the comments at On Hubris to verify it. And it doesn't just happen in programming. I have seen this in many fields; for instance, read Jim Collins' observations on the most effective CEOs. Notice something familiar? :-) (For more detail than that blurb, the book Good to Great presents the conclusions of the study he talked about.)
This is old hat. I have known about this for years. There are easily guessable reasons why it should be so. What is different is that I just gained context for a non-trivial phenomenon that was not obvious (at least to me), and that connects it to things I wouldn't have thought of. I got this insight from the book I am currently reading, The Psychology of Computer Programming (Silver Anniversary Edition) by Gerald M. Weinberg. In it he discusses the following experiment:
Two groups of subjects are asked to write an essay arguing in favour of some point with which they feel strong disagreement. One group is paid one dollar apiece to write this argument against their own opinions; the other is paid twenty dollars apiece. At the end of the experiment, the subjects are retested on their opinions of the matter. The question is which group is more likely to change their opinions. Before you go on, stop and think about it and make up your own mind.
Most people will say it is the group that is paid more. That is what I said, and my thinking was, Well they are paid more, so they will be more diligent, and perhaps they succeed in convincing themselves.
Now let me give the theory behind what actually happens.
One thing that we tend to have a lot of trouble with in making predictions is that systems tend to sit in stable equilibria - that is, there is a status quo, and any attempt to change it provokes a corrective force. This confuses us, because we think, "A causes B, so more A will result in more B." But more B causes a reaction that gets rid of B, and our change shows up in some entirely unanticipated way. For a random instance, this article does a good job of explaining why safer cars result in worse drivers rather than more safety. (I found that in the Economics and Security Resource Page.)
As I say, various kinds of systems all around us maintain themselves in equilibrium. (There are several such systems in your body - without them you would die!) But the one of interest at the moment is that we strongly tend to maintain a positive self-image. I first remember this being put as, Nobody thinks of themselves as a jerk. Particularly if they are. A more prosaic way to understand this is that we have egos, and will engage in any kind of behaviour needed to protect them. We will revise history, get into endless arguments, make up stories about why we did things, and resort to virtually any other kind of rationalizing behaviour that you can imagine (or have ever seen someone engage in).
Now back to the experiment. We have had our subjects write something which they strongly disagree with. This creates a conflict that they have to get out of, namely, Why did I write something I disagree with? (This kind of conflict is called a "cognitive dissonance".) Now the people who were paid $20 (usually college students - to whom $20 is quite a bit) have a ready-made excuse that has the advantage of being true, namely, I lied because I was paid to. But the people who are paid a dollar have a problem: they aren't being paid enough to excuse writing something they disagree with. So they find an alternate escape route. One obvious one is, ...well, maybe what I wrote wasn't so wrong after all? And they change their minds.
This is not just a theory. Psychologists have actually run this experiment, and it is the group paid less whose opinions shift.
Gerald Weinberg's book goes on to tie this to a commonly observed tendency while programming. Programmers have a strong tendency to associate themselves with their code. This ties the code to the programmer's ego. But then anything wrong with the code is an indictment of the programmer! The code can't have bugs, it must be Perl misbehaving. Or my module. Perhaps Windows isn't working? I'll bet it's those stupid users again! The program is perfect. It must be; after all, I wrote it! And it works! (Honestly, who has never felt these things? Even if you won't publicly admit to it right now?)
This tendency is very natural. But it makes debugging very difficult. It makes it even harder to accept constructive criticism. In fact just getting some programmers to give their code to someone who might disagree on its quality can be like pulling teeth.
What Weinberg suggests we do about this is engage in what he calls "egoless programming". Essentially, this means organizing ourselves into groups which go out of their way to disassociate the programmer from the code, and whose members constantly submit their code to each other for critical review. Unsurprisingly this works - another person generally has less trouble spotting obvious problems and is able to be much more critical. Furthermore, people who get feedback like this quickly learn what they need to work on and can progress quite rapidly.
This idea is what lies behind the practice of code reviews. There is quite a bit of literature on how to do code reviews well. As the above indicates, the primary problem faced is that the programmer is likely to tie ego to code, so it is critical to make comments about the code, not the programmer, and to try not to bruise the fragile (and vigorously defended) ego. If you can succeed, the rewards are great. As noted in Rapid Development, code reviews properly implemented are the most cost-effective method of debugging known - and that is before the obvious training benefits are considered. Unfortunately, try as you might, a certain number of people simply cannot distance themselves from their code and will flame out...
I think that this applies in an obvious way to the issue of competence and (over)confidence. Whatever people take pride in gets tied to their egos. Once something is tied to your ego, any useful feedback on how to improve is going to run into strong defence mechanisms. But, of course, if you cannot accept and integrate good feedback, then you can't possibly improve. And no matter how good your native talent might be, anyone who can get and remain on a good learning curve is going to pass you fairly quickly, and won't ever look back.
Confidence leads to being trapped at your current (in)competence. And you will refuse to see this... forever if need be.
Update The Mad Hatter noted that I said "is too small" twice, and the second time meant "is too large". Fixed. Then podmaster noted that I needed to insert "is" in "this often observed". Fixed. podmaster (again) pointed out that I wanted "thinks" in "Nobody things of themselves". Fixed. particle pointed out that it would be better if I credited Gerald M. Weinberg up front. Fixed. valdez points out that I should say that "no animals were harmed during the making of this node." Good point, but I admit that some weren't entirely happy about my background research. :-) CukiMnstr pointed out that I have a tendency to write "tendancy". Fixed. CRConrad (a friend from IWETHEY) pointed out that the singular of phenomena is phenomenon. Fixed. He also pointed out that it does not work on ALL lizards, though anyone who wishes to verify that on a Komodo Dragon or a crocodile can be my guest. Noted. Also another poster from IWETHEY claims that it worked on his turtle. :-)