What you refuse to see, is your worst trap

by tilly (Archbishop)
on Jun 30, 2003 at 05:26 UTC

This meditation is inspired by two recent things that I learned about. One involves lizards; the other relates to one of the common barriers that people have to becoming more competent.

As I say, the first involves lizards. While I was visiting a sister, she taught me how to catch them. If you don't know the trick this is pretty hard: lizards are very alert, and when you get near they hide very quickly. But the trick makes it easy. It turns out that lizards are incapable of perceiving a blade of grass as a threat. So you pick a long blade of grass, tie a noose, and hook the lizard. Which still isn't easy, but this time the lizard cooperates. If the noose is too small, you can bonk it a fair amount and it will just sit there. If the noose is too large, it cooperatively steps through it and then waits for you to adjust the noose and try again. I invite anyone who lives in an area with grass and lizards to try this.

It doesn't matter how good the lizard's reflexes are. It literally can't comprehend that this grass is different. It is nasty! Vicious! The grass is attacking! The lizard cannot see the threat, and so is easily trapped.

Of course it is trivial to say the same about people. What people refuse to see can still harm them, and can do so repeatedly. It is tempting to fill in a rant about organizational blindness, but I just ran across something which I think is a lot more interesting. (This might be because it is new to me, so I am just realizing the things that relate to it.)

Something that I noticed a long time ago is that people who think they are good at something usually aren't. For a random link on this, see Why the ignorant are blissful. Conversely, people who actually are good at something generally are painfully aware of their own shortcomings. This is often observed; just read through the comments at On Hubris to verify it. And it doesn't happen just in programming. I have seen it in many fields; for instance, read Jim Collins' observations on the most effective CEOs. Notice something familiar? :-) (For more detail than that blurb, the book Good to Great presents the conclusions of the study he talked about.)

This is old hat. I have known about this for years. There are easily guessable reasons why it should be so. What is different is that I just learned about a non-trivial phenomenon that was not obvious (at least to me) and that connects this observation to things I wouldn't have thought of. I got this insight from the book I am currently reading, The Psychology of Computer Programming (Silver Anniversary Edition), by Gerald M. Weinberg. In it he discusses the following experiment:

Two groups of subjects are asked to write an essay arguing in favour of some point with which they feel strong disagreement. One group is paid one dollar apiece to write this argument against their own opinions, the other is paid twenty dollars apiece. At the end of the experiment, the subjects are retested on their opinions of the matter.
The question is which group is more likely to change their opinions. Before you go on, stop and think about it and make up your own mind.

Most people will say it is the group that is paid more. That is what I said, and my thinking was: well, they are paid more, so they will be more diligent, and perhaps they succeed in convincing themselves.

Now let me give the theory behind what actually happens.

One thing that we tend to have a lot of trouble with in making predictions is that systems tend to sit in stable equilibria - that is, there is a status quo, and any attempt to change it causes a corrective force. This confuses us, because we think, "A causes B, so more A will result in more B." But more B causes a reaction that gets rid of B, and our change shows up in some entirely unanticipated way. For a random instance, this article does a good job of explaining why safer cars result in worse drivers rather than more safety. (I found that in the Economics and Security Resource Page.)

As I say, various kinds of systems all around us maintain themselves in equilibrium. (There are several such in your body - without which you would die!) But the one of interest at the moment is that we strongly tend to maintain a positive self-image. I first remember this being put as, Nobody thinks of themselves as a jerk. Particularly if they are. A more prosaic way to understand this is that we have egos, and will engage in any kind of behaviour needed to protect them. We will revise history, get into endless arguments, make up stories about why we did things, and virtually any other kind of rationalizing behavior that you can imagine (or have ever seen someone engage in).

Now back to the experiment. We have had our subjects write something which they strongly disagree with. This creates a conflict that they have to get out of, namely, Why did I write something I disagree with? (This kind of conflict is called a "cognitive dissonance".) Now the people who were paid $20 (usually college students - to them $20 is quite a bit) have a ready-made excuse that has the advantage of being true, namely, I lied because I was paid to. But the people who were paid a dollar have a problem: a dollar is not enough to excuse writing something they disagree with. So they find an alternate escape route. One obvious one is, ...well maybe what I wrote wasn't so wrong after all? And they change their minds.

This is not just a theory. Psychologists have actually taken experimental guinea pigs, er, college students, and done this experiment. And the ones who are paid $1 change their minds more often than the ones paid $20. They won't, of course, say that they did it to protect their egos. They have much better rationalizations than that! But that is believed to be what they did.

Gerald Weinberg's book goes on to tie this to a commonly observed tendency while programming. Programmers have a strong tendency to associate themselves with their code. This ties the code to the programmer's ego. But then anything wrong with the code is an indictment of the programmer! The code can't have bugs, it must be Perl misbehaving. Or my module. Perhaps Windows isn't working? I'll bet it's those stupid users again! The program is perfect. It must be, after all I wrote it! And it works! (Honestly, who has never felt these things? Even if you won't publicly admit to it right now?)
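
To make this concrete, here is a minimal sketch of the kind of bug that invites the "Perl is misbehaving" reaction (an illustration only, not an example from Weinberg's book):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # '==' compares numbers, so both strings quietly numify to 0 and
    # the test always "passes". The language is fine; the operator is wrong.
    my $status = "failed";
    if ( $status == "passed" ) {    # should be: $status eq "passed"
        print "See? Perl must be broken!\n";
    }

With warnings enabled, Perl even points the finger back at us ("Argument "failed" isn't numeric in numeric eq"). The instinctive first reading is that something is wrong with Perl; the egoless reading is that something is wrong with my comparison.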

This tendency is very natural. But it makes debugging very difficult. It makes it even harder to accept constructive criticism. In fact just getting some programmers to give their code to someone who might disagree on its quality can be like pulling teeth.

What Weinberg suggests we do about this is engage in what he calls "egoless programming". Essentially this is to organize ourselves into groups which go out of their way to disassociate the programmer from the code, and which constantly submit their code to each other for critical review. Unsurprisingly this works - another person generally has less trouble spotting obvious problems and is able to be much more critical. Furthermore, people who get feedback like this quickly learn what they need to work on and can progress quite rapidly.

This idea is what lies behind the practice of code reviews. There is quite a bit of literature on how to do code reviews well. As the above indicates, the primary problem faced is that the programmer is likely to tie ego to code, so it is critical to make comments about the code, not the programmer, and to try not to bruise the fragile (and vigorously defended) ego. If you can succeed, the rewards are great. As noted in Rapid Development, code reviews properly implemented are the most cost-effective method of debugging known - and that is before the obvious training benefits are considered. Unfortunately, try as you might, a certain number of people simply cannot distance themselves from their code and will flame out...

I think that this applies in an obvious way to the issue of competence and (over)confidence. Whatever people take pride in gets tied to their egos. Once something is tied to your ego, any useful feedback on how to improve is going to run into strong defence mechanisms. But, of course, if you cannot accept and integrate good feedback, then you can't possibly improve. And no matter how good your native talent might be, anyone who can get and remain on a good learning curve is going to pass you fairly quickly, and won't ever look back.

Confidence leads to being trapped at your current (in)competence. And you will refuse to see this... forever if need be.

Exercises:

  1. What does this have to do with parents who can't believe that their darling angels could have possibly (fill in the blank)?
  2. Have you ever had your code criticized? How did it feel?
  3. Think of a technical flamewar you have seen. Can you recognize how defence of egos fueled the flames? Was there a useful resolution after that happened?
  4. Think of a heated argument that you got into which went nowhere. Can you understand how the other person's ego was involved? (It is probably useless to try to see the challenges to your own ego - you have already rationalized them away. But if you want to try, go to whoever argued with you and ask how your ego was involved...) Can you think of a way you could have sidestepped that issue?
  5. Pick someone you know well. Can you think of anything which they might have trouble learning because their egos get in the way? Try the same exercise with yourself. Was that harder to answer? (Only volunteer the first part of your answer to the person you chose if you want to get into a fight...)
  6. Can you think of a way in which protection of egos affects the behaviour of people around you which I didn't talk about in this post?
A personal note. One mistake that I find I make along these lines is that I tend to focus on what _really_ happened. But if either I or the other person revised that history for ego reasons, an "unrevising" is unlikely to happen because it would re-introduce the cognitive dissonance. A technique which I am working on is to stop talking about actual events, and focus on my feelings about said events. The events can readily be disputed; the reality of my feelings is harder to dispute. And yes, this is part of my personal answer to exercise 4... :-)

Update The Mad Hatter noted that I said "is too small" twice, and the second time meant "is too large". Fixed. Then podmaster noted that I needed to insert "is" in "this often observed". Fixed. podmaster (again) pointed out that I wanted "thinks" in "Nobody things of themselves". Fixed. particle pointed out that it would be better if I credited Gerald M. Weinberg up front. Fixed. valdez points out that I should say that "no animals were harmed during the making of this node." Good point, but I admit that some weren't entirely happy about my background research. :-) CukiMnstr pointed out that I have a tendency to write "tendancy". Fixed. CRConrad (a friend from IWETHEY) pointed out that the singular of phenomena is phenomenon. Fixed. He also pointed out that it does not work on ALL lizards, though anyone who wishes to verify that on a Komodo Dragon or a crocodile can be my guest. Noted. Also another poster from IWETHEY claims that it worked on his turtle. :-)

Re: What you refuse to see, is your worst trap
by chunlou (Curate) on Jun 30, 2003 at 07:29 UTC
    Another flip side of the "ego" story is that it's not easy to explain to someone from a nonengineering background why a programmer who wrote the code should not test it himself. They don't see why a programmer would be "blind" to his own code.

    Peer code review is a good practice. Nonetheless, I have never found it possible when the programmers have been worked to their limit (coming in before dawn or leaving late - especially hard for those married with kids).

    In heated debate situations, I often build on the other party's arguments to construct a series of questions which lead the person to derive the contradiction himself, if there are flaws in the arguments. This works pretty well with any decently intelligent person, however stubborn in the moment (provided you're cool-headed yourself).

    I feel that in order for someone to be willing to listen to you at all, it's best to speak in his own arguments, terminology and language; that way, you become a Trojan horse arguing from within his ego instead of from without.

    Besides, rephrasing what someone said in your own language minimizes misunderstanding (which is especially important during requirements elicitation).
      Regarding "in heated debate"... Oh, yeah. All the time with code. Generally these are the debates that end with somebody suddenly stopping in mid-sentence, thinking really hard, yelling "Fuck!" as they hit themselves in the forehead, and stomping back to a keyboard to fix something.
Re: What you refuse to see, is your worst trap
by crouchingpenguin (Priest) on Jun 30, 2003 at 12:46 UTC

    Something that I noticed a long time ago is that people who think they are good at something usually aren't.

    Unskilled and Unaware of It has an interesting take on that. Specifically, it defines metacognition (coupled with metamemory, metacomprehension, and self-monitoring) as a set of skills allowing one to assess one's own performance and accuracy. For example, a novice programmer is ill-equipped to gauge himself. Only further experience (or criticism from journeymen and/or master programmers... or even from PM) will raise his self-awareness concerning his skills and abilities. The article doesn't state definitively, however, that inflated self-assessment comes from incompetence.

    ... people who actually are good ... generally are painfully aware of their own shortcomings

    Experience (and exposure to outside opinion, as touched on later in your writeup) improves one's capacity for self-assessment. It's often said that "the more one learns, the more one learns how much there is yet to be learned".

    This confuses us, because we think, "A causes B, so more A will result in more B." But more B causes a reaction that gets rid of B and our change shows up in some entirely unanticipated way.

    Like when I optimize my path to work, shaving 5-7 minutes off my commute. I should arrive that much earlier, giving myself more time to check the news, email, and browse PM. However, that time is actually realized as one more hit of the snooze button every morning. =]


    cp
    ----
    "Never be afraid to try something new. Remember, amateurs built the ark. Professionals built the Titanic."
Re: What you refuse to see, is your worst trap
by Abigail-II (Bishop) on Jun 30, 2003 at 11:43 UTC
    Given that the students paid 20 times as much didn't change their opinion as often, and hence had fewer problems with their egos, does that mean that if you pay a programmer 20 times as much, he will more readily admit bugs in his program?

    Abigail

      I doubt it, but I wouldn't mind my boss trying that experiment with me. :-)


      Impossible Robot
Re: What you refuse to see, is your worst trap
by toma (Vicar) on Jun 30, 2003 at 06:15 UTC
    Good story about the lizards. Whenever I have trouble finding a bug, I say to myself, "It's where you're not looking."

    One of the things I like most about California is that I have reptiles living around me. I've built a few lizard-friendly small rock formations. Today I was cooking outdoors and a small (harmless) snake came by for a visit.

    I had never considered these tiny beasts as a source of coding inspiration before!

    It should work perfectly the first time! - toma

      Good story about the lizards. Whenever I have trouble finding a bug, I say to myself, "It's where you're not looking."

      It's a pretty safe bet that the bug is inside the lizard. :)

Re: What you refuse to see, is your worst trap
by traveler (Parson) on Jun 30, 2003 at 15:55 UTC
    I agree wholeheartedly with this meditation, but wish to add a word of caution. I generally relish peer review and comments about coding, algorithms etc. However, in a recent project an "expert" from outside the team was called in. He looked over the project and said, "use this structure instead". When I objected that his structure had faults a, b and c, he said, "think of egoless programming". His point was that clearly I could not see the faults in my own design. In fact, he spoke out of his lack of understanding -- his design was mildly easier to implement, but provided significantly reduced functionality. Occasionally, disagreement may be based in reality and not just ego protection.

    We have to be cautious of this and seek not just one but a broad range of opinions. This is true not only when others disagree with us, but when they agree as well... We also have to be open to suggestions of alternatives.

    --traveler

      That "expert" is your "inferior," not your "peer."

      It happens often that a manager trying to seek outside opinion ends up with some phoney consultant.

      This could happen due to (but not limited to):
      • bad luck
      • "someone else's are better" mentality
      • just wanting to hear what one wants to hear
      • the "expert" being a friend of someone
      Once there was this consultant whose solution to every architectural problem was ASP and OOP. He even said OOP could be applied to SQL. (While I believe OO design can be applied more generally, twisting OOP into a non-OOP language is just plain weird.)

      And his answer to why he hadn't proposed any new architecture for our system was that we hadn't written up our business rules extensively enough.

      Another technical consultant was invited to tell us why the new MSSQL (it was 7 or 2000 or something?) was better (than whatever). It turned out to be an hour-long sales pitch. One of his arguments for why the new MSSQL was good was that MS spent billions on it. (If money guaranteed success, life would be a lot plainer.)
      This meditation tends to validate my philosophy of "Never profess to be an expert on anything", because chances are I'm probably not.

      It also provides an interesting perspective on what I have observed about myself since I started working as a programmer (previously I was an engineer) a few months ago, namely that my programming methods and skills have been lacking in many areas, but are improving. Now I wonder how many other bad practices I have that I am blissfully unaware of.

      I suppose as long as I continue to look back at my previously written code in disgust/dismay etc, then I at least have some chance of continuing to improve. Of course, peer reviews are also a good idea, although I don’t know if my fragile ego will cope. :)

        Peer review. I agree it's rough. I don't want you guys looking at my work. It's embarrassing. But how else to do it? Tough love but respect. Be nice to each other. Work together. I'm in NJ: P.O. Box 438 Convent Station, NJ 07961, Claire Coombs. Mail Fraud: that's why I got the box. Tap my phone, too. It's dangerous out there. We need to stick together.
Re: What you refuse to see, is your worst trap
by mr_mischief (Monsignor) on Jun 30, 2003 at 16:20 UTC
    The Linux development effort, Extreme Programming, and several other communities have borne this out without necessarily taking the same path of discovery. Peer review and letting a code snippet speak for its own quality are known to be good things. This gives us some additional insight into the reasons why.

    I actually think that there are other reasons why regular peer review of code is good, too. Sometimes, no matter how much ego you get out of the way, staring at the same problem too long just keeps you from finding the source of it. You look somewhere that looks like the source of the problem, build a conception in your head, and can't look anywhere else to find the bug. This particular problem happens even when you didn't write the code. Sometimes a fresh set of eyes is just what's needed to break you out of a rut.

    Christopher E. Stith

      Several XP discussions bring up the idea of "egoless programming". Weinberg's well-known in those circles.

      As far as XP goes, though, there's a whole lot more than just regular peer review. Pair programming is recommended for all code that lives longer than a day. The entire team owns the entire codebase; there are no little fiefdoms. Debugging's less important because the code is frighteningly well-tested. The test blindness is lessened, partly because of pair programming but mostly because of test-first development.

      XP's not perfect for everyone in every situation, but it takes into account many of these psychological issues.
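
      For concreteness, here is a minimal sketch of the test-first style mentioned above, in Perl with the core Test::More module (the trim() function and its expected behaviour are illustrative assumptions, not something from this thread): write the expectations first, watch them fail, then implement until they pass.

          #!/usr/bin/perl
          use strict;
          use warnings;
          use Test::More tests => 3;

          # The tests exist before (and drive) the implementation.
          is( trim("  hello  "), "hello", "strips surrounding whitespace" );
          is( trim("hello"),     "hello", "leaves clean input alone" );
          is( trim("   "),       "",      "whitespace-only becomes empty" );

          sub trim {
              my ($s) = @_;
              $s =~ s/^\s+//;   # remove leading whitespace
              $s =~ s/\s+$//;   # remove trailing whitespace
              return $s;
          }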

Re: What you refuse to see, is your worst trap
by parv (Parson) on Jun 30, 2003 at 08:06 UTC

    Except for the technique to catch lizards, that is one looong-winded way to repeat that a programmer should not be emotionally attached to his/her code, is not the best person to test a program, and that code should be reviewed by fellow programmers.

    That reminds me to repeat that one should code for somebody else in case the one coding gets hit by a bus sometime later...

    Edited (Jun 30 2003) to correct some grammar and the flow.

Re: What you refuse to see, is your worst trap
by demerphq (Chancellor) on Jul 01, 2003 at 18:26 UTC

    I'm not so sure I buy this "if you think you are good then you probably aren't". I've noticed that most people who fall into the "I think I'm good" camp often are the same people who fall into the "I can do no wrong, and even when I do there is _always_ a damn good reason" camp as well. And IMO it's the latter psychology that is really dangerous. Not being able to admit (to yourself) that you've done a stupid (forgive the juvenile term :-) is the biggest drawback to learning IMO. When you can't admit your mistakes you are doomed to repeat them over and over. Critical assessment is a crucial aspect of learning.

    For instance, I personally think I'm a pretty good Perl programmer. (For some such definition :-) But when I look at my code (from before today :-) I'm always thinking, "gosh that was dumb", "what on earth was I smoking there?", "oh jeeze, did I do that?" and the like. (Actually my most fervent desire when I review my old code is to completely rewrite it.) To me this is a positive sign. I've learned something between when I wrote the code and when I read the code that I'm now factoring into my assessment. However, I've worked with people who, when you ask "what were you doing here?", give you half an hour of BS about why they had great reasons to write buggy code that didn't work (or why they overwrote your newest source files by being careless with source control...). And the thing I've found is that next week they will have done the same thing again. And the week following... The point is, because they won't register the fact that they made an error, they won't register the solution to the error. It's impossible to learn NOT to do something when you have managed to convince yourself that it wasn't an error in the first place.

    Anyway, INAP, but I do think that there is a difference between the inability to admit you are wrong (or have done something stupid) and the type of ego that leads people to wander around telling everybody how good they are (and proving how bad they are with every extra word :-). The trouble, I think, is that virtually all of the latter are also the former. But occasionally you find someone who can't admit they are wrong but isn't an egomaniac. I think people like that are worse in some respects, because it's possible to develop a positive relationship with them and get fooled by their conduct for quite a while. It's only when you notice that _nothing_ is _ever_ their fault that you realize that you've been (essentially) conned. Egomaniacs, on the other hand, are readily identifiable, so it's much easier to be on your guard.

    Anyway, thanks for yet another insightful node. It'll be weeks before I get around to reading all the cool links you've provided. :-)


    ---
    demerphq

    <Elian> And I do take a kind of perverse pleasure in having an OO assembly language...
      My apologies for taking so long to get back to you on this. I have been offline for a bit.

      I think that your first paragraph has a nice piece of irony in it:

      I'm not so sure I buy this "if you think you are good then you probably aren't". I've noticed that most people who fall into the "I think I'm good" camp often are the same people who fall into the "I can do no wrong, and even when I do there is _always_ a damn good reason" camp as well. And IMO it's the latter psychology that is really dangerous. Not being able to admit (to yourself) that you've done a stupid (forgive the juvenile term :-) is the biggest drawback to learning IMO. When you can't admit your mistakes you are doomed to repeat them over and over. Critical assessment is a crucial aspect of learning.
      The first sentence says that you don't agree with my observation that, if you think you are good then you probably aren't. However, the second sentence goes on to say that, on the odds, if you are someone who thinks you are good, then you are probably someone whose feedback mechanism is broken. Which is what I said, except that I went into more detail on how the feedback mechanism gets broken.

      About the rest of what you said, it is a good point that it is unwise to associate what I was talking about with a single stereotypical pattern of external behaviour. Likewise I think that it is important not to tie it to any single pattern. A note in the silver edition about a chapter written 25 years earlier says that, Looking over the first edition, I see that I was already protecting myself from blame in the way programmers usually do -- by moving to the meta-level. This is definitely true for me. Rather than take pride in my code, I take pride in how well I learn. Which means that when I am confronted with ways in which I limit my ability to learn from others (for instance by causing common communication issues to come up repeatedly, eliminating opportunities to learn), it is hard for me to accept that the pattern repeats itself, and that my behaviour has something to do with it.

      In other words I suffer the same problem. Just metaed a level or two up. Which might be a better place to suffer the problem (or so I tell myself), but it still has limited me...

        your first paragraph has a nice piece of irony

        Well spotted. I hadn't noticed the inconsistency. I guess my desire to point out that not only "I am the best" types suffer from this "I can do no wrong" syndrome overruled my logical faculties enough to overlook the fact that if most of the former are also the latter, then your base point was valid. I should know better than to not think my reasoning through before I post. :-)

        Although I do think it's a mistake to focus on the bighead aspect; it's more productive to look at why people sometimes get themselves into a head space where they take no responsibility for their errors. I think that perhaps even attacking the latter syndrome ends up reducing the former. Actually, I think one of the links you posted said much the same thing (excellent reads, by the way; the Forbes one got printed out and given to some people at work - in my company the Level 5 analysis is particularly pertinent). (Teaching people more about a domain leads them to more accurately predict how much they don't know.)

        Once again many thanks for yet another stimulating node. Please keep posting!


        ---
        demerphq

        <Elian> And I do take a kind of perverse pleasure in having an OO assembly language...
Re: What you refuse to see, is your worst trap
by hsmyers (Canon) on Jun 30, 2003 at 16:09 UTC
    Looks to me like you have a clear choice: either move on to reading Weinberg's work(s) on general systems thinking---An Introduction to General Systems Thinking: Silver Anniversary Edition---or read Shunryu Suzuki's Zen Mind, Beginner's Mind! Seriously (well, that was, but only semi...), mapping out blind spots is difficult for obvious reasons; however, regardless of technique, most successful solutions require that the existence of said blind spots be admitted first!

    --hsm

    "Never try to teach a pig to sing...it wastes your time and it annoys the pig."
What you refuse to see, can be your blessing
by zby (Vicar) on Jun 30, 2003 at 12:19 UTC
Re: What you refuse to see, is your worst trap
by mercurywings (Acolyte) on Jun 30, 2003 at 15:38 UTC
    Very interesting reading - explains in very clear terms what I've been thinking all along.

    I've always approached things in life with the phrase 'no matter how good you may think you are, there is someone who is even better.' And in my experience, if a person falls to the lure of hubris and considers themselves the best, fate inevitably intervenes to show them the error of their ways.

But there is some value in ego
by error 404 (Initiate) on Jul 03, 2003 at 00:19 UTC
    Without ego attachment to one's work, there can be little pride or drive.

    Maybe that would be a good thing - without attachment to code, we might demand something else - money, for example - for the all-weekend coding sessions.

    Yah, what do I know - at this point nobody pays me to do anything but post old CDs on the web.

    Similarly, the lizard thing is probably a feature, not a bug, in all situations not involving very odd higher primates. Horses, I'm told, are known to suddenly perceive a rock or stump as a major threat, with unpleasant results.

Re: What you refuse to see, is your worst trap
by Anonymous Monk on Apr 29, 2004 at 20:51 UTC
    One thing which I think is good to clarify is, when you have a heated argument, what is your ego REALLY attached to? I find my own ego is rarely that closely attached to the code, but I still used to get in heated arguments. So what WAS my ego in? It was in the argument itself. I couldn't stand to lose. And yes, I have improved this, and haven't had such an argument in ages.

    However, there is an important distinction here. When my ego was in the argument, and not the thing being argued over, then I generally DID absorb what they had to say, and considered it rationally later, and often changed my own opinions/techniques/whatever to match. I'd often even KNOW that I was wrong partway through the initial argument, but my ego was so far invested in it that I'd refuse to admit it verbally.

    So, what's my point? Just because somebody doesn't seem to be listening, and is arguing a seemingly stupid point to death doesn't mean that, on some level, they aren't listening. The best thing to do, if the argument gets heated, is to walk away, and then approach them about it the next day. If it turns into a heated argument again, THEN you know there is a problem (although a problem with WHOM is hard to say).
Re: What you refuse to see, is your worst trap
by sailortailorson (Scribe) on Nov 05, 2004 at 19:52 UTC
    I suspect that ego has a truly valuable role that we do not always realize, because of the times that it gets in the way.

    Like everyone, I often fight personal dragons. For me, these take the form of self doubt, usually regarding personal appearance, or sometimes the fact that my thinking does not fit in with my peers. I am not particularly weird looking, or weird thinking. But, let me put it this way: I only get a dollar for my presentation to the outside world. So, like the essayists, I am quite willing to think that I need to change something in order to fit in.

    But ego, in the form of confidence in a particular idea, or in a track record of solving problems in creative ways, can help me to get through these problems and present a better argument for whatever thing I need to get done.

    I think the real problem here is that the tools we have bred into us, by the billion or so years of selection, are not always perfect for the tasks that we now encounter daily. Engineering and maintaining code are the problems we are talking about.

    We are crossing the bridge from gene-based behavior to meme-based behavior. We've been doing this for awhile now. Remember, before we lived in cities, or even villages, we sat around in caves. Many behaviors must have taken place that would get you arrested by the city police now, but in those days, they were decided by physical strength and cunning.

    Physical strength and cunning are still useful, but civilized people had to realize that not every problem was a nail that required those hammers. Laws substituted, first religious, then civil. Then came guilds and professional organizations, which taught ways of the profession, and craft to obviate the reliance on laws. Then there are organizations like companies that layer their own way of doing things.

    I suspect it could be argued that arguments peel back the layering of social nicety - a reverse recapitulation, a mirror of how we develop. In a well-formed argument (the worst kind you could imagine), one party starts perhaps by kidding another, then perhaps one questions the professional wisdom of another. Then somehow the issue of legality, then morality, enters in. At worst, it could become a sheer fight for life by one in response to the other. As bad as it sounds, those tactics get things done. If someone spits on your doorstep, you probably don't want to spend much time explaining to them why it is not a good idea.

    On the other hand, new ideas, I mean really new ideas, don't come from reliance on codified knowledge.

    I think the fundamental difference here is in how one reacts to the decision of whether to reach through the layers of mores, laws, brute force (let's call that reaching back) or reaching forward into new, uncharted territory. Reaching back only requires ego. Reaching back very far may require large, or combined egos.

    Reaching forward requires putting all those preconceived systems away momentarily, and modeling the new world lines in your mind. Let's call this true thought (I call it true to distinguish it from what often passes for thought, usually some kind of reaching back into professional knowledge or preconceived ideas. If it seems like I am sneaking in a subliminal direction, please call it some other kind of thought). In true thought, ego has no place, but as soon as the thinker begins to classify and codify any new insight, the thinker may need ego to have confidence in the idea and to communicate it before it is lost. Reaching forward, one must tread between egoless and ego states.

    I think that Tilly's focus on learning and meta behavior is good for uncovering more about reaching forward. But some ego is required in either direction. I think that code writing requires a lot of reaching forward, and a little reaching back. Other activities, like collaboration and code documentation, may require somewhat less reaching forward (but they still require it) and more reaching back.

    I suspect that wisdom is a synonym for not having to reach back very far.

    The way I look at it is that I should never stop striving for true thought, and I should try not to interfere with those who have lately succeeded in applying true thought to the benefit of me and other beings like me.

    Now, where's my dollar?
