Being the average sophomore high school student, I am taking a course in Algebra II (okay, I want to brag somewhat: Algebra II Honors). Anyway, my mom is a high school math teacher, and she often talks to me about how her math department doesn't care about proper notation. She finds that notation is a very important aspect of math, and I tend to agree. Just recently I noticed my teacher making some notation mistakes during her lectures, having to do with disjunctions and solving equations.

Anyway, where's the programming part? I feel that part of my sense of the importance of math notation comes from the fact that I program (somewhat), in addition to the fact that my mom promotes it a lot. :-) In programming it is very important to have correct notation (if that is the word to use), or coding. The code must be exact, with few variations allowed in each piece, if you want to accomplish the same thing the same way. For example, when using functions (built-in or your own), you sometimes have the option of using parens or not, but that is about as much as the code can vary. Other examples are variable names and spaces between parameters, operators, etc. But ultimately, the code needs to be written a certain way to achieve your goal properly.
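To make the parens point concrete, here is a minimal Perl sketch (the function name `double` is just for illustration). The surface notation can vary a little, but the structure itself cannot:

```perl
use strict;
use warnings;

# A tiny function of our own, defined before it is called so that
# the parenthesis-free call style below works.
sub double { return 2 * $_[0] }

# All of these calls mean exactly the same thing -- parentheses
# and spacing are flexible in Perl...
my $a = double(21);
my $b = double 21;
my $c = double( 21 );

# ...but the structure is not: drop a sigil, a semicolon, or an
# operator and the program will not even compile.
print "$a $b $c\n";    # prints "42 42 42"
```

So the freedom is only skin-deep: a handful of equivalent spellings, all mapping onto one exact underlying form.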

Has anyone else experienced something like this? Do you disagree with me on my analysis? Please respond!!

Zenon Zabinski | zdog |

Update: A little clarification of my point: in programming certain things *need* to be done a certain way, and I find that that mentality transfers over for me to math, where I find that certain things also need to be done a certain way, but people often don't, and that is what I am writing about...

(redmist) RE: Beyond programming...
by redmist (Deacon) on Oct 14, 2000 at 07:00 UTC
    I think that whatever form of notation you choose to use, you should stick with it. Whether you use parentheses with open() or not, or like spacing a certain way, or indent a certain way, be consistent. I know that in order to keep my own mental space clear of confusion, I really need to do things the same way every time, in my own style. If I do not, I start to wonder why I did something differently in one place than in another when I am trying to accomplish the same thing.

      I think that while you should be consistent in your code, you should also understand what possible variations exist: whether an operation can be overloaded, or called with empty parens, or with one variable or n variables. If the operation creates a new Object, what is the constructor? If the Object has more than one constructor, which constructor is more efficient? This way you can understand other people's code much more easily. Being able to understand other people's code is very important, because you will no doubt have to read other people's code. When I was taking AI this semester, we had to use a set of classes provided by our professor and enhance them for predicate calculus. Unfortunately, his code followed his own random syntax instead of the syntax he had been teaching in lecture and in the book. If you weren't able to understand the code, you were basically up a creek. So comprehension is a huge factor, as is being consistent in what you code, so that others who have to read large amounts of your code will be able to understand it better.


      Everyone has their demons....
        Well, I don't know about object oriented programming, because I come from procedural programming. Perhaps there are reasons in OO programming to mix up style a bit...but I wouldn't know. But I *am* sure that no matter what kind of programming you are into, comprehension is the foremost factor when considering style (IMHO).

RE (tilly) 1: Beyond programming...
by tilly (Archbishop) on Oct 16, 2000 at 18:12 UTC
    You look at math through programming-colored eyes.

    Well I look at programming through math-colored eyes. :-)

    Not to disrespect your mother, but an important part of becoming good at math is the realization that notation doesn't matter. (-:Another is realizing that it does.:-) One of the best tricks I have seen for getting people to understand this is doing calculus using drawings of umbrellas and beach-balls for your variables.

    The point is that notation may be how we talk about ideas, but it should not be confused with those ideas. Indeed, the "big" math discoveries tend to be either new fundamental ideas, or else discoveries that two fields which use different words are really talking about the same thing, along with a way to unify our understanding of them.

    So getting past confusing notation for ideas is key to becoming good at math. And programming as well! For instance, on my home node there is a link to a discussion by Linus Torvalds on how to use macros instead of #ifdefs to make portable code easier to maintain. Well, Perl doesn't have #ifdef, but if you understand the idea you can see how to create an abstraction layer of functions you call which (behind the scenes, of course) will load platform-specific behaviour. So even though we don't have the construct Linus talked about, we can separate out the idea he is getting at (how to get all of your portability assumptions into one place) from the notation (#ifdef) and benefit.
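    A minimal Perl sketch of that abstraction-layer idea might look like this. The function name `temp_dir` and the per-platform values are hypothetical; the point is only that every platform assumption lives in one place, and the rest of the program never mentions the OS:

```perl
use strict;
use warnings;

# All of the portability assumptions are gathered into this one
# table, keyed by Perl's built-in operating-system name $^O.
my %temp_dir_for = (
    MSWin32 => 'C:\\Temp',
    linux   => '/tmp',
    darwin  => '/tmp',
);

# The rest of the program calls temp_dir() and never needs to
# know which platform it is running on.
sub temp_dir {
    return $temp_dir_for{$^O} || '/tmp';
}

print "Temporary files go in ", temp_dir(), "\n";
```

    The #ifdef notation is gone, but the idea Linus was getting at -- one place to look for every platform assumption -- survives intact.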

    The idea is what matters.

    OTOH notation also matters. Why? Because in the end you are talking to a human. So while your understanding should not be tied to (or confused with) the notation, you need to know how to say things in a way that others (and you!) will understand. And for that the notation matters a whole lot.

    Whew, a bit long but hopefully it will help you clarify your thoughts on the matter.