|Keep It Simple, Stupid|
You look at math through programming-colored eyes.
Well I look at programming through math-colored eyes. :-)
Not to disrespect your mother, but an important part of becoming good at math is the realization that notation doesn't matter. (-:Another is realizing that it does.:-) One of the best tricks I have seen for getting people to understand this is doing calculus using drawings of umbrellas and beach-balls for your variables.
The point is that notation may be how we talk about ideas, but it should not be confused with those ideas. Indeed, the "big" math discoveries tend either to be new fundamental ideas, or else the realization that two areas, even though they use different words, are really talking about the same thing, together with a way to unify our understanding of them.
So getting past confusing notation for ideas is key to becoming good at math. And at programming as well! For instance, on my home node there is a link to a discussion by Linus Torvalds on how to use macros instead of #ifdefs to make portable code easier to maintain. Perl doesn't have #ifdef, but if you understand the idea you can see how to create an abstraction layer of functions which (behind the scenes, of course) load platform-specific behaviour. So even though we don't have the construct Linus talked about, we can separate the idea he is getting at (getting all of your portability assumptions into one place) from the notation (#ifdef) and still benefit.
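To make that concrete, here is a minimal sketch of such an abstraction layer in Perl. The function names and the dispatch table are my own invention for illustration, not anything from Linus's discussion; the one real piece of Perl it leans on is the built-in `$^O` variable, which holds the operating system name. Callers just call `platform_tmp_dir()` and never check the platform themselves, so all the portability assumptions live in one place:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical dispatch table: one entry per platform we know about,
# plus a fallback. All platform checks live here and nowhere else.
my %impl = (
    MSWin32 => {
        tmp_dir => sub { $ENV{TEMP} || 'C:\\Temp' },
    },
    default => {
        tmp_dir => sub { $ENV{TMPDIR} || '/tmp' },
    },
);

# The only function callers see; it picks the right behaviour
# behind the scenes based on $^O (the OS name Perl was built for).
sub platform_tmp_dir {
    my $table = $impl{$^O} || $impl{default};
    return $table->{tmp_dir}->();
}

print "Temp dir: ", platform_tmp_dir(), "\n";
```

Where a C project would sprinkle #ifdef WIN32 through every file, here the equivalent knowledge is confined to the `%impl` table; adding a platform means adding one entry, not auditing the whole codebase.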
The idea is what matters.
OTOH, notation also matters. Why? Because in the end you are talking to a human. So while your understanding should not be tied to (or confused with) the notation, you need to know how to say things in a way that others (and you!) will understand. And for that, the notation matters a whole lot.
Whew, a bit long but hopefully it will help you clarify your thoughts on the matter.