in reply to What??? You wanna learn math?
Just a quick note on O(n), the so-called "Big-O notation" used by computer
science folks. It's not really "math" [1] so much as it is
an approximation. You can't get an exact number out of it,
per se, and it isn't really something you can do algebra
on in a direct sense. Quoting from that link:
Definition: A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n,
which is usually the number of items. Informally, saying some equation f(n) = O(g(n)) means it is less than some constant multiple
of g(n). More formally it means there are positive constants c and k, such that 0 <= f(n) <= cg(n) for all n >= k. The values of c and k
must be fixed for the function f and must not depend on n.
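To see what those constants c and k are doing, here's a quick numeric spot-check (my own sketch in Python, not part of the original post): for the hypothetical f(n) = 3n + 10 and g(n) = n, the witnesses c = 4 and k = 10 show that f(n) = O(g(n)).

```python
# Spot-check the formal definition: f(n) = O(g(n)) iff there are
# positive constants c and k with 0 <= f(n) <= c*g(n) for all n >= k.
def is_bounded(f, g, c, k, n_max=10_000):
    """Check the Big-O inequality for every n from k to n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(k, n_max + 1))

f = lambda n: 3 * n + 10   # a function that is O(n)
g = lambda n: n

# c = 4 and k = 10 work, since 3n + 10 <= 4n whenever n >= 10.
print(is_bounded(f, g, c=4, k=10))   # True
# c = 3 can never work: 3n + 10 <= 3n has no solutions.
print(is_bounded(f, g, c=3, k=10))   # False
```

Note that it's only a spot-check over a finite range, not a proof; the real definition quantifies over all n >= k.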
A first-year computer science textbook is certainly in order
if you want to discover some of the theory behind the way
algorithms work. The classic White Book, Introduction to Algorithms, is a
popular choice of professors and students alike. One of the
authors is Ronald L. Rivest, the "R" in RSA encryption.
It introduces you to a variety of different algorithms
used to find substrings, sort lists, and navigate networks.
Some of this is redundant considering Perl does a lot of
this for you, but understanding the theory behind it, and
the work you are making Perl do, can help you write better
code.
The other tool that would be handy is a graphing calculator
program, or even Excel, where you could familiarize yourself
with the shape of various functions such as log(n).
In computer science terms, these shapes are used to approximate
how a function will scale, or how long it will take to run
given a dataset of size n.
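If you don't have a graphing calculator or Excel handy, a few lines of Python (my own sketch, not from the post) will print the same shapes as a table, which makes the relative growth easy to eyeball:

```python
import math

# Tabulate common growth functions as n doubles, so you can see
# log2(n) creeping, n doubling, and n^2 quadrupling at each step.
print(f"{'n':>6} {'log2(n)':>8} {'n*log2(n)':>10} {'n^2':>10}")
for n in (10, 20, 40, 80, 160):
    print(f"{n:>6} {math.log2(n):>8.2f} {n * math.log2(n):>10.1f} {n * n:>10}")
```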
Here's a quick rundown on some popular types:
* O(n): As the data doubles in size, the algorithm takes twice as long to run.
* O(n^2): As the data doubles in size, the algorithm takes four times as long to run.
* O(n^3): As the data doubles in size, the algorithm takes eight times as long to run.
* O(log n): As the data size increases, the amount of time the algorithm takes to run increases, but increases less and less as the size increases more and more.
* O(e^n): As the data size increases, the amount of time the algorithm takes to run increases exponentially, becoming very long very quickly.
For example, if you had a sort function which runs in O(n)
time, it is fairly efficient: if you try to sort 20
things, it only takes about twice as long as sorting 10.
If you had another sort function that ran in O(n^3)
time, sorting 20 things would take about eight times as long as sorting 10.
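To make that doubling behavior concrete, here's a rough illustration (my own, not from the post) that counts the basic steps of a single loop versus a triply nested loop as the input size doubles from 10 to 20:

```python
def linear_steps(n):
    """O(n): one basic step per item."""
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def cubic_steps(n):
    """O(n^3): one basic step for every triple of items."""
    steps = 0
    for _ in range(n):
        for _ in range(n):
            for _ in range(n):
                steps += 1
    return steps

# Doubling n from 10 to 20: linear work doubles, cubic work grows 8x.
print(linear_steps(20) // linear_steps(10))   # 2
print(cubic_steps(20) // cubic_steps(10))     # 8
```

The step counters stand in for "time" here; real timings would be noisier but show the same ratios once n is large enough.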
1. Okay, so maybe adamsj is correct in saying that it
is math, but I was merely trying to suggest that it's
not, well, mathy math, which is another way of saying
that understanding O(n) is not quite as hard as, say,
learning multidimensional calculus. I would have to agree
that O(n) notation is math inasmuch as it uses math to
express, not surprisingly, mathematically how these
functions are expected to run.
Re: BigO Notation by adamsj (Hermit) on Jul 05, 2001 at 22:10 UTC 
As a former bookseller and math student, I want to second your nomination of Introduction to Algorithms (I wish my copy weren't hundreds of miles away) and also want to quibble with what you said about O(f(n)).
It is math. [1]
A lot of interesting math (especially math of interest to computing science) is done through approximation rather than turning out exact answers. I, too, used to look down on that as not real math, but I was wrong.
Now that that's out of my system, here's a little more good advice from my bookseller self:
Don't buy a current used textbook. Instead, look for an old edition. Trust me on this: little has changed in calculus and trig in the last century or so, and old texts are worthless to the college bookstore. If they have any, they probably have them out on the dollar-a-sack sale table.
If they don't have them out there, ask the textbook manager or try a used non-textbook store, and don't pay over five bucks (ten at the most), or ask around for a freebie at the math department at the local colleges.
There is a very good chance that your math teacher can't help you. Mine couldn't (but he was a good track coach, I'm told) when I took on calculus in tenth grade. What he did do for me (since he couldn't explain how the Chain Rule was derived) was tell me that if I'd sit in class and quietly study calculus, he'd grade me just on my test scores (i.e., no algebra homework). That I wasn't supposed to bug him with calculus questions went unsaid.
This doesn't mean that your teacher won't be able to help you directly with the material (more likely she can than can't, but it's not a slam dunk), but it does mean that she can surely help you in some manner. However, the degree of help you get in this optional exercise is more than directly proportional to the politeness and respect with which you approach her. Only a saint (and I've benefited from a couple) will help a hateful student.
Let's see, what else? Oh, yeah: for an introductory text on discrete math, I like Discrete Mathematics by Richard Johnsonbaugh. It's in its fourth edition, so look for an old edition, and again, don't go over ten bucks.
Good luck, and have fun!
adamsj
"They laughed at Joan of Arc, but she went right ahead and built it." -- Gracie Allen
1. I'm jealous of tadman, since, while I would agree with him that the ideas in the various O-notations are simpler than the ideas in multivariable calculus, I found them harder to learn and prove.
You might check out Concrete Mathematics: A Foundation for Computer Science by Graham, Knuth and Patashnik.
From the fatbrain page: Concrete mathematics is a blending of continuous and discrete mathematics. "More concretely," the authors explain, "it is the controlled manipulation of mathematical formulas, using a collection of techniques for solving problems." The subject matter is primarily an expansion of the Mathematical Preliminaries section in Knuth's classic Art of Computer Programming, but the style of presentation is more leisurely, and individual topics are covered more deeply.
Re: BigO Notation by Nitsuj (Hermit) on Jul 06, 2001 at 07:49 UTC 
Eh, I'll add my $.02 (hehe).
I think that the best math in computer science is the REALLY "outlandish" stuff that you'll NEVER see in other fields. I love toying with Automata and Computation Theory problems. Nothing like a Turing Machine to make your day complete. And that is as MATHY as it gets, though it certainly doesn't feel like math did in high school. I didn't love math so much in HS... not because of the teachers and all, but because I was taking the area of triangles just a bit more often than I care to think of.
Just Another Perl Backpacker

