in reply to CPU cycles DO NOT MATTER!
My thoughts on this mostly echo Kyle's, but there is another point to bring up: significance.
Benchmarks with tiny differences often aren't even significant. That is to say, the 3 millionths of a second you gain in your example might not just be tiny; it might not reliably exist at all. I'm searching for, but cannot find, an old node here that showed that trivial and seemingly unrelated changes to the source bumped the results around by a few percent.
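One way to sanity-check this yourself: run the benchmark several times and look at the spread between runs before trusting any single number. Here's a rough sketch (the two candidate functions are just made-up examples, not from the original post):

```python
import timeit

def bench(fn, repeats=5, number=100_000):
    """Time fn several times and report the min..max spread.

    If the spread between repeats is about as big as the gap
    between two candidates, the "win" may just be noise.
    """
    times = timeit.repeat(fn, repeat=repeats, number=number)
    return min(times), max(times)

# Two hypothetical candidates that do the same small job.
def with_join():
    return "".join(["a", "b", "c"])

def with_concat():
    return "a" + "b" + "c"

lo1, hi1 = bench(with_join)
lo2, hi2 = bench(with_concat)
print(f"join:   {lo1:.4f}..{hi1:.4f}s")
print(f"concat: {lo2:.4f}..{hi2:.4f}s")
```

If the two ranges overlap, claiming one version is "faster" is on shaky ground.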
As far as the original point goes, CPU cycles almost never matter. But there do exist cases where swapping one algorithm for another can offer you a big speed difference in a place where it actually matters.
I think my own optimization decision making gets summed up pretty well by the following:
- Is it slow?
- Should I care that it is slow?
- Is there an obvious fix? (E.g. am I accidentally iterating over a whole data set when I could drop out on the first success or failure?)
- Will new, faster hardware be in place by the time I finish this thing, anyway?
- Can I buy the problem away by throwing hardware at it?
- Okay, guess it is time to optimize.
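To illustrate the "obvious fix" question above, here's a sketch of the accidental-full-iteration pattern and its early-exit cousin (the function names and data are invented for the example):

```python
def has_match_slow(records, target):
    """Walks the entire list even after a match is found."""
    found = False
    for r in records:
        if r == target:
            found = True   # keeps looping anyway
    return found

def has_match_fast(records, target):
    """Drops out on the first success."""
    for r in records:
        if r == target:
            return True
    return False
```

Both return the same answer, but on a large data set where the target appears early, the second one does a fraction of the work, and the change takes seconds to make.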