I really don't want to beat this thread into the ground, but...
*It is reasonable to assume that some inputs will vary from 1 to very large. I do not agree with you that it is reasonable to assume that the input numbers will vary that way in this case. There will be some interesting size of number, and not much variation around it.*
My point is not that one *must always* consider input sizes from 1 to infinity. It's completely reasonable to only be interested in finding the most efficient algorithm for a specific range of input sizes. But you originally said that in such a case, the running time of a polynomial algorithm would be constant, as if that were a meaningful statement. With inputs restricted to a fixed interval, *any function is big-O of any other function* (the only exceptions involve functions that are zero somewhere on the interval, which running times never are), so big-O language is meaningless there. (Although you didn't explicitly mention big-O, it is implied when you say that the running time is "constant." If this isn't what you meant, then fine.)
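To make the finite-interval claim concrete, here is a minimal sketch (Python, with two made-up cost functions standing in for running times). On a bounded domain you can always take the constant C to be the largest ratio of the two functions, so each one is "big-O" of the other and the notation distinguishes nothing:

```python
# Hypothetical cost functions standing in for running times.
def f(n):          # "quadratic" cost model
    return n * n

def g(n):          # "linear" cost model
    return n

# Restrict inputs to a fixed interval, say 1..1000.
domain = range(1, 1001)

# On a finite domain, f = O(g) trivially: take C = max f(n)/g(n).
C = max(f(n) / g(n) for n in domain)
assert all(f(n) <= C * g(n) for n in domain)    # f is "O(g)" here

# And symmetrically g = O(f), with its own constant.
C2 = max(g(n) / f(n) for n in domain)
assert all(g(n) <= C2 * f(n) for n in domain)   # g is "O(f)" too
```

The same construction works for any pair of positive functions on any finite set of inputs, which is exactly why the comparison carries no information once the limit is gone.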
If one is really only interested in a specific range of inputs (which is fine), one should use actual running-time measurements for comparison (which are comparable and therefore meaningful), not asymptotics (which all become equivalent once they are no longer considered in the limit).
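Measuring at the one input size you actually care about might look like the following sketch; the two search routines are hypothetical stand-ins for whatever algorithms are being compared, and `timeit` does the measurement:

```python
import bisect
import timeit

# Hypothetical candidate implementations; substitute the real
# algorithms under comparison.
def linear_scan(data, target):
    for i, x in enumerate(data):
        if x == target:
            return i
    return -1

def binary_search(data, target):
    i = bisect.bisect_left(data, target)
    return i if i < len(data) and data[i] == target else -1

# Measure at the specific input size of interest, not "in the limit".
n = 10_000
data = list(range(n))
for fn in (linear_scan, binary_search):
    t = timeit.timeit(lambda: fn(data, n - 1), number=100)
    print(f"{fn.__name__}: {t:.4f}s for 100 runs at n={n}")
```

Numbers like these are directly comparable for the workload at hand, which is the point: at a fixed size, the measurement settles the question that asymptotics cannot.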