*because if bits take up a finite volume (and the speed of light is constant),*

I really can't tell if you're trying to mock the value of the kinds of theoretical claims I made, or if you really think that the mass & volume of photons are a significant factor in algorithmic analysis.

You are right that there are a lot of factors when dealing with physical machines instead of theoretical ones. But on my computer, the amount of physical RAM does not depend on the input size of the algorithms I'm running. And in my exercise, we are fixing two computers & two competing algorithms, and varying only the input sizes to those algorithms. So I prefer to treat memory lookup time as a constant.

The point of my previous reply is that if I upgrade computers, then memory access time, CPU cycle time, etc., are each smaller constants, but still constants. Eventually, as the input sizes increase, the algorithm with the best asymptotic performance will win. And in the case of a reasonable O(1) or even O(log n) algorithm (where "reasonable" means the constants are not insanely large) on a slow machine vs a reasonable O(n) algorithm on a fast machine, the better algorithm starts winning perhaps sooner than you'd expect.
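To make that concrete, here's a minimal sketch in Perl. The cost constants are made-up assumptions (a slow machine 100x worse per operation than a fast one), and the subs slow_logn and fast_n are hypothetical cost models, not real benchmarks, but they show where the crossover lands:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Illustrative per-operation costs (assumptions, not measurements):
    # the slow machine pays 100x more per operation than the fast one.
    my $slow_op = 100;   # cost of one operation on the slow machine
    my $fast_op = 1;     # cost of one operation on the fast machine

    # O(log n) algorithm on the slow machine vs O(n) on the fast machine.
    sub slow_logn { my $n = shift; return $slow_op * (log($n) / log(2)) }
    sub fast_n    { my $n = shift; return $fast_op * $n }

    for my $exp (1 .. 7) {
        my $n = 10 ** $exp;
        printf "n = %-10d  slow O(log n): %10.0f  fast O(n): %10.0f  winner: %s\n",
            $n, slow_logn($n), fast_n($n),
            slow_logn($n) < fast_n($n) ? "slow machine" : "fast machine";
    }

Even with a 100x constant-factor handicap, the logarithmic algorithm on the slow machine already overtakes the linear one around n = 1000 in this toy model.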
