in reply to A short meditation about hash search performance
You appear to have a linguistic confusion about big-O notation, compounded by a common misconception of what O(1) means.
A function f(n) is O(1) if there are constants K and N such that for all n>N, f(n)<K. Plenty of functions other than straightforward constants meet that definition.
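To make that definition concrete, here is a small sketch of my own (in Python rather than Perl, purely for illustration) of a non-constant function that still satisfies the definition of O(1):

```python
import math

def f(n):
    # f(n) = 3 + sin(n) oscillates forever and is not a constant,
    # yet it is O(1): pick K = 5 and N = 0, and f(n) < K for all n > N.
    return 3 + math.sin(n)

# Check the bound empirically over a large range.
assert all(f(n) < 5 for n in range(1, 100_000))
```

The same reasoning is what lets us call a hash lookup O(1) even though individual lookups take varying amounts of time.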
An algorithm is not big-O of anything; only functions are. When it comes to hashes, the following three statements can all be correct:
* The average hash search performance is O(1).
* The worst case hash search performance is O(n).
* The average hash search performance is O(n).
The first statement is why people use hashes. On average (and usually we are in the average case) they are fast. The second is what you were pointing out as a correction explaining why the first is wrong. It doesn't correct it; it is an entirely distinct point. The third statement is true because big-O notation is only about the existence of upper bounds. I point it out to note that the common use of big-O notation among hackers is distinctly different from what you will find in Knuth and other official references.
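A toy example can show how the first and second statements hold at once. The sketch below is my own illustration in Python (not Perl's actual hash implementation): a minimal separate-chaining hash table where a reasonable hash function keeps chains short (average O(1)), while a degenerate hash function that sends every key to one bucket forces a walk of nearly the whole table (worst case O(n)):

```python
class ChainedHash:
    """A minimal separate-chaining hash table (illustrative sketch only)."""

    def __init__(self, nbuckets=64, hashfn=hash):
        self.buckets = [[] for _ in range(nbuckets)]
        self.hashfn = hashfn

    def insert(self, key, value):
        self.buckets[self.hashfn(key) % len(self.buckets)].append((key, value))

    def probes(self, key):
        """Return how many entries are examined to find `key`."""
        bucket = self.buckets[self.hashfn(key) % len(self.buckets)]
        for i, (k, _) in enumerate(bucket, start=1):
            if k == key:
                return i
        return len(bucket)

# With a decent hash, keys spread across buckets and chains stay short.
good = ChainedHash()
for i in range(1000):
    good.insert(i, i)

# With a degenerate hash, every key collides into bucket 0.
bad = ChainedHash(hashfn=lambda k: 0)
for i in range(1000):
    bad.insert(i, i)

print(good.probes(999))  # a handful of entries in one short chain
print(bad.probes(999))   # walks nearly all 1000 entries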
I should note that, technically speaking, Perl's hash performance isn't big-O of anything. Perl's hashing algorithms break down for datasets that do not fit in memory and cannot be addressed by 32- or 64-bit pointers (depending on the platform). I would have to look, but I think that the hash function won't scale beyond a billion or so buckets. Good luck finding someone who cares, though.
Silly trivia: following a pointer is not really O(1). The amount of work depends on how long the pointer is, and therefore on the size of the possible address space. People only notice this when they are on a platform where they can choose between two sizes of data representation, like 16- vs 32-bit, or 32- vs 64-bit. Going to the larger size brings with it a necessary speed hit.
Re^2: A short meditation about hash search performance by Anonymous Monk on Sep 09, 2007 at 23:13 UTC 
Man, you were so harsh. I think he meant that the functions behind the algorithms show O(log2(n)) asymptotic performance. Big-O notation has many uses in other fields, but in computing it is used with its basic meaning: to remark on the asymptotic character of the enclosed expression for the function being referred to.
Man, it seemed you were dying to pull the book off the shelf and spit out some theoretical speech. I think you got the point the first time, didn't you?
Peace

There are so many things to respond to here that I have no idea where to begin. So I'll list them randomly:
* Where do you get the O(log2(n)) from? With a flat memory model, hashing algorithms are not O(log2(n)).
* Reread my post and you'll see that I explicitly acknowledge that hackers do not use big-O notation exactly the way Knuth did. Why do you think that I said otherwise?
* Reread the root post and you'll discover that pg misunderstood big-O notation both as used by Knuth and as used by hackers. As far as most people are concerned, a hash lookup is O(1). That he thought otherwise was due to a misunderstanding on his part about what big-O notation means.
* Reread the root post and you'll find lots of incorrect attempted pedantry. When someone tries to get pedantic, I think it is fair and reasonable to be pedantic back.
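For contrast, here is the kind of algorithm an O(log2(n)) claim would actually describe. This is a sketch of my own in Python: a binary search over sorted data, whose step count grows with log2 of the input size, unlike a hash lookup in a flat memory model:

```python
def binary_search_steps(sorted_list, target):
    """Count loop iterations a plain binary search needs to find `target`."""
    lo, hi, steps = 0, len(sorted_list), 0
    while lo < hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return steps
        if sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return steps  # target absent; still took about log2(n) steps

# Doubling the data size adds roughly one step: classic O(log2(n)) growth.
# For 2**k elements, finding the last element takes exactly k steps here.
for size in (1 << 10, 1 << 20):
    data = list(range(size))
    print(size, binary_search_steps(data, size - 1))
```

A thousandfold increase in data size only roughly doubles the step count, which is the logarithmic behavior the earlier reply attributed, incorrectly, to hash lookups.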
