PerlMonks 
When you say that something is O(N) or O(N^2) or whatever, you are saying that as N changes, the resource in question (normally time or memory) changes with that relation to it. So if something is O(N) and N doubles, then the time taken doubles. What this ignores is that the time taken is actually c*N or k*1, and that for different algorithms the constants will be different.

So the if/elsif/elsif/elsif/else chain actually takes c*N seconds, and the hash lookup plus subroutine dispatch takes k*1 seconds. I'm not at all surprised that for small N, c*N is smaller than k: the chain involves very few calculations, so its constant term is a *lot* less than that of the hash lookup.

I shall now proceed to pull some random numbers out of my arse. Calculating a hash takes on the order of a few hundred machine cycles (let's say a thousand). Finding the right place in the if/elsif/elsif/else chain will take more like a few tens of cycles per test. In fact, if the compiler optimises really well, checking and skipping each condition will take just two machine cycles; let's call it ten because this is perl, not C. If we also assume that on average the chain has to check N/2 conditions, then it costs 10 * N/2 = 5N cycles, and the hash would only win for N > 200. That's ignoring, of course, any extra overhead from setting up both cases in the first place.

In reply to Re: elsif chain vs. dispatch
by DrHyde
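
A minimal sketch of the two dispatch styles being compared, with a made-up set of opcode names purely for illustration (the real break-even point depends on the constants discussed above, which you would measure with the standard Benchmark module rather than trust to guesswork):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Style 1: an if/elsif chain. Cost grows with the number of
# branches checked before a match -- roughly c*N.
sub via_chain {
    my ($op) = @_;
    if    ($op eq 'add') { return 2 + 3 }
    elsif ($op eq 'sub') { return 2 - 3 }
    elsif ($op eq 'mul') { return 2 * 3 }
    else                 { return undef }
}

# Style 2: a hash of code refs. One hash lookup plus one sub
# call regardless of how many entries exist -- roughly k*1,
# but k is a bigger constant than one string comparison.
my %dispatch = (
    add => sub { 2 + 3 },
    sub => sub { 2 - 3 },
    mul => sub { 2 * 3 },
);

sub via_hash {
    my ($op) = @_;
    my $code = $dispatch{$op} or return undef;
    return $code->();
}

print via_chain('mul'), "\n";   # 6
print via_hash('mul'),  "\n";   # 6
```

To see where the crossover actually lies on your hardware, wrap both calls in `cmpthese` from the Benchmark module and vary the number of branches.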

