http://www.perlmonks.org?node_id=760427


in reply to Re^3: elsif chain vs. dispatch
in thread elsif chain vs. dispatch

Oops. I think I screwed up here and pushed the create/update button at the wrong level. Lots of what is below is redundant. A goof...

I don't know what is new in Perl 5.8.3 regarding re-sizing algorithms based upon the buckets used, but if you are curious as to what is happening, the scalar value of a hash, e.g. my $x = %hash;, returns a string like "(10/1024)" showing the number of buckets used / total buckets.
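
As a minimal sketch (assuming a pre-5.26 perl, where a hash in scalar context reports bucket usage rather than the key count):

    my %h = map { $_ => 1 } 1 .. 10;
    my $usage = %h;                  # e.g. "8/16" -- buckets used / total buckets
    print "bucket usage: $usage\n";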

To pre-size a hash or force it to grow, assign a scalar to keys %hash, e.g.: keys(%hash) = 8192;.
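
A hedged sketch of pre-sizing; the requested size is rounded up to a power of two, so asking for 8192 buckets gets exactly 8192 here:

    my %big;
    keys(%big) = 8192;               # pre-allocate buckets before a bulk insert
    $big{$_} = $_ * $_ for 1 .. 5000;
    print scalar(%big), "\n";        # on pre-5.26 perls, prints used/8192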

The Perl hash algorithm is:

    /* of course C code */
    int i = klen;                 /* klen: length of the key          */
    unsigned int hash = 0;
    char *s = key;                /* key:  pointer to the key's bytes */
    while (i--)
        hash = hash * 33 + *s++;
Perl masks the above value down to the number of bits that matches the hash array size, which in Perl is always a power of 2. As mentioned above, the "(10/1024)" style string shows the number of "buckets" used and the total number of "buckets". There is another value, xhv_keys, accessible to the Perl "guts", that contains the total number of hash entries.
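
Just to illustrate, here is a hedged Perl re-implementation of the same computation (the real thing lives in C inside perl; bucket_for is a made-up name for illustration only):

    # The "times 33" loop, then masking down to a power-of-two bucket count.
    sub bucket_for {
        my ($key, $num_buckets) = @_;    # $num_buckets must be a power of 2
        my $hash = 0;
        for my $byte (unpack 'C*', $key) {
            $hash = ($hash * 33 + $byte) & 0xFFFF_FFFF;   # emulate unsigned int
        }
        return $hash & ($num_buckets - 1);                # low bits pick the bucket
    }

    print bucket_for('larry', 1024), "\n";   # some index in 0 .. 1023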

If the total number of entries exceeds the number of buckets, Perl will increase the hash size by one more bit and re-map all of the existing keys into the new, larger bucket array.

So let's say that we have a hash with 8 buckets and, for some reason, only one of those buckets is being used. When the ninth entry shows up, Perl will see 9 > 8 and will re-size the hash by adding one more bit to the hash key. In practice, this algorithm appears to work pretty well. I guess there are some improvements in Perl 5.8.3 and later.
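
A hedged demo of that growth (again assuming a pre-5.26 perl for the scalar-context output; on newer perls a hash in scalar context just returns the key count):

    my %h;
    for my $n (1 .. 12) {
        $h{"key$n"} = $n;
        printf "%2d entries -> %s\n", $n, scalar(%h);   # watch the total buckets double
    }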

Anyway, I often work with hashes of, say, 100,000 things and haven't seen the need yet to override the Perl hash algorithm. However, this does appear to point out a pitfall in pre-sizing a hash. Perl starts a hash with 8 "buckets". If you start it yourself with, say, 128 buckets, it is possible to wind up with a lot more entries chained into a single bucket than if you let Perl just grow the hash on its own.

Update: I would add that I haven't found much performance difference between just letting Perl do its hash thing and pre-sizing a hash. The hash key computation (which, as shown above, is actually very efficient) tends to get dwarfed by the input effort to read in the, say, 100,000 keys and the computation required on those keys!
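
If you want to check this on your own data, a hedged Benchmark sketch along these lines is how I would compare the two (the numbers will vary with perl version and machine; this just shows the shape of the test, not a definitive result):

    use Benchmark qw(cmpthese);

    my @keys = map { "key$_" } 1 .. 100_000;

    cmpthese(-2, {
        auto_grow => sub {
            my %h;
            $h{$_} = 1 for @keys;
        },
        pre_sized => sub {
            my %h;
            keys(%h) = 131_072;          # next power of two above 100_000
            $h{$_} = 1 for @keys;
        },
    });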