I would have to examine the hash function in some detail to tell whether or not that would work as a fix. The cost would be a minimal performance loss, plus the space overhead of storing the initial value for each hash.
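To make the idea concrete, here is a minimal sketch (in Python rather than Perl's actual C internals, and with invented names) of a chained hash table whose hash function starts from a per-table random seed instead of a fixed constant. The attacker can no longer precompute colliding keys, and the only space overhead is the one stored seed per table:

```python
import random

class SeededHash:
    """Toy chained hash table. The hash function mixes in a per-table
    random seed, so colliding keys cannot be precomputed offline.
    Illustrative only -- not Perl's hv.c."""

    def __init__(self, nbuckets=8):
        self.seed = random.getrandbits(32)   # the stored initial value
        self.buckets = [[] for _ in range(nbuckets)]

    def _hash(self, key: str) -> int:
        # Simple multiplicative hash, started from the random seed
        # rather than a fixed constant (a stand-in for the real function).
        h = self.seed
        for ch in key.encode():
            h = (h * 33 + ch) & 0xFFFFFFFF
        return h

    def insert(self, key, value):
        b = self.buckets[self._hash(key) % len(self.buckets)]
        for i, (k, _) in enumerate(b):
            if k == key:
                b[i] = (key, value)   # overwrite existing key
                return
        b.append((key, value))

    def get(self, key):
        b = self.buckets[self._hash(key) % len(self.buckets)]
        for k, v in b:
            if k == key:
                return v
        raise KeyError(key)
```

Note that the seed must be drawn once per table (or per process) and then kept stable, since rehashing a key with a different seed would land it in a different bucket.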
Depth checks in the list traversal logic are exactly the kind of solution that I thought was likely to slow things down. Perl spends a lot of its time doing hash lookups, and any overhead (e.g. having to use up an extra register to track how deep the list went) is likely to show up as a performance problem.
But thinking about the issue, I am not sure that a clever person couldn't manage to add such a check in some way that didn't cause significant performance problems. I simply have not worked close enough to the metal to have a sense of that. If you could add the depth check, though, there is no reason that you couldn't just fall back to a more complex and robust data structure instead of a linked list. The immediate algorithmic overhead is paid for by the fact that you only use it when you really need it.
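That fallback idea can be sketched as follows (again in Python, with an invented threshold): a bucket stays a plain list for the common short case, and only converts itself to a sorted list searched by binary search once the chain grows past a limit, bounding lookups at O(log n) even under adversarial collisions. A balanced tree would serve the same role; `bisect` just keeps the sketch stdlib-only:

```python
import bisect

MAX_CHAIN = 8  # illustrative threshold, not any real implementation's value

class Bucket:
    """A hash bucket that starts as an unordered list and falls back
    to a sorted list + binary search once it grows suspiciously long.
    The depth check is only paid on insert, not on every lookup."""

    def __init__(self):
        self.items = []       # list of (key, value) pairs
        self.sorted = False   # have we fallen back to the robust form?

    def insert(self, key, value):
        if self.sorted:
            i = bisect.bisect_left(self.items, (key,))
            if i < len(self.items) and self.items[i][0] == key:
                self.items[i] = (key, value)
            else:
                self.items.insert(i, (key, value))
            return
        for i, (k, _) in enumerate(self.items):
            if k == key:
                self.items[i] = (key, value)
                return
        self.items.append((key, value))
        if len(self.items) > MAX_CHAIN:        # the depth check
            self.items.sort(key=lambda kv: kv[0])
            self.sorted = True                 # one-time conversion cost

    def get(self, key):
        if self.sorted:
            i = bisect.bisect_left(self.items, (key,))
            if i < len(self.items) and self.items[i][0] == key:
                return self.items[i][1]
            raise KeyError(key)
        for k, v in self.items:
            if k == key:
                return v
        raise KeyError(key)
```

In the normal case the bucket never converts, so well-behaved workloads pay almost nothing; only a pathological (or malicious) key distribution triggers the conversion, which is exactly the "use it only when you really need it" trade-off.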