BTW, you *did* see that I said "none of this is new" right? So why the emphasis on "But that has always been the case"?
Because, until the simple example in your latest post, all the previous examples demonstrated things that have always been true. Thus, they do not demonstrate what has changed. Which, when combined with the phrasing of the OP ...
But never mind. I'm not trying to get on your case here; just trying to work out what has actually changed, and a) how it might affect my existing code; and, more importantly, b) how it might affect my thought processes with regard to how I think of and use hashes.
My conclusion so far -- for me personally; not the world in general you are addressing -- is that I have assumed the "new" constraints as a matter of course ever since the randomisation fix for the Algorithmic Complexity Attack that was (briefly?) implemented in 5.8.1.
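To make concrete the constraint I mean -- this is just my own illustration, not anything from your posts -- the discipline since 5.8.1 has been: never depend on the order in which keys()/each() return hash keys; sort explicitly whenever a stable order matters:

```perl
use strict;
use warnings;

my %h = map { $_ => 1 } qw( d b a c );

# keys() order is unspecified, and with per-process seed
# randomisation it can differ between otherwise identical runs.
# When a stable order matters, impose one explicitly:
my @stable = sort keys %h;

print "@stable\n";    # always: a b c d
```

Code written to that discipline shouldn't care whether the order changes per-build, per-process, or (as now) per-hash.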
However, what would be most useful to me -- and to others, I'm sure -- is a description of what has actually changed internally, and why it was changed. Are you up for providing that description?
With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
In reply to Re^8: Hash order randomization is coming, are you ready?