Re^3: Truly randomized keys() in perl 5.17 - a challenge for testing?
by BrowserUk (Pope) on Oct 01, 2013 at 16:25 UTC
In demerphq's own words:
So, not for performance (gain) reasons.
Also in his own words:
demerphq: Personally I don't think it's worth the effort of doing much more than thinking about this until someone demonstrates at least a laboratory grade attack. IMO in a real world environment with multi-host web sites, web servers being restarted, load balancers, and etc, that simple hash randomization is sufficient. Seems like any attack would require large numbers of fetch/response cycles and in general would just not be effective in a real production environment. I would assume that the administrators would notice the weird request pattern before an attacker could discover enough information to cause damage. Same argument for non-web services IMO.
And it doesn't make anything more secure.
And Dave_the_m said:
Which mirrors the OP's objections.
So, significant consequences.
So, why? What did Perl gain?
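For readers unfamiliar with the mechanism under discussion: Perl 5.17/5.18 seeds its hash function per process, so keys() order differs between runs of the same program. As a hedged illustration (not the Perl implementation itself), Python exposes an analogous per-process knob, `PYTHONHASHSEED`; the sketch below shows that the same set literal iterates in a seed-dependent but per-seed-deterministic order.

```python
# Sketch: demonstrating per-process hash randomization, using Python's
# PYTHONHASHSEED as an analogue of Perl 5.18's per-process hash seed.
# This is an illustration of the general technique, not Perl's own code.
import os
import subprocess
import sys

# A tiny program whose output depends on string-hash iteration order.
SNIPPET = "print(list({'a', 'b', 'c', 'd', 'e', 'f'}))"

def order_with_seed(seed):
    """Run SNIPPET in a child interpreter with a fixed hash seed and
    return the printed iteration order."""
    env = dict(os.environ, PYTHONHASHSEED=str(seed))
    out = subprocess.run([sys.executable, "-c", SNIPPET],
                         env=env, capture_output=True, text=True)
    return out.stdout.strip()

if __name__ == "__main__":
    # The same seed always yields the same order; different seeds
    # generally yield different orders, just as two perl 5.18 processes
    # generally disagree on keys() order.
    print(order_with_seed(1))
    print(order_with_seed(2))
```

The point relevant to the thread: any code that silently relied on a stable keys() order breaks under this scheme, which is exactly the "significant consequences" being weighed against the claimed security benefit.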