Re^9: Our perl/xs/c app is 30% slower with 64bit 5.24.0, than with 32bit 5.8.9. Why?

by BrowserUk (Patriarch)
on Dec 22, 2016 at 14:07 UTC


in reply to Re^8: Our perl/xs/c app is 30% slower with 64bit 5.24.0, than with 32bit 5.8.9. Why?
in thread Our perl/xs/c app is 30% slower with 64bit 5.24.0, than with 32bit 5.8.9. Why?

On the security list, someone posted (1) a short perl program which created a hash with 28 shortish random word keys (i.e. keys matching /[a-z]{2,12}/), and then printed those keys to stdout in unsorted order; and (2) a C program which, given that list of keys as input, was able in 785 CPU seconds to completely determine the random hash seed of that perl process.
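
(A minimal sketch of what such a key-generating script might look like; this is a hypothetical reconstruction for illustration, the actual program wasn't posted here:)

    # Build a hash with 28 short random lowercase keys and print them
    # in Perl's (seed-dependent) iteration order.
    my @chars = ('a' .. 'z');
    my %h;
    while ( keys(%h) < 28 ) {
        my $len = 2 + int rand 11;                                  # 2..12 chars
        my $key = join '', map { $chars[ rand @chars ] } 1 .. $len;
        $h{$key} = 1;
    }
    print "$_\n" for keys %h;   # the order itself leaks information about the seed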

Okay. Is there any chance of laying my hands on the sources for the C program?

I'd be a whole lot more impressed if the keys were a set of real (or at least realistic) headers, say something like this:
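
(Purely for illustration, a hypothetical set of generic request-header keys; this is not the example from the original discussion:)

    my %headers = (
        'Host'            => 'www.example.com',
        'User-Agent'      => 'Mozilla/5.0',
        'Accept'          => 'text/html',
        'Accept-Encoding' => 'gzip, deflate',
        'Accept-Language' => 'en-GB,en',
        'Connection'      => 'keep-alive',
        'Cookie'          => 'session=abc123',
    );
    print "$_\n" for keys %headers;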

  1. But even if that could still be done in a similar timeframe -- which I think is highly doubtful -- in order to exploit that knowledge, they would then need to cause the server to generate a set of headers that provoked the pathological behaviour.

    How can an external party cause a server to generate a set of headers that are carefully crafted to induce the pathological behaviour that is the apparent root of the perceived problem?

  2. And, how many web servers would still be running that same perl process, with that same random seed 15 minutes later?
  3. And how many sites are there that run a single server process with a single persistent Perl process?
  4. And how many of those emit sufficient, short, and unsorted headers for the determination to be made?
  5. And how many of those accept sufficient input from a remote user, routed to that same perl process, such that the bad guys, having determined the seed value, can construct a pathological set of keys of sufficient size to cause harm, and then persuade the process to accept those values and build a hash from them?

I'm just not seeing the threat landscape where such a combination of requirements will exist. And even if they did, they would be so few and far between, and on such small websites -- single servers with a single permanent perl process are basically confined to schools, charities and mom-and-pop stores -- that no hacker is ever going to waste their time trying to find them, much less exploit them.

In any case, my comment about "unnecessary" was little more than a footnote to my suggestion above that the OP could try reverting his 5.24 perl to the 5.8.9 hashing mechanism, to see if that is the source of his performance issue. If it isn't, it's one more thing he can ignore. If it turns out it is, he can decide whether his application is even remotely vulnerable to the "security concern" and choose to revert or not as he sees fit.
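
(If anyone wants to try that experiment, a rough sketch of the knobs involved; the environment variables are documented in perlrun for 5.18+, and the Configure define name is taken from hv_func.h of that era, so check your own source tree before relying on it:)

    # Pin the hash seed and disable key-order perturbation for a
    # like-for-like timing run (5.18 and later):
    PERL_HASH_SEED=0 PERL_PERTURB_KEYS=0 perl yourapp.pl

    # Or rebuild perl with the pre-5.18 one-at-a-time hash function:
    ./Configure -des -Accflags=-DPERL_HASH_FUNC_ONE_AT_A_TIME_OLD
    make && make test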


With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority". The enemy of (IT) success is complexity.
In the absence of evidence, opinion is indistinguishable from prejudice.

Replies are listed 'Best First'.
Re^10: Our perl/xs/c app is 30% slower with 64bit 5.24.0, than with 32bit 5.8.9. Why?
by dave_the_m (Monsignor) on Dec 22, 2016 at 16:34 UTC
    they would then need to cause the server to generate a set of headers that provoked the pathological behaviour.
    Perhaps I didn't make it clear. It didn't require a specific set of headers. The point of the random key generator script was to demonstrate that it works with any set of headers.
    And, how many web servers would still be running that same perl process, with that same random seed 15 minutes later
    That 15-minute figure came from unoptimised code; it was just a proof of concept. I'm sure it could be made much, much faster. It is also parallelisable. And what you do is open a TCP connection to a server process, send one request, keep the connection open, then calculate the seed, then send a second request which DoSes the server. Also, depending on how the perl processes are spawned/forked, they may all share the same hash seed.
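    (A hypothetical sketch of that keep-alive window, with no attack payload and an invented host name, just to show that both requests reach the same persistent process:)

        use IO::Socket::INET;
        # Open one connection and hold it while the seed is computed offline.
        my $sock = IO::Socket::INET->new(
            PeerAddr => 'target.example.com',
            PeerPort => 80,
            Proto    => 'tcp',
        ) or die "connect: $!";
        print $sock "GET /probe HTTP/1.1\r\n"
                  . "Host: target.example.com\r\n"
                  . "Connection: keep-alive\r\n\r\n";
        # ... read the response, recover the key order, crunch the seed ...
        # The connection is still open, so a second, crafted request goes
        # to the same perl process and therefore the same hash seed.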
    In any case, my comment about "unnecessary" was little more than a footnote
    But you've spent an awful lot of time since trying to convince anyone who will listen that it isn't a security issue, and you've been shown repeatedly that the assumptions you based this conclusion on were erroneous.

    Dave.

      But you've spent an awful lot of time since trying to convince anyone who will listen that it isn't a security issue,

      Hm. Seems to me you've expended a lot of time attacking my opinion.

      All I've done is spend a little downtime politely and respectfully responding to your inflammatory comments.

      you've been shown repeatedly that the assumptions you based this conclusion on were erroneous

      No. Far from it. You've described a toy process that can brute-force the seed when fed a relatively large number of unrealistically short keys.

      What you patently failed to demonstrate is how that knowledge can be used to do anything bad.

      Yes, you can use knowledge of the seed to construct a set of keys that could induce pathological behaviour if used to construct a hash, but you simply omitted to address the problem of how you are going to persuade the server to construct a hash from the set of keys you've generated.

      As I said, a purely theoretical problem that has never been, and will never be, demonstrated in the wild; addressed in a clumsy and over-engineered fashion.

      But that's just my opinion; it won't change a thing and seems hardly worthy of your esteemed time to argue with; but here we are, 11 levels deep.


      With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
      Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
      "Science is about questioning the status quo. Questioning authority". The enemy of (IT) success is complexity.
      In the absence of evidence, opinion is indistinguishable from prejudice.
        Hm. Seems to me you've expended a lot of time attacking my opinion.

        All I've done is spend a little downtime politely and respectfully responding to your inflammatory comments.

        Oh sigh. Your very first post on this topic made an unevidenced assertion that the hash-related security fixes done in 5.18.0 were both unnecessary and may have significantly slowed the perl interpreter. All I have done is patiently and politely try to demonstrate that your assertions were incorrect. And it seemed to me important to correct this assertion as, assuming I was right, people could be misled into thinking something insecure was secure.

        Somehow that is inflammatory.

        Yes, you can use knowledge of the seed to construct a set of keys that could induce pathological behaviour if used to construct a hash, but you simply omitted to address the problem of how are you going to persuade the server to construct a hash from the set keys you've generated.
        That's utterly trivial. You send an HTTP request to the server with a bunch of headers or parameters containing the generated keys. If the server creates a hash from the input, you've done it. Or you could supply some JSON to a server that processes JSON input. Etc., etc.
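        (For example, a hypothetical sketch; the modules and the parameter source are illustrative rather than taken from any particular server:)

            use CGI;                               # not in core since 5.22; install from CPAN
            use JSON::PP qw(decode_json);          # in core since 5.14

            my $q      = CGI->new;
            my %params = $q->Vars;                 # every parameter NAME is a client-chosen hash key

            # Likewise for a JSON endpoint (body shown as a placeholder):
            my $json_body = '{"example":1}';
            my $data      = decode_json($json_body);   # hash keys come straight off the wire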

        And remember that most of the recent discussion above has been about servers leaking the hash seed. That isn't the only (or main) thing fixed in 5.18.0. The biggie was that if you supplied a suitably crafted small set of keys in a request (no need to know the server's seed) you could force the perl process to allocate gigabytes' worth of memory. Also, I think there were issues with the existing way algorithmic-complexity attacks were protected against, but I don't remember the details now.

        And this isn't just about HTTP servers. Any perl process that gets hash keys from untrusted input used to be vulnerable to algorithmic-complexity attacks. Think of a spam filter that reads an email's headers into a hash, for one hypothetical example of many.
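        (A bare-bones sketch of that kind of consumer; the header parsing is deliberately simplified for illustration:)

            # Read an untrusted message's header block into a hash.
            my %header;
            while ( my $line = <STDIN> ) {
                last if $line =~ /^\r?\n$/;                  # blank line ends the headers
                if ( $line =~ /^([!-9;-~]+):[ \t]*(.*)/ ) {  # field-name ":" value
                    $header{ lc $1 } = $2;                   # key chosen by the sender
                }
            }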

        addressed in a clumsy and over-engineered fashion.
        Patches welcome....

        Dave.
