PerlMonks  

Re: hash collision DOS

by crazyinsomniac (Prior)
on Jun 01, 2003 at 13:07 UTC (#262194)


in reply to hash collision DOS

"Any ideas on workarounds and fixes to reduce the risk of being DOS'ed?"
Yeah, avoid doing $query->Vars() or Vars() except when testing.
Don't just get everything; get only what you need.
Aside from that, it's a non-issue.
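The "get only what you need" advice could be sketched in plain perl (no CGI.pm here; the query string, whitelist, and variable names are all made up for illustration). The idea is to keep only whitelisted fields, so attacker-chosen names never become hash keys:

```perl
use strict;
use warnings;

# Hypothetical raw query string and whitelist, for illustration only.
my $query_string = 'name=alice&age=30&junk1=x&junk2=y';
my %wanted = map { $_ => 1 } qw(name age);

my %param;
for my $pair (split /&/, $query_string) {
    my ($k, $v) = split /=/, $pair, 2;
    # Only whitelisted names are ever inserted into the hash;
    # everything else the client sent is dropped on the floor.
    $param{$k} = $v if $wanted{$k};
}

print "name=$param{name} age=$param{age}\n";
```

Contrast with Vars(), which loads every client-supplied pair into a hash whether the script uses it or not.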

 
______crazyinsomniac_____________________________
Of all the things I've lost, I miss my mind the most.
perl -e "$q=$_;map({chr unpack qq;H*;,$_}split(q;;,q*H*));print;$q/$q;"


Re: Re: hash collision DOS
by kschwab (Priest) on Jun 01, 2003 at 13:16 UTC
    It's not just dumping a hash structure that causes it. Solutions would include things like limiting the total number of hash elements, or perturbing the input data in a less predictable way.

    The white paper is a bit short on details, but I'm not sure I'd characterize it as a "non-issue".

    Update: See this for more detail and example exploits.

        Agreed. I do find it interesting that the authors of this white paper also chose to use 10,000 inputs to trigger the behavior. Hmm...
        The problem is that the attacker is generating the strings that go into the hash table. He chooses strings that collide and produce the worst-case performance. The worst-case performance isn't likely in normal use, but it is easy for a malicious attacker to construct strings that trigger it.
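A toy chained-hash model (not perl's internals; all names here are illustrative) shows why attacker-chosen colliding keys degrade insertion from roughly constant time to O(n^2):

```perl
use strict;
use warnings;

# Toy chained hash table. In the attack scenario every key the attacker
# sends hashes to the same bucket, so we simulate that directly.
my $BUCKETS = 8;
my @table = map { [] } 1 .. $BUCKETS;
my $comparisons = 0;

sub insert {
    my ($key) = @_;
    my $bucket = $table[0];      # colliding keys all land in one bucket
    for my $existing (@$bucket) {
        $comparisons++;          # walk the whole chain checking duplicates
        return if $existing eq $key;
    }
    push @$bucket, $key;
}

insert("key$_") for 1 .. 1000;
print "comparisons for 1000 colliding inserts: $comparisons\n";
# 0 + 1 + ... + 999 = 499500 comparisons: quadratic growth, where random
# keys spread across buckets would keep chains, and cost, near constant.
```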

        There are some limits on how many strings can be inserted by an attacker. CGI.pm limits POST sizes. With a 1 MB limit and 10 bytes per string, that is 100,000 strings all trying to go into one hash bucket. Instead of taking a fraction of a second to parse, it takes the web server thousands of seconds.
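The arithmetic above can be checked directly (the numbers are the assumed values from the paragraph, not measurements):

```perl
use strict;
use warnings;

# Assumed values: a 1 MB POST cap and 10-byte parameter strings.
my $post_limit = 1_000_000;
my $str_len    = 10;

my $n = $post_limit / $str_len;       # 100,000 strings fit in one POST
my $compares = $n * ($n - 1) / 2;     # worst-case chain walks if they
                                      # all collide into one bucket
printf "%d strings, %.1e comparisons\n", $n, $compares;
```

Billions of string comparisons where normal parsing would do on the order of n, which is where the "thousands of seconds" comes from.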

        It would be possible to add checks to prevent this attack. One easy way is to limit the number of parameters CGI.pm will accept; 1000 is probably a reasonable limit. The proper solution is to change the Perl hashing function so it isn't deterministic: if the attacker can't predict the hash values, they can't construct the worst-case strings.
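The parameter-cap mitigation could look something like this. To be clear, parse_query and MAX_PARAMS are hypothetical names sketched for this note, not CGI.pm API:

```perl
use strict;
use warnings;

use constant MAX_PARAMS => 1000;   # assumed cap, per the suggestion above

# Hypothetical query-string parser that refuses oversized inputs
# *before* inserting anything into a hash.
sub parse_query {
    my ($query_string) = @_;
    my @pairs = split /&/, $query_string;
    die "too many parameters\n" if @pairs > MAX_PARAMS;
    my %param;
    for my $pair (@pairs) {
        my ($k, $v) = split /=/, $pair, 2;
        push @{ $param{$k} }, $v;
    }
    return \%param;
}

my $ok = parse_query('a=1&b=2');
print scalar(keys %$ok), " parameters parsed\n";

# A 2000-parameter "attack" query gets rejected up front.
my $attack = join '&', map { "k$_=x" } 1 .. 2000;
eval { parse_query($attack) };
print "rejected: $@" if $@;
```

This doesn't fix the underlying hash function, but it bounds the damage: the worst case is now capped at the parameter limit rather than the POST size limit.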

Re: Re: hash collision DOS
by QwertyD (Pilgrim) on Jun 02, 2003 at 04:53 UTC

    But doesn't CGI store fields internally as a hash?

    From CGI 2.81:

    sub param {
        my($self,@p) = self_or_default(@_);
        return $self->all_parameters unless @p;
        my($name,$value,@other);
        #~~~~~Snip~~~~~
        ($name,$value,@other) = rearrange([NAME,[DEFAULT,VALUE,VALUES]],@p);
        #~~~~~Snip~~~~~
        return wantarray ? @{$self->{$name}} : $self->{$name}->[0];
    }

    It also looks like CGI::Simple does the same:

    From CGI::Simple 0.06

    sub param {
        my ( $self, $param, @p ) = @_;
        #~~~~~Snip~~~~~
        return wantarray ? @{$self->{$param}} : $self->{$param}->[0];
    }

    How do I love -d? Let me count the ways...
        It doesn't have anything to do with the Vars() method. The worst-case performance of the hash slows down insertion of elements. CGI.pm inserts parameter names as keys when parsing the query string.
