Re: hash collision DOS

by Jenda (Abbot)
on Jun 02, 2003 at 10:52 UTC


in reply to hash collision DOS

Well, maybe CGI.pm and/or CGI::Lite could let us restrict the CGI parameters that are accepted and stored, and throw away all the others. Why should the CGI object remember the parameters we are not interested in, anyway?

Jenda
Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live.
   -- Rick Osborne



Re^2: hash collision DOS (CGI.pm protection)
by Aristotle (Chancellor) on Jun 02, 2003 at 21:19 UTC
    1. See the fine manual: you can already ->delete() parameters, so just grep unrequested parameters out of ->param() and dump them in the bit bucket.
    2. All webservers have a relatively tight maximum size for GET requests. (I think the default is something like 4kb for Apache.) You can set $CGI::POST_MAX for POST requests.
    Use those well and it shouldn't be possible to dump enough data on a script to slow it down significantly. A quick sketch of both follows.
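
    A minimal sketch combining the two points, assuming a CGI.pm script whose only legitimate parameters are name, email and message (those names are made up for the example, and the size cap is arbitrary):

        #!/usr/bin/perl
        use strict;
        use warnings;
        use CGI;

        $CGI::POST_MAX = 10 * 1024;    # example POST size cap: 10 KB

        my $q = CGI->new;

        # ->param() with no arguments lists all parameter names,
        # ->delete() drops a parameter from the CGI object.
        my %wanted = map { $_ => 1 } qw( name email message );
        $q->delete($_) for grep { !$wanted{$_} } $q->param;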

    Makeshifts last the longest.

      Calling delete would happen after the problem has already occurred. Still, I concur: if the length of $ENV{QUERY_STRING} bothers you, simply cut it down (the same goes for POST_MAX).
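
      For instance, the raw query string can be trimmed before CGI.pm ever parses it (the limits below are arbitrary examples, not recommendations):

          #!/usr/bin/perl
          use strict;
          use warnings;
          use CGI;

          $CGI::POST_MAX = 100 * 1024;   # refuse POST bodies over 100 KB

          my $MAX_QUERY = 2048;          # arbitrary cap on the GET query string
          if ( defined $ENV{QUERY_STRING}
               && length $ENV{QUERY_STRING} > $MAX_QUERY ) {
              $ENV{QUERY_STRING} = substr $ENV{QUERY_STRING}, 0, $MAX_QUERY;
          }

          my $q = CGI->new;              # parsing happens here, on the trimmed string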

      I do feel a nice addition would be something like

          acceptOnly( thesekeys    => [ qw[ these keys ] ] );
          acceptOnly( thismanykeys => 44 );

      This would be trivial to add ... just a thought.
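
      A rough sketch of what such an acceptOnly() could look like, written here as a plain function taking the CGI object (the name and interface come from the suggestion above, not from CGI.pm itself, and as Jenda notes below it still runs only after the parameters have been parsed):

          # Hypothetical helper: prune an already-built CGI object, either to a
          # whitelist of keys or to a maximum number of keys.
          sub acceptOnly {
              my ( $q, %opt ) = @_;

              if ( my $keys = $opt{thesekeys} ) {
                  my %wanted = map { $_ => 1 } @$keys;
                  $q->delete($_) for grep { !$wanted{$_} } $q->param;
              }

              if ( defined( my $max = $opt{thismanykeys} ) ) {
                  my @names = $q->param;
                  $q->delete($_) for @names[ $max .. $#names ];   # drop the surplus
              }
              return $q;
          }

          # Usage, mirroring the call shapes suggested above:
          # acceptOnly( $q, thesekeys    => [ qw[ these keys ] ] );
          # acceptOnly( $q, thismanykeys => 44 );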


      MJD says you can't just make shit up and expect the computer to know what you mean, retardo!
      I run a Win32 PPM repository for perl 5.6x+5.8x. I take requests.
      ** The Third rule of perl club is a statement of fact: pod is sexy.

      PodMaster is right. ->delete() comes too late, and even $CGI::POST_MAX doesn't help much.

      Imagine you have a file upload script. There you need to keep $CGI::POST_MAX rather high, so someone may still be able to post quite a few CGI parameters. Then even building the hash that CGI.pm uses to store the data may take a lot of time, and the grep and delete would only make the issue worse.

      Jenda
      Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live.
         -- Rick Osborne

