Crash-Test Dummies: A Few Thoughts on Website Testing

by sundialsvc4 (Abbot)
on Oct 08, 2015 at 19:34 UTC

In the last few days, an article was published, referenced in another thread, which made Perl look bad. (The problem had to do with processing CGI parameters, and list vs. scalar context.) But, actually, this is a very common problem with many ... dare I say, most? ... web sites. The problem is that, when subjected to a barrage of nonsensical (or contrived) inputs, many sites fail disastrously and repeatedly.

Many web sites (and a growing number of mobile apps, with regard to their host interfaces) are simply untested against any path that deviates, however slightly, from what the site’s designers “expected.” (Maybe) they ran through all the buttons and screens. (Or, maybe not.) But what they never did was seriously beat it up. Unfortunately, that’s exactly what rogue attackers will do. (Yes, one of the first things that they will do is ... test your code.)

HTTP, let us not forget, is a completely stateless protocol. The server receives a request consisting of a URI (including any GET parameters), a set of cookies, and possibly POST parameters. It produces an HTML (or AJAX) result, and promptly forgets. Everything. Well, the thing that many developers completely overlook is that these inputs can be anything. Although you may expect that they consist only of what the corresponding <form> would generate, you must never assume this. The inputs to your web page ... can ... be ... anything. Therefore, attackers (or, testers) will subject your site to URIs in unpredictable sequences with unpredictable or deliberately altered parameters. They will look to exploit things that you assumed. And usually they will find them.
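To put that another way: since the inputs can be anything, the only safe posture is to enumerate what the page legitimately accepts and reject everything else. A minimal sketch with CGI.pm (the parameter names here are hypothetical):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    my $q = CGI->new;

    # Whitelist of every parameter this page legitimately accepts.
    my %allowed = map { $_ => 1 } qw(user page sort);

    # $q->param with no arguments returns the names of all parameters
    # actually present in the request, expected or not.
    for my $name ($q->param) {
        die "unexpected parameter: $name\n" unless $allowed{$name};
    }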

Consider what happens if the rogue simply duplicates a GET string. In many cases, the variables become multi-valued ... arrays ... since there is now more than one value for the variable. Or, what if the rogue adds another variable to the string, such as &is_admin=1? (You’d be shocked how often that turns you into a god ...) Misspelling a variable name can cause an error dump that might include literal SQL text and server IP addresses. And so on.
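That first trick, duplicated parameters landing in list context, is exactly the CGI.pm pitfall the article was about. A self-contained demonstration (the parameter names are contrived):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    # Simulate a rogue request that sends "name" three times:
    #   /profile.cgi?name=foo&name=is_admin&name=1
    my $q = CGI->new('name=foo;name=is_admin;name=1');

    # DANGEROUS: param() in list context returns every value, so the
    # attacker smuggles an extra (is_admin => 1) pair into the hash,
    # and the later duplicate key silently wins:
    my %user = ( is_admin => 0, name => $q->param('name') );
    print "unsafe: is_admin = $user{is_admin}\n";    # prints 1

    # SAFE: force scalar context so only one value can come through.
    my %safe = ( is_admin => 0, name => scalar $q->param('name') );
    print "safe:   is_admin = $safe{is_admin}\n";    # prints 0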

One of the very first things we do at Sundial, with any new web site that we are asked to evaluate, is to put it into a test-bed and “hit it with both barrels.” I can count on the fingers of one hand how many sites survived the onslaught entirely unscathed. And yet, the automated test-tools are readily available (or, can be built in Perl). The builders of the site could have made the thing bulletproof ... they simply didn’t bother.
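A bare-bones version of such a tool, using the core HTTP::Tiny module, might look like the sketch below. The target URL, parameter names, and payloads are all placeholders, and a real rig would be far more thorough, but even this much will shake error dumps out of a fragile site:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use HTTP::Tiny;

    # Placeholders: point these at your own test-bed.
    my $base   = 'http://localhost:8080/app.cgi';
    my @params = qw(user id page);
    my @nasty  = ( '', '-1', "'", '%00', 'A' x 5000,
                   '../../../etc/passwd', '<script>alert(1)</script>' );

    my $ua = HTTP::Tiny->new(timeout => 10);
    for my $p (@params) {
        for my $v (@nasty) {
            my $e = esc($v);
            # Send each parameter twice on purpose, to probe
            # multi-value handling.
            my $url = "$base?$p=$e&$p=$e";
            my $res = $ua->get($url);
            print "$res->{status} $url\n"
                if $res->{status} >= 500
                or $res->{content} =~ /DBI|SQL|at \S+ line \d+/;
        }
    }

    # Minimal percent-escaper; URI::Escape is the usual choice.
    sub esc {
        my $s = shift;
        $s =~ s/([^A-Za-z0-9_.~-])/sprintf '%%%02X', ord $1/ge;
        return $s;
    }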

Perl has many distinctive features, like “taint mode,” that, if properly used, can greatly strengthen your software. But these are only as good as your ruthless and unforgiving test plans. Don’t send your software out into the world unprepared.
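Taint mode deserves a concrete illustration. Run under -T, anything that arrives from outside the program is poisoned until you untaint it with an explicit regex capture, which forces you to state exactly what you will accept. (The id parameter and its format below are assumptions.)

    #!/usr/bin/perl -T
    use strict;
    use warnings;

    # Tainted: QUERY_STRING comes from the outside world.
    my $raw = $ENV{QUERY_STRING} // '';

    # A regex capture is the only sanctioned way to untaint.
    # Accept nothing but 1 to 10 digits for "id".
    my ($id) = $raw =~ /\bid=(\d{1,10})\b/
        or die "bad or missing 'id' parameter\n";

    # $id is now untainted and provably numeric.
    print "Content-type: text/plain\n\nrecord $id\n";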


Replies are listed 'Best First'.
Re: Crash-Test Dummies: A Few Thoughts on Website Testing
by Anonymous Monk on Oct 08, 2015 at 22:12 UTC

    sundialsvc4: In the last few days, an article was published, referenced in another thread, which made Perl look bad. ... stuff I retyped ......

    2014 wasn't in the last few days.

    For anyone else interested in the topic, ignore flushells above and go directly to the Open Web Application Security Project (OWASP).

Re: Crash-Test Dummies: A Few Thoughts on Website Testing
by stevieb (Canon) on Oct 12, 2015 at 19:20 UTC

    It appears my post here is related; disregard if not.

    Years ago, I wrote an accounting application for an ISP. It used CGI::Application. I forced the system to use only one entry point and to error out on any URL passed in that wasn't from this entry point.

    I then implemented CGI::Application::Plugin::LinkIntegrity to protect against rogue URL input, after entry has been gained.

    Although specific to CGI::Application, it wouldn't be hard to drum up something similar for any other module, or even in custom code. This, along with -T and other routine validation, should make things a bit more resistant to tampering.
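    As a rough illustration of that "custom code" route (this is not the plugin's actual API, just the underlying idea, sketched with core Digest::SHA): sign every outgoing link with an HMAC over its path and parameters, and reject any incoming URL whose signature does not verify. The secret and parameter names below are placeholders, and the values are assumed to be already URL-safe.

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Digest::SHA qw(hmac_sha256_hex);

        # Placeholder: keep the real secret out of the repository.
        my $SECRET = 'change-me-server-side-secret';

        sub sign_link {
            my ($path, %params) = @_;
            my $query = join ';', map { "$_=$params{$_}" } sort keys %params;
            my $sig   = hmac_sha256_hex("$path?$query", $SECRET);
            return "$path?$query;sig=$sig";
        }

        sub verify_link {
            my ($path, %params) = @_;
            my $sig = delete $params{sig} or return 0;
            my $query = join ';', map { "$_=$params{$_}" } sort keys %params;
            return hmac_sha256_hex("$path?$query", $SECRET) eq $sig;
        }

        # Any change to a value, or a smuggled extra parameter,
        # breaks the signature:
        my $url   = sign_link('/account.cgi', id => 42, rm => 'balance');
        my ($sig) = $url =~ /sig=([0-9a-f]+)/;
        print verify_link('/account.cgi', id => 42, rm => 'balance',
                          sig => $sig) ? "ok\n" : "tampered\n";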

      "Tamper resistant" doesn't ring the same as "tamper proof" from the LinkIntegrity doc but I don't know the software. Relying on referring URIs, if that's what you meant by "one entry point," is completely insecure.

