Re: Validating web-site signups are humans.

by Vautrin (Hermit)
on Mar 19, 2004 at 20:29 UTC ( #338149 )


in reply to Validating web-site signups are humans.

One of the things to look for when creating a spider or robot is patterns. For instance, if a form uses the same name and the same field names and values every time it is served, it is very easy to write a spider that submits it. If the form names vary, the spider has to hunt for sections of the HTML that are merely similar. If there is variation in the HTML which does not alter the look of the page, things get very hard very quickly. Another thing that makes spidering very hard is heavy use of Javascript, because there are no modules that will build the resulting page based on what the Javascript says to do.
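To see why static field names are such a gift to spider authors, here is a minimal sketch of the easy case. The field names and the target URL are made up for illustration; with predictable names, the whole submission can be built blind, without ever parsing the page:

```perl
use strict;
use warnings;
use HTTP::Tiny;

# Hypothetical signup form with fixed, predictable field names --
# exactly the pattern that makes automated submission trivial.
# An arrayref preserves field order in the encoded body.
my $fields = [
    username => 'spambot',
    email    => 'spam@example.com',
    submit   => 'signup',
];

my $http = HTTP::Tiny->new;
my $body = $http->www_form_urlencode($fields);

# A real spider would now simply POST this to the form's action URL:
# $http->post( 'http://example.com/signup.cgi',
#     { content => $body,
#       headers => { 'content-type' => 'application/x-www-form-urlencoded' } } );
print "$body\n";
```

Note that nothing here depends on the page's actual HTML; that is the weakness the techniques below are meant to remove.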

These are some of the things that made my life very hard when a client asked for spiders to help build a site that was a clone of AddAll.com. I would suggest using them to your advantage. Be devious. Use Javascript like document.location = "http://www.newwebpage.com" to change locations in ways a spider will have trouble keeping up with. Alter form names and the names of the inputs so that they contain random characters. (You can keep track of their real values in a database: use a random key, sent to the user in a cookie, to look up what the real field names are.)
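The field-name scrambling could be sketched like this. Everything here is illustrative: the subroutine names are made up, and the in-memory hash stands in for the database; a real site would persist the mapping and deliver $token to the browser in a cookie:

```perl
use strict;
use warnings;

# token => { random_name => real_name }; stands in for the database.
my %mapping_db;

# A random lowercase-alphanumeric string for tokens and field names.
sub random_name {
    my @chars = ( 'a' .. 'z', '0' .. '9' );
    return join '', map { $chars[ int rand @chars ] } 1 .. 12;
}

# When serving the form: replace each real field name with a random
# one, and remember the mapping under a per-visitor token.
sub obfuscate_fields {
    my @real_names = @_;
    my $token = random_name();
    my %map;
    $map{ random_name() } = $_ for @real_names;
    $mapping_db{$token} = \%map;
    return ( $token, sort keys %map );
}

# When the form comes back: translate a random name to the real one.
sub resolve_field {
    my ( $token, $random ) = @_;
    return $mapping_db{$token}{$random};
}

my ( $token, @scrambled ) = obfuscate_fields( 'username', 'email' );
for my $name (@scrambled) {
    printf "form field %s really means %s\n", $name, resolve_field( $token, $name );
}
```

Since the names differ on every request, a spider that memorizes one copy of the form has nothing stable to replay.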

Chances are, if you do all these things, people will leave your site alone. Granted, many programmers / hackers -- given enough time and energy -- can overcome these obstacles. But if you frustrate your attackers, it is likely they will look for an easier target to pick on.

Hope that helps,

Vautrin



Replies are listed 'Best First'.
Re: Re: Validating web-site signups are humans.
by Anonymous Monk on Mar 19, 2004 at 23:11 UTC
    Time and energy are completely unnecessary. Automating Mozilla is trivial.
      Interesting... I've never heard of automating Mozilla. Can you please give an example or reference? Also, does it work with Firefox?
      Perhaps this is why some of the web sites that I have used work fine with Mozilla until I try to buy something. On one site I got a secure connection and was able to do everything with Mozilla except validate a credit card. Since I really wanted to buy the product, I called the toll-free number, which has always seemed to me to be a reasonable, albeit imperfect, alternative to otherwise broken ecommerce apps.

      I am interested in easy Mozilla automation, also. The API used for Mozilla regression testing looks complicated to me.

      It should work perfectly the first time! - toma
