I'm curious whether anyone has any experience protecting
a web-based interface from being "front-ended" by others
for their own gain.
Say I've spent a good amount of time and money developing
a site that sells "ProductX". Someone then reverse engineers my HTML form/submit process and creates their
own front end, secretly adding an upcharge for the customer.
They also change all references to ProductX to ProductY,
so I may not be able to manually search for and identify
who's screen scraping.
What I've thought of:
- Dynamically changing form element names, perhaps tied
to a Digest::MD5 hash of the session key. This might help, but
they could still guess some of the names based on the
values the form elements carry.
- Having the user type in text that matches what's
displayed in a .gif image. (See this pretty cool node from jcwren on defeating this sort of thing.)
This bothers me, because it makes the customer jump through hoops just to buy something.
- Analyzing web logs to find people taking an
odd path through the site (e.g. skipping intro pages).
It turns out this isn't useful, since there's so much client-side caching.
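For the first idea above, here's a minimal sketch of what I mean by session-keyed field names (the server-side secret, session key, and logical field names are all made up for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

# Derive a per-session form field name from the session key, so a
# scraper can't hard-code names like "qty" or "product" in their
# front end. The salt never leaves the server.
my $secret = 'server-side-secret';    # hypothetical salt

sub field_name {
    my ($session_key, $logical_name) = @_;
    # "qty" becomes something like "f_3a0c9..." unique to this session
    return 'f_' . md5_hex("$secret:$session_key:$logical_name");
}

# When rendering the form:
my $session   = 'abc123';
my $qty_field = field_name($session, 'qty');
print qq{<input type="text" name="$qty_field">\n};

# When processing the submit, recompute the same name and look up
# the parameter under it, e.g. with CGI.pm:
#   my $qty = $cgi->param( field_name($session, 'qty') );
```

The names are stable within a session (so the submit handler can recompute them) but change between sessions, and guessing them requires knowing the server-side salt. As noted above, though, the field *values* can still give the mapping away.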
Any other ideas? Any modules that might help me?