Okay, I searched Google for 'googlebots' and realized it's the crawler service Google uses to collect data from websites and their contents.
Yesterday, though, I saw this entry in my log:
crawl-66-249-66-138.googlebot.com 27 Jan, 20:35 *Backwards Directory Traversal* attempted => ../somefilename.ext 3
I was surprised to see that the robot tried to pass an argument requesting a file called 'somefilename.ext'.
I consider it normal for a Google bot to simply visit websites and log their existence, but isn't it really weird that it actually attempted a backwards directory traversal on my site, passing false values from the URL to the 'select' variable?
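For reference, this is roughly the kind of check I'd expect a script to need. This is only a minimal sketch, assuming the page reads a filename from a query variable like 'select'; the `ALLOWED_DIR` path and the `safe_path` function name are my own illustration, not from my actual site:

```python
import os

# Hypothetical directory the script is allowed to serve files from.
ALLOWED_DIR = "/var/www/files"

def safe_path(user_value):
    """Resolve a user-supplied filename and reject anything that
    escapes ALLOWED_DIR via '../' traversal."""
    candidate = os.path.realpath(os.path.join(ALLOWED_DIR, user_value))
    if not candidate.startswith(ALLOWED_DIR + os.sep):
        raise ValueError("directory traversal attempt: %r" % user_value)
    return candidate
```

With this, a request like '?select=../somefilename.ext' would be rejected instead of reaching a file outside the intended directory.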
Is there any chance that a visitor could actually use Google's bots to try this stuff?
Please tell me what you think about this case.