I have the same problem on my own sites. I've added a blackhole that seems to, at least partially, mitigate the problem.
- Added an entry in my robots.txt:
User-agent: *
Disallow: /secret/bla.php
- Added a hidden link on the main page to an in-between page that is never visibly linked anywhere. This mainly prevents Chrome's annoying link pre-loading from triggering anything.
- The in-between page links to /secret/bla.php
- All IPs of clients navigating to /secret/bla.php get firewalled.
It's not perfect, but I was able to reduce bot traffic (regardless of what User-Agent was set) by roughly 50%-80%. It's only a temporary win in the war against China and the Silicon Valley crowd, of course. But every little bit helps.