The only use I've found for adding the trap page to robots.txt is to prevent real search engines from falling into it.
I don't actually link to that page anywhere on my site, to avoid trapping real visitors.
A trap is only good while it goes undetected.
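Putting that together, a minimal robots.txt along those lines might look like this (with /badrobot/ as a hypothetical trap path, not one named in the thread): the trap appears only here, never in a visible link, so compliant crawlers are told to stay out while badly behaved ones walk right in.

```
User-agent: *
Disallow: /badrobot/
```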
Dynamic bot trap...
IMO if you truly want to see a security improvement in all this, your best bet is to push the browser vendors to improve their software: to give users at least an option to completely switch off retrieval of third-party resources (as they already do with third-party cookies).
The RequestPolicy extension for Firefox does exactly this.
For the Flash or iframe trick to be useful, it has to be present, at minimum, on the site that wants you to ban the user (a competitor), so it takes someone really interested in your traffic...
The link to the badrobot page IS in robots.txt,
so to follow that link a visitor 1) has to be a robot, and 2) has to have either not read robots.txt or ignored it.
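That logic can be sketched in a few lines. This is only an illustration, assuming an in-memory ban list and a hypothetical /badrobot/ trap path; a real setup would persist bans at the server or firewall level.

```python
# Minimal bot-trap sketch. TRAP_PATH and banned_ips are hypothetical
# names for illustration; the thread does not specify an implementation.
TRAP_PATH = "/badrobot/"   # listed in robots.txt, never linked visibly
banned_ips = set()

def handle_request(ip, path):
    """Return an HTTP status code for this request, banning trap visitors."""
    if ip in banned_ips:
        return 403                 # already caught ignoring robots.txt
    if path.startswith(TRAP_PATH):
        banned_ips.add(ip)         # only a robot that skipped robots.txt gets here
        return 403
    return 200

# A polite crawler never requests the trap path, so it keeps getting 200;
# a bad bot that follows the disallowed link is banned from then on.
print(handle_request("10.0.0.1", "/index.html"))   # 200
print(handle_request("10.0.0.2", "/badrobot/"))    # 403
print(handle_request("10.0.0.2", "/index.html"))   # 403
```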
Oh so that's what it means to have a dynamic robots.txt
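One way to read "dynamic robots.txt": generate the file per request, embedding a unique trap path each time, so a later hit on that path tells you exactly which robots.txt download it came from. A hedged sketch of that idea (all names hypothetical, not from the thread):

```python
# Sketch of a dynamically generated robots.txt with per-fetch trap paths.
import secrets

issued_traps = {}  # trap path -> IP that fetched this copy of robots.txt

def serve_robots_txt(ip):
    """Build a robots.txt body with a unique disallowed trap path."""
    trap = f"/trap-{secrets.token_hex(8)}/"
    issued_traps[trap] = ip
    return f"User-agent: *\nDisallow: {trap}\n"

def trap_origin(path):
    """Which IP was handed this trap path in its robots.txt?"""
    return issued_traps.get(path)
```

Any request for one of the issued trap paths then came from a client that read robots.txt and deliberately followed the disallowed URL.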
...experiments with writing a weblog in a text file usually read only by robots...