However, this is untidy, so I propose a very simple exclusion protocol: just add
?...&robots=nofollow
to the URL.
When a robot sees this parameter in a URL, it should not follow that link.
The standard should allow other fields, in any order, so that the following would also be legal:
?...&robots=newparam,nofollow,anotherparam
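
For illustration, here is a minimal sketch (in Python) of how a crawler might honour the proposed parameter. The URL, function name, and field names are hypothetical; this is a proposed convention, not an existing standard:

from urllib.parse import urlparse, parse_qs

def may_follow(url):
    # Return False if the URL carries the proposed robots=... parameter
    # with a 'nofollow' field anywhere in its comma-separated list.
    params = parse_qs(urlparse(url).query)
    for value in params.get("robots", []):
        fields = [f.strip().lower() for f in value.split(",")]
        if "nofollow" in fields:
            return False
    return True

# A crawler would skip the first two links and follow the third.
print(may_follow("http://example.com/cart?item=42&robots=nofollow"))                       # False
print(may_follow("http://example.com/cart?item=42&robots=newparam,nofollow,anotherparam")) # False
print(may_follow("http://example.com/cart?item=42"))                                       # True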
This would make it easy for webmasters to avoid setting spider traps, and it would allow creators of shopping cart software to ensure that their products don't set them either. Since it is probably the existence of such problems that has caused some hosts to ban Googlebot (amongst others), it would help to solve that problem over time.
A standard such as this should have been agreed years ago. However, if we wait for a standards organisation to ratify it, that will take years more. On the other hand, if Google were to unilaterally adopt such a standard, other robots would adopt it too.
Kaled.