I would like to amend my original post [webmasterworld.com] to pose the following:
Rather than disavowing sites through each search engine's separate webmaster tool set, why not put the information in a single location, such as the robots.txt file? I know this would be like moving mountains, but read on.
Imagine a robots.txt rule where a bot sees the following in the file:
User-agent: (Robot name)
Disallow: www.example.com
Disallow: www.example.com/page1.htm
When the bot sees this, it records the domain and URL as having been disavowed by the domain that hosts the robots.txt file. I will leave it up to the search engines to decide how they want to handle that recorded data.
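To make the idea concrete, here is a rough sketch in Python of how a bot might pull these disavow entries out of a robots.txt file. This is purely illustrative: the function name, the "value that doesn't start with /" heuristic, and www.mysite.com are my own assumptions, not part of any standard.

    def parse_disavows(robots_txt, hosting_domain):
        # Collect Disallow values that name an external domain or URL
        # rather than a local path (standard Disallow paths begin with "/").
        disavowed = []
        for raw in robots_txt.splitlines():
            line = raw.split("#", 1)[0].strip()  # ignore comments
            if not line.lower().startswith("disallow:"):
                continue
            value = line.split(":", 1)[1].strip()
            if value and not value.startswith("/"):
                disavowed.append(value)
        return {hosting_domain: disavowed}

    sample = (
        "User-agent: *\n"
        "Disallow: /private/\n"
        "Disallow: www.example.com\n"
        "Disallow: www.example.com/page1.htm\n"
    )
    print(parse_disavows(sample, "www.mysite.com"))
    # {'www.mysite.com': ['www.example.com', 'www.example.com/page1.htm']}

The one real parsing decision is telling a local path apart from an external target; here a value beginning with "/" stays a normal Disallow path, and anything else is read as a disavowed domain or URL.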
The robots.txt file is advisory. Shouldn't we have the ability to further advise the bots when they "read" it? We already advise them which pages and directories NOT to crawl on our own site; why not also let us advise them about the links from other domains to ours?
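For comparison, today's advisory usage looks like this, with every Disallow value being a path on the hosting site itself (the paths are just examples):

User-agent: *
Disallow: /private/
Disallow: /cgi-bin/

The proposal would simply extend that same advisory channel to name other domains.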
I realize that getting the robots standard changed is the mountain-moving part, but I wanted to bring this idea to this forum for discussion.
It would be well worth the effort for the search engines' bots, and it would give webmasters a single place to go to make this happen.