indyank - 3:37 am on Sep 3, 2011 (gmt 0)
robots.txt was designed to block some or all well-behaved bots from crawling specified pages.
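For example, a minimal robots.txt that tells every compliant crawler to stay out of a directory might look like this (the path here is just an illustration):

```
User-agent: *
Disallow: /private/
```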
Why should a recently introduced +1 button take precedence and jeopardise a convention that has been well understood by everyone? I do agree that one shouldn't use the button on a page they don't want shared publicly. But any well-behaved bot should obey the most restrictive instruction when instructions conflict.
When you have two robots meta tags on a page by mistake, one telling bots to "index" and the other telling them to "noindex", doesn't Google say they will apply the most restrictive tag? Why should it be different in this case?
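To illustrate the conflict I mean, here is a sketch of two contradictory robots meta tags on the same page; Google's documented behaviour is to honour the more restrictive value:

```html
<!-- Conflicting directives on one page: the restrictive "noindex" wins -->
<meta name="robots" content="index">
<meta name="robots" content="noindex">
```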
I sincerely feel it would be better if they applied the same logic here.