Forum Moderators: Robert Charlton & goodroi
Most of the sites have elements and pages kept deliberately out of Google, as I believe what you don't let Google see is almost as important as what you do.
My next step is to figure out how to SURGICALLY deny Google access to specific webpages. I do not want to use a noindex header on those pages, because I do want Bing and Yahoo to index them while denying Google access to those same pages.
Do you have any suggestion on how to tackle this?
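One approach worth considering (verify against each engine's current documentation) is a crawler-specific robots meta tag: Google honors a meta tag named `googlebot`, so a page-level directive can target Google alone while other engines index the page normally. A minimal sketch:

```html
<!-- Hypothetical page <head> snippet: asks only Google's crawler not to
     index this page. Bing and Yahoo do not read the "googlebot" name,
     so they continue to index the page as usual. -->
<meta name="googlebot" content="noindex">
```

A Googlebot-specific `Disallow` in robots.txt is sometimes suggested instead, but note it only blocks crawling, not indexing: a URL blocked in robots.txt can still appear in Google's results if external links point to it.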
[edited by: indyank at 6:39 am (utc) on Jun 5, 2011]
Two sites, i.e. DI and cultofmac, were initially hit by Panda but were dropped from it later on. That suggests they had signals that caused Panda to trap them, and that they probably removed those signals afterwards. It could have been done with some help, which they understandably may not share, without requiring Google to make exceptions for them.

I personally believe they were manually white-listed for Panda. Google denies they were hit by Panda, but it would not look good for Google to admit it. Nevertheless, one thing is clear: there is no way they could have fixed the problem and come back within a few days. For one, it takes Google quite a while just to re-index tens of thousands of pages.
netmeg, do you add a "noindex" meta tag to the example pages you mentioned above, a "noindex, nofollow" meta tag, or do you block them via robots.txt?
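For context on the trade-offs behind that question: `noindex` keeps a page out of results but still lets the crawler fetch it and follow its links; `noindex, nofollow` also stops link discovery from that page; a robots.txt block prevents crawling entirely, which means any noindex tag on the page will never be seen. A sketch of the robots.txt variant, with a hypothetical path:

```
# Hypothetical robots.txt entry: blocks all crawlers from fetching
# anything under /example-pages/.
# Caution: because the pages are never fetched, a noindex meta tag
# placed on them has no effect, and the URLs can still appear in
# results if external links point to them.
User-agent: *
Disallow: /example-pages/
```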