Sgt_Kickaxe - 7:43 pm on Sep 2, 2011 (gmt 0)
they are really showing their true colours el-rapido these days.
I agree, 110%. I'm sure Google has a plan in place for when webmasters revolt and say hey - stop making billions off my content yo!
Ignoring robots.txt directives sounds like grounds for a lawsuit; why should you foot the bandwidth bill for crawling you explicitly opted out of?
Worse, some analytics packages and web hosts hide Googlebot activity, as if they know it shouldn't be there...
Update, from John Mu:
Just to follow up on the Instant Preview questions.. We fetch the content for Instant Previews (provided it's not cached yet) on demand when the user requests it. When we do that, we need to be able to fetch the page the way that the user would see it, and for that we may fetch content that's otherwise disallowed by the robots.txt file.
Yup, lawsuit incoming. The explanation doesn't hold water: a page disallowed by robots.txt shouldn't be indexed in the first place, so there should be no result for a user to request an Instant Preview of. Chicken-and-egg problem, John.
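For reference, the contract the thread is arguing about is simple enough to sketch in a few lines of Python with the standard library's robots.txt parser: a well-behaved crawler checks the Disallow rules before fetching anything. The robots.txt body and URLs below are made-up examples, not Google's actual rules.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking Googlebot from /private/
robots_txt = """\
User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant bot calls can_fetch() first and skips disallowed URLs.
blocked = parser.can_fetch("Googlebot", "https://example.com/private/page.html")
allowed = parser.can_fetch("Googlebot", "https://example.com/public/page.html")

print(blocked)  # False: the crawler should not fetch this page
print(allowed)  # True: fetching here is permitted
```

The point of contention above is exactly this check being skipped: the Instant Preview fetch happens even when `can_fetch()` would return False.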