dickbaker - 3:50 am on Apr 7, 2011 (gmt 0)
maximillianos, I'm noindexing pages that are clearly thin. Some pages I've found are almost anorexic. I'd forgotten they existed.
I'm only noindexing for Googlebots. I thought about using robots.txt, or even removing pages altogether, but I can think of a reason why noindexing might be preferable.
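For anyone following along, a Googlebot-only noindex is done with the standard robots meta tag, which accepts a crawler name in place of "robots". A minimal sketch (the URL shown is just a placeholder page):

```html
<!-- Placed in the <head> of the thin page. -->
<!-- "googlebot" targets only Google's crawler; Bing and Yahoo
     still see the page as indexable. -->
<meta name="googlebot" content="noindex">

<!-- Compare the blanket version, which asks ALL crawlers to drop the page: -->
<!-- <meta name="robots" content="noindex"> -->
```

Unlike a robots.txt Disallow, this still lets Googlebot crawl the page, so it can actually see the noindex directive and remove the page from the index.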
Google is a business, and it has competitors. If there are poor-quality pages on a site, Google doesn't want to serve them up to its users. However, it's to Google's advantage to have Bing and Yahoo serve up what Google considers poor-quality pages. If I'm correct about that mindset (and I'd certainly think that way about competitors), then noindex would be the way to go.
In the end, all of the theories and guesses and hunches being posted here are just that. We have almost no evidence that anything works, and chances are it will be quite some time before we do. Even then, we won't be certain what caused the change.