TheOptimizationIdiot - 1:40 am on Apr 16, 2013 (gmt 0)
Personally I've had rankings drop over a week's time after blocking 100 or so URLs, then recover after removing the disallow... twice
That's interesting. Thanks for sharing!
I've noindexed thousands of pages at a time* and had better traffic than when I wasn't using it; in fact, I use noindex frequently as a "tool" to help get the right pages ranking in the right places.
I would not have thought a robots.txt block would have the opposite effect.
* Including currently. Unlike the robots.txt situation, traffic is up compared with when it was left to Google to decide which pages should be included. I didn't have the final say for a while and was told to include them, but when I got control back, one of the first things I did was return to "strategic noindexing," and the results have been very positive since the right pages started landing in the right places in the SERPs. I'm really surprised at the difference seen with robots.txt, and it's good to know there is one.
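For anyone following along, the two approaches being compared look roughly like this (the /example-section/ path is just a placeholder, not from either of our sites):

# robots.txt -- Disallow stops the crawler from fetching the URLs,
# but URLs Google already knows about can still remain in the index
User-agent: *
Disallow: /example-section/

<!-- meta robots noindex -- the page must stay crawlable so the tag can be read,
     and the page is then dropped from the index -->
<meta name="robots" content="noindex">

That difference is probably why the two methods can behave so differently: a robots.txt block hides the page's content from Google entirely, while noindex lets Google keep crawling and simply tells it which pages to leave out of the SERPs.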