menntarra_34 - 10:45 pm on Nov 28, 2012 (gmt 0)
Well then, what is better when you deal with pagination on a page, where all the other pages are nearly the same (so they could be judged as duplicates): blocking with robots.txt, using a noindex tag, or both?
And back to tedster's opinion: in this situation I can't see how noindexing thousands of these duplicate paginated pages would have any effect on the ranking of the site's other pages, while they would still be useful for users who are looking for something. For example, I'm talking about a site's internal search, where you can search for articles and the results are of course paginated.
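For reference, the two mechanisms work differently: robots.txt blocks crawling entirely, while a noindex tag only takes effect if the page can be crawled so the tag is actually read. A minimal sketch of each (the /search path is just a hypothetical example):

```
# robots.txt — stops crawlers from fetching the paginated search results at all
User-agent: *
Disallow: /search

<!-- meta tag in the <head> of each paginated page — page is crawled but not indexed -->
<meta name="robots" content="noindex, follow">
```

Note that combining both on the same URLs is counterproductive: if robots.txt blocks the page, the crawler never fetches it and so never sees the noindex tag.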