---- Should I disallow search results pages from my website?
tedster - 4:46 am on Sep 29, 2012 (gmt 0)
Is it bad to have over 3,000 pages blocked by robots.txt?
Not at all. In fact, since a robots.txt Disallow rule acts as a prefix match (it means "do not crawl any URL that begins like this"), even one Disallow rule can block a potentially infinite number of URLs.
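For example, if the site's search results all live under a path like /search (a hypothetical layout; adjust the path to match the site's actual URL structure), a single rule covers every one of them:

```
User-agent: *
Disallow: /search
```

That one line blocks /search, /search?q=widgets, /search/page/2, and any other URL whose path begins with /search, no matter how many thousands of such URLs exist.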