DodgeThis - 10:23 am on Dec 23, 2012 (gmt 0)
We have several folders on our site that at one time contained thin content. Everything within them has since been set to return a 410 (Gone) status, but given the sheer number of pages involved, we are looking for a faster way to tell Google that everything has been deleted, rather than waiting for it to recrawl each page.
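To illustrate how the 410s are served (a sketch only, assuming Apache; the folder names below are made up, not our real paths):

    # .htaccess: return 410 Gone for everything under these folders
    Redirect gone /old-thin-folder-1/
    Redirect gone /old-thin-folder-2/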
Google advise using robots.txt to deny access to certain paths, but we have also heard that this can look suspicious, as if we were trying to hide thin content from them.
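If we did go down the robots.txt route, it would presumably be something like this (paths again illustrative):

    User-agent: *
    Disallow: /old-thin-folder-1/
    Disallow: /old-thin-folder-2/

Although, as we understand it, a disallowed path is never recrawled, so Googlebot would never actually see the 410s behind it.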
Most of the pages were 410ed last year after a Panda strike, and the rest earlier this year, yet Webmaster Tools is still discovering them. Recent events have led us to suspect that Panda's tolerance is tightening and that we are falling back into its clutches because of the perception of our site rather than the reality.
We would be most grateful for any guidance from others who have been through something similar. Thank you.