Forum Moderators: Robert Charlton & goodroi


Excluding duplicated content from being spidered/indexed by googlebot via robots.txt. Does that help remove the ban?


moftary

5:51 am on Aug 9, 2005 (gmt 0)

10+ Year Member



Hello,

After the July 22nd and July 28th massacres, many sites were affected by Google's new duplicate-content filter algorithm. A few of my sites were banned for that reason and have failed to be re-included in Google's index after I sent a reinclusion request.

The supposedly duplicated content, in the eyes of Google, is actually very useful to my sites' visitors.

Now, will forcing googlebot not to crawl/index that duplicated content using robots.txt bring my site(s) into compliance with Google's webmaster guidelines?
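For what it's worth, what I have in mind is something like the following robots.txt, placed at the site root. The /duplicate-content/ path is just a placeholder for wherever the duplicated pages actually live on my sites:

```
# Keep Googlebot out of the (hypothetical) directory holding the duplicated pages
User-agent: Googlebot
Disallow: /duplicate-content/

# Leave all other crawlers unrestricted
User-agent: *
Disallow:
```

My understanding is that robots.txt only stops the pages from being crawled; it doesn't necessarily remove them from the index if they're already there. So I'm not sure whether this alone would satisfy the guidelines or lift the ban.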