Forum Moderators: Robert Charlton & goodroi
After the July 22nd and July 28th massacres, many sites were affected by the new Google dupe filter algorithm. A few of my sites were banned because of it and failed to be re-included in Google's index after I sent a reinclusion request.
The content that Google apparently considers duplicated is actually very useful to my sites' visitors.
Now, will blocking Googlebot from crawling/indexing that duplicated content via robots.txt bring my site(s) into compliance with the Google Webmaster Guidelines?
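Something like this is what I have in mind (the /duplicates/ path is just a placeholder for my actual directories):

User-agent: Googlebot
Disallow: /duplicates/

As I understand it, a Disallow line only stops Googlebot from crawling those URLs; Google may still show a blocked URL in results if other pages link to it, so I'm not sure this alone counts as removing the duplicate content in their eyes.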