Msg#: 3907157 posted 3:11 pm on May 5, 2009 (gmt 0)
Have been working on a project about "blue widgets" built on an eight-year-old, rusty CMS with tons of duplicate content. My first task as SEO for this project was to remove all the duplicate content, since duplicates have caused me trouble on other projects I'm involved in.
All of a sudden, once the duplicates were out, traffic fell dramatically and I could not believe my eyes.
I left it that way for nearly a month, hoping the site would be re-indexed properly since I had also made some architectural and structural changes. Nothing.
Last week I re-included all the duplicates by changing my robots.txt file, and traffic surged back to normal levels again!
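For context, this is roughly the kind of robots.txt change involved — a minimal sketch, and the URL patterns here are hypothetical placeholders, not the actual site's paths:

```
# Before: duplicate URL patterns were blocked (paths hypothetical)
User-agent: *
Disallow: /print/
Disallow: /*?sort=

# After: the Disallow lines removed, letting the duplicates back in
User-agent: *
Disallow:
```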
Isn't that weird? Has anyone experienced anything similar in the past?
This is an eight-year-old "authority" site in its specific country, ranking for every generic "widget" keyword I could come up with.
Msg#: 3907157 posted 6:59 pm on May 5, 2009 (gmt 0)
I have seen major traffic drops when people block the url version that is the most prominently indexed. You might get better results if, instead of blocking, you 301 redirect the duplicate urls to the preferred canonical version -- or use the new canonical meta tag.
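For anyone wanting to try the 301 route: assuming an Apache server with mod_rewrite enabled, a rule like this in .htaccess would do it (the /print/ duplicate pattern is just a hypothetical example, not this site's actual structure):

```
RewriteEngine On

# Permanently redirect the hypothetical /print/ duplicates
# to the canonical version of the same page
RewriteRule ^print/(.*)$ /$1 [R=301,L]
```

The 301 passes the old URL's link weight to the canonical version, which is why it usually beats simply blocking the duplicates.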
Msg#: 3907157 posted 7:23 pm on May 5, 2009 (gmt 0)
Wish it were that easy... Redirects are currently impossible with the site's CMS. I have also added canonical tags and told Google not to index URLs with parameters, but that reduced traffic to its lowest levels in years.
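For reference, the canonical tag is just a line in the head of each duplicate page pointing at the preferred URL — example.com here is a placeholder:

```
<link rel="canonical" href="http://www.example.com/blue-widgets/" />
```

Unlike a 301, it's only a hint to the engines, which may be why it behaves less predictably.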
For the moment I am leaving the duplicates in place and have asked for a new CMS... Just wanted to share it here because it's the first time I've come across this issue.