Forum Moderators: Robert Charlton & goodroi


Duplicate Content Penalty Or Something Else?


MoogGT

10:49 am on May 17, 2006 (gmt 0)

10+ Year Member

(Sorry for the huge post, but I think you need to know all the facts on this one.)

We have experienced a huge drop in the SERPs this week. I suspect that we have some form of duplicate content penalty, but we haven't changed anything since late last year. Let me explain...

Up until last year we had a web site operating from XXX.co.uk. However, the products and prices we list differ depending on your country: Product A in the UK might be called Product B in the US. We decided to set up a regionalised site on XXX.com with the subdomains...

uk.XXX.com
us.XXX.com
au.XXX.com
etc

The regionalised sites are the same in structure and virtually the same in content, except for the occasional difference in product names. All prices are totally different due to currencies. We had good rankings for the site operating from XXX.co.uk, so we decided to leave that as the site listed on search engines, and to avoid any duplicate content penalties we blocked all robots from the XXX.com sites with a robots.txt file excluding everything (see the sketch below).
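For reference, "a robots.txt excluding everything" has a single standard form; presumably something like this sat at the root of each XXX.com subdomain:

    # Block every crawler from the entire site
    User-agent: *
    Disallow: /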

To get new users coming in from XXX.co.uk to the regionalised site, a popup window appears that asks for your location, with a list of the regional domains. The popup appears over the top of the page content in a floating layer and is shown using Javascript, therefore the HTML code in the popup window is not part of the page content. This was all set up mid last year and seemed to work well.
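A minimal sketch of that kind of floating-layer popup, assuming hypothetical element styling and link paths (the thread doesn't show the real markup):

    // Minimal sketch of a floating-layer country chooser.
    // Styling, positioning, and link paths are hypothetical placeholders.
    function showCountryPopup() {
        var layer = document.createElement('div');
        layer.style.position = 'absolute';
        layer.style.top = '80px';
        layer.style.left = '80px';
        layer.style.background = '#fff';
        layer.style.border = '1px solid #000';
        // Plain HTML text links: the user must click; nothing redirects
        // automatically. As clarified later in the thread, the real links
        // go via a forwarding page on the .co.uk site.
        layer.innerHTML =
            '<p>Please select your location:</p>' +
            '<a href="/forward?region=uk">United Kingdom</a><br>' +
            '<a href="/forward?region=us">United States</a><br>' +
            '<a href="/forward?region=au">Australia</a>';
        document.body.appendChild(layer);
    }
    window.onload = showCountryPopup;

Injected this way (especially from an external .js file), the popup markup is not part of the static HTML that a crawler fetches and indexes.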

In September 2005, out of the blue, I get an email from the Google AdWords team; they want to suggest some changes to the AdWords ads listed on our site. We make the changes, and during the email conversation I ask why we don't do very well with the regional sites and AdWords. Google reply that it is because we have robots.txt blocking them all. To which I reply: won't that infringe duplicate content rules, though? Google think not, so the robots.txt block is lifted some time during September 2005.

Nothing much has changed since then; listings for XXX.co.uk have been stable for our keywords, sometimes no. 1 but usually around positions 2-4, and always on page 1.

Then this week I notice a HUGE drop in traffic. Investigation reveals that referrals from Google have dropped significantly. We have never had that much traffic from MSN or Yahoo, but these seem about the same. Checking one of our best terms, I find that our top 3 position has been relegated right to the bottom, on page 180. I've checked other main terms and it's a similar story: our primary domain XXX.co.uk is now at the bottom. What is strange is that I can still find more specific terms with a good position, so it only appears to be affecting certain pages, not the site as a whole.

Does anybody have any thoughts on this? Is this a duplicate content penalty or something else?

jonrichd

1:00 am on May 18, 2006 (gmt 0)

10+ Year Member

MoogGT, Welcome to WebmasterWorld [webmasterworld.com]. I can't be absolutely sure what is causing your problems, but I'll be happy to venture some suggestions.

First of all, when multiple domains show the same content, typically one domain will have rankings, and the others won't show up. This would argue against the duplicate content problem, since one would think that your main .co.uk domain would rank well, and the other domains would not.

However, you have thrown an interesting wrench (spanner?) into the mix: the use of subdomains, rather than separate unique domains. Subdomain spam has become an issue in recent months, and it's possible that you are caught in a new filter that didn't exist back in the fall when you had your email exchange with Google.

Another possibility is your use of Javascript to (indirectly) redirect users to a different location. Google has been penalizing sites that use Javascript redirects to take the user from a doorway page to the desired page. I know that's not what you're doing, but possibly it now looks that way to G.
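For contrast, the doorway-page pattern that gets penalized is an automatic script redirect that fires on load, with no user action; a one-line illustration (the URL is just a placeholder):

    // Doorway-style Javascript redirect: the visitor is forwarded
    // automatically on page load, with no click required.
    window.location.replace('http://www.example.com/real-page.html');

A popup that merely displays links and waits for a click is a different thing, but an automated filter may not always tell the two apart.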

A third possibility is that you're doing something not quite white hat, and you've been caught. You might want to get a Google Sitemaps account, and see if that tells you anything.

Finally, it might just be a Google temper tantrum. I've had sites with experiences similar to yours -- dropping from the top ten to infinity, except for some very specific searches -- that magically regained their positions after months, without my having to do anything.

I know this doesn't give you the absolute answer that you were probably hoping to see, but I hope it helps.

MoogGT

8:45 am on May 18, 2006 (gmt 0)

10+ Year Member

Thanks for your reply. To fill in the blanks...

>when multiple domains show the same content, typically one domain will have rankings, and the others won't show up.

This is exactly what happened from September until now.

> the use of subdomains, rather than separate unique domains. Subdomain spam has become an issue in recent months, and it's possible that you are caught in a new filter

All our subdomains operate on different IP addresses, but within the same subnet. They each have a different logo with a different filename, and certain parts of the page template are jumbled around, so there is never 100% duplication.

> Another possibility is your use of Javascript to (indirectly) redirect users to a different location. Google has been penalizing sites that use Javascript redirects to take the user from a doorway page to the desired page.

We only use Javascript to show the popup; we don't force the links to be followed. The links themselves are straight HTML text links to a forwarding page on the .co.uk site. The links do not point directly to the subdomains. To clarify...

1) User arrives at XXX.co.uk; the page has the same content as Googlebot crawls.
2) Javascript is used to add a popup window (in a div/layer) with a list of countries.
3) User clicks on the country they are in.
4) The clicked link goes via [XXX.co.uk...] and forwards the user to the same page on XXX.com. This forwarding page is blocked using robots.txt (see the sketch after this list).
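As a sketch of that blocking (the real forwarding path is elided above, so /forward here is a hypothetical placeholder), the robots.txt on XXX.co.uk would carry an entry along these lines:

    # Hypothetical entry in the XXX.co.uk robots.txt: crawlers are kept
    # off the forwarding page, while users can still click through it
    User-agent: *
    Disallow: /forward

The forwarding page itself then sends the visitor on to the matching page on the chosen regional subdomain.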

I'm not trying to deceive Google or anybody with this method. The only purpose is to get the user to the regionalised version of the page they requested.

> A third possibility is that you're doing something not quite white hat, and you've been caught. You might want to get a Google Sitemaps account, and see if that tells you anything.

Nothing new to report in Sitemaps; still the usual (non-critical) errors. It still says that all our pages (including the XXX.com domains) are in the index.

> that magically regained their positions after months, without my having to do anything.

Months?!?!? The site would be dead and buried by then :(

Thanks for your answers, but if anybody has any further thoughts please share them.

MoogGT

8:08 am on May 19, 2006 (gmt 0)

10+ Year Member

I thought I'd post an update in case anybody is seeing a similar situation.

All has gone back to normal today with our rankings in the SERPs. I haven't changed anything; all is exactly as it has been since September. So my only conclusion is that Google was reindexing our site, which took a few days.

I have been watching the number of pages indexed by Google using the site:XXX.co.uk command, and it has been rising by the day.

A huge relief, as I'm sure you can appreciate; so many of us rely on Google for good traffic. Days without Google like this are certainly a wake-up call!