|ccTLDs, Duplicate Content, and Panda. What now?|
A couple of years ago there was a short thread [webmasterworld.com...] referencing Matt Cutts and some senior webmasters on deploying duplicate content across multiple TLDs. To me it seemed straightforward: it was OK. Google seemed to handle it quite well, based on observation. All seemed fine.
But now I'm in two minds. With Panda focused on UI and content, would you still strongly support duplicate content across multiple TLDs for regional ranking? Or would you reconsider your strategy in this area and fully localise the content on each ccTLD?
For me it depends on whether there's an actual regional presence. If there is, it should be reflected in the ccTLD sites. If not, I certainly wouldn't use ccTLDs to broaden a base that exists only for imagined promotional benefits.
That would be like putting up a directory of US cities on a widget site to target more local searches for widgets. It generally doesn't make sense to do that, unless there's actual local content that's really unique to the locale.
I expect that with ccTLDs, the bar for genuine local presence will be raised higher, with things like local links (and perhaps local hosting) becoming larger factors. I think localization of content on foreign sites has always been a factor.
Certain spellings might change... "glamour in the theatre" is different in the UK than here (probably beyond simple spelling ;) ...Addresses, phone numbers, etc. are obviously the lowest-hanging fruit for changes.
|That would be like putting up a directory of US cities on a widget site to target more local searches for widgets. It generally doesn't make sense to do that, unless there's actual local content that's really unique to the locale. |
Well, there's been a lot of that going on, especially on global brand sites with huge amounts of content. I wonder, as you say, how long it will be tolerated.
My sense is also that Google wouldn't really like content with no search traffic being pushed onto a ccTLD where it is less relevant, and that this might become, or might already be, a Panda factor. Does anyone have an opinion on this too?
in most cases your content on a ccTLD will be geotargeted, which essentially means google filters the urls on that ccTLD into that country's google index.
the "content duplication problem" isn't going to hurt your .com (or whatever your primary domain is) but the lack of unique, regionally targeted content will probably prevent you from ranking well in the country-specific google indexes.
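As a side note, Google's documented way of disambiguating near-duplicate regional pages is the rel="alternate" hreflang annotation, which tells it which URL is intended for which country so the variants aren't treated as ordinary duplicates. A minimal sketch, assuming hypothetical example.com, example.co.uk, and example.com.au domains serving the same widget page:

```html
<!-- In the <head> of each regional variant (domains here are hypothetical). -->
<!-- Each page should list every alternate, including itself. -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/widgets/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/widgets/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/widgets/" />
```

This complements, rather than replaces, geotargeting of the ccTLD itself.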
btw, this Webmaster Central Blog post mentions the geotargeting factors:
|Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible for all pages and variations from the start |
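For sites where editing every page template is impractical, Google also documents declaring the same alternate-URL relationships in an XML sitemap. A sketch with the same hypothetical domains as above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/widgets/</loc>
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="http://www.example.co.uk/widgets/" />
    <xhtml:link rel="alternate" hreflang="en-au"
                href="http://www.example.com.au/widgets/" />
  </url>
  <!-- Repeat a <url> entry for each regional URL, listing its alternates. -->
</urlset>
```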
Thanks phranque - I took the bolded section as strengthening my concern that simply serving up the same content on different ccTLDs might be considered spamming for the sake of ranking in those different regions. What Google seems to be saying is that unique, specifically targeted content is preferred.
My sense is to err on the side of caution if one is not already doing this. No idea whether this quality element plays into Panda, though.