
Professional Webmaster Business Issues Forum

    
How To Handle Country Codes For Your Site
murmy
msg:4007196
10:47 pm on Oct 14, 2009 (gmt 0)

Let's say you have an English-language site, and you have country-code domains in other English-speaking countries. How do you stop those additional sites being considered duplicate content by Google?

E.g. you have a .com and a .us, and they contain some similar or identical content in the English language.

 

httpwebwitch
msg:4007551
1:34 pm on Oct 15, 2009 (gmt 0)

Not to mention co.uk, com.au, .ca, .bb, .bs, .za ... ?
Good question! This is a common concern.

I can see that if you went to the trouble of creating distinct sites for different English-speaking TLDs, you would want them to rank appropriately geolocated, for instance in google.ca. There is some sketchy [webmasterworld.com] evidence that *.ca domains fare better in localized SERPs. Therefore you do want the *.ca site indexed, not 301-redirected over to the .com.

But then you're smacked with dupe content penalties.
Seems like a catch-22 to me.

Check out this previous thread exploring the same issue [webmasterworld.com], which links to another discussion [webmasterworld.com] which references several [webmasterworld.com] others [webmasterworld.com]. Lots of reading. Get started :)

jdMorgan
msg:4007622
3:51 pm on Oct 15, 2009 (gmt 0)

> dupe content penalties.

Unless there is duplication across a massive number of URLs, I see no evidence of any penalties -- most of the talk of penalties is simply fear and unreasonable doubt. Many if not most "home pages" can be reached by no fewer than 8 different URLs -- www versus non-www, "/index.htm" versus "/", FQDN- versus non-FQDN-format domain, etc. -- so it would be folly if Google or another SE tried to penalize a site based solely on "same content at more than one URL."

They might indeed do this, but only if the duplication is massive and there are unmistakable indications that the duplication was done intentionally. But applying true penalties for duplicate content would affect a large majority of sites on the Web, because those sites' owners have made no effort whatsoever to force domain and URL canonicalization -- and indeed, have probably never even heard of those terms.
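For anyone wondering what "forcing canonicalization" looks like in practice, here is a minimal sketch in Python (Flask) -- the hostname is just a placeholder, and on Apache you would normally do the same job with mod_rewrite rules in .htaccess:

# Minimal canonicalization sketch: collapse the common URL variants
# (non-www host, trailing "/index.htm") onto one canonical URL with a 301,
# so the "same page" no longer lives at several different URLs.
from flask import Flask, redirect, request

app = Flask(__name__)
CANONICAL_HOST = "www.example.com"  # placeholder hostname

@app.before_request
def canonicalize():
    host = request.host.lower()
    path = request.path
    if path.endswith("/index.htm"):
        path = path[:-len("index.htm")]  # "/foo/index.htm" -> "/foo/"
    if host != CANONICAL_HOST or path != request.path:
        return redirect("http://" + CANONICAL_HOST + path, code=301)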

What *does* happen is that the two (or more) URLs compete against each other for links and other ranking factors; If their content is very similar and competition in the target market is fierce, then this competition can naturally push both URLs down in the SERPs. It's an over-simplification, but if you split the "ranking power" of essentially the same "page" across multiple URLs, then the potential rank of that "page content" gets divided among the URLs, and must therefore be lower than it would otherwise be if the page-content was totally unique and existed at only one URL.

Point being, it's not an SE-imposed "penalty" specifically aimed at your sites, but rather the natural consequence of the fact that search engines work based on URLs, the content found at those URLs competes against all similar content on the Web, and a single-character change in what you see in your browser address bar makes it a completely-different URL.

Jim

mack
msg:4007626
3:59 pm on Oct 15, 2009 (gmt 0)

I think the duplicate content issue here is very real. The safest option would be to build each site from a clean slate so that each site is very different, although it serves the same purpose.

This approach is the only real option when you develop country level sites that are based on a different language.

Another potential issue with multiple sites is ensuring the relevant site ranks in front of its intended users. For example, you build the .com intended mainly for US users, and the .co.uk for UK users. What if the .co.uk outranks the .com on Google?

One way around this is to base all regional sites within subdomains or folders of the main domain. Let's say you have the main .com site. When a user arrives, the site works out their country from their IP address and forwards them to the related country page. For example, if I am from the UK and see your site in the SERPs, when I click through I am forwarded to example.com/uk or uk.example.com.

This is quite a common approach, but it does have its downsides. It could be seen as cloaking, but this is a perfect example of cloaking being used for the right reasons. I would tend not to use xy.example.com, because that may be seen as a different site and be crawled in its own right, which again would cause a duplicate content issue.

As for your existing domains, set them up as 301 redirects to their locations: example.co.uk redirected to example.com/uk, and so on.
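To make that concrete, here is a rough sketch in Python (Flask) of the sort of forwarding I mean. lookup_country() is just a stand-in for whatever GeoIP database you use, and the hostnames and folders are only examples:

from flask import Flask, redirect, request

app = Flask(__name__)
COUNTRY_FOLDERS = {"GB": "/uk", "AU": "/au"}  # ISO country code -> regional folder

def lookup_country(ip):
    # Placeholder GeoIP lookup - swap in a real database or service here.
    return "GB"

@app.before_request
def regional_routing():
    host = request.host.lower()
    # 301 the ccTLD domains into their folder on the main .com
    if host.endswith("example.co.uk"):
        return redirect("http://example.com/uk" + request.path, code=301)
    # Visitors landing on the .com homepage get forwarded by IP-based country
    if host.endswith("example.com") and request.path == "/":
        folder = COUNTRY_FOLDERS.get(lookup_country(request.remote_addr))
        if folder:
            return redirect(folder + "/", code=302)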

The question now is what we allow the bots to crawl. My suggestion would be to allow everything on the main .com site to be spidered, but block the country-level folders in robots.txt, except the non-English folders. If you have good anchor text links pointing to the regional folders, the index page will still show up in the SERPs even though the page won't be crawled. It will simply be shown as a title with no text snippet: "View site name version for the UK".
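And for the robots.txt side of it, something along these lines would do (the folder names are only examples):

# Sketch: build a robots.txt that leaves the main .com crawlable but blocks
# the duplicate-English country folders; non-English folders (e.g. /fr/, /de/)
# are not listed, so they stay crawlable by default.
english_country_folders = ["/uk/", "/au/", "/ca/"]  # same-language duplicates

lines = ["User-agent: *"]
lines += ["Disallow: " + folder for folder in english_country_folders]

with open("robots.txt", "w") as f:
    f.write("\n".join(lines) + "\n")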

This is just my suggestion, I'm sure others will have very different ideas. There is no simple approach to this issue.

Mack.

tedster
msg:4007643
4:26 pm on Oct 15, 2009 (gmt 0)

Most of the talk of penalties is simply fear and unreasonable doubt.

Exactly. I don't hear stories or see evidence about such "penalties" actually happening, but there is an epidemic of concern from site owners right now, and that clouds the picture for many.

A true duplicate PENALTY is very rare and requires a clear-cut attempt at deception, involving other violations of the search engine guidelines. I work with several multi-national businesses and they do not get penalties for this kind of practice.

Instead of penalties, there is filtering that keeps any one business from dominating the results in any geographic area. Here's how that works.

If you have country code top-level domains (ccTLDs) with duplicate content, then only one version will be shown to searchers from each country, and the other sites will be filtered out. For example, your .za would show to English searches made from South Africa, and your .com to English searches made from countries where you do not serve a ccTLD.

It is a good idea to customize the language to the idioms and spelling that are used in each country where you have a presence. This helps your credibility and it helps your ccTLD gain both back-links and business from that country. In other words, if you want to rank well in a particular country, then make it clear that you really do serve that country.

[edited by: tedster at 7:40 pm (utc) on Oct. 15, 2009]

phranque
msg:4007727
5:37 pm on Oct 15, 2009 (gmt 0)

these resources may be helpful:
Official Google Webmaster Central Blog: Where in the world is your site?:
[googlewebmastercentral.blogspot.com...]
Official Google Webmaster Central Blog: Better geographic choices for webmasters:
[googlewebmastercentral.blogspot.com...]

inbound
msg:4007865
9:17 pm on Oct 15, 2009 (gmt 0)

My experience has been that getting lots of links to the localised site from that country is the way forward; it helps the search engines decide which site is the right one. Any global links (ones that come from sites that themselves serve many countries, if you are lucky enough to get featured on large properties) should go to the .com, and the .com should have links to the local sites as part of the template.

I think the way search engines see it is that if you are truly big enough to have localised sites, then you should be big enough to get enough links to all of them. Once you pass an (unfortunately undefined) point, you'll see the authority status clearly.

It does not matter if you only have 1 site showing in one territory, as long as it's the right one and it ranks well.

anallawalla
msg:4007885
9:43 pm on Oct 15, 2009 (gmt 0)

The safest option would be to build each site from a clean slate so that each site is very different, although it serves the same purpose.

and
Instead of penalties, there is filtering that keeps any one business from dominating the results in any geographic area

Couldn't agree more, but multinationals don't often have the appetite for such things.

This reminds me of a former employer's overseas sites. They don't have as many ccTLD domains active now, but compare:

  • site:www.example.com (main site, 2580 pgs)
  • site:www.example.com.au (45 pgs)
  • site:www.example.se (9 pgs)
  • site:www.example.es (138 pgs)
  • site:www.example.dk (46 pgs)
  • site:www.example.co.uk (99 pgs)
  • site:www.example.fr (164 pgs)
  • etc

These sites have the same content, more or less. More recently they have made some attempt to use local languages in FR and ES but there is still repeated English content.

stephen186
msg:4008137
11:01 am on Oct 16, 2009 (gmt 0)

Just a thought... I have heard Google has the ability to check the Whois records for domains, so if you are using the same content for each TLD and all the domains are registered in the same name, then it should not be of any concern.

gn_wendy
msg:4008232
2:16 pm on Oct 16, 2009 (gmt 0)

One way, if you have a well-structured system for managing content, is to set universal pages to INDEX on your main domain, and set noindex (or a canonical preference) on the other domains. Pages aimed at the .co.uk audience then get the INDEX, and the main page is NOINDEX for that page/content...
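Roughly like this, if you sketch the rule in Python (the paths and hosts below are made-up examples):

# Each page records which domain "owns" it; that domain serves "index" and
# every other domain serves "noindex" for the same content.
PAGE_OWNER = {
    "/delivery-info": "example.co.uk",  # page aimed at the .co.uk audience
    "/about": "example.com",            # universal page owned by the main domain
}

def robots_meta(host, path):
    owner = PAGE_OWNER.get(path, "example.com")
    return "index,follow" if host == owner else "noindex,follow"

# e.g. robots_meta("example.co.uk", "/about") -> "noindex,follow"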

depends on the case, but it has worked well for me in the past.

Whitey
msg:4008513
11:12 pm on Oct 16, 2009 (gmt 0)

These sites have the same content, more or less. More recently they have made some attempt to use local languages in FR and ES but there is still repeated English content.

If we're talking Google here, I think that the management of geo-related content is far from perfect.

I've closely observed a multinational network of sites for a business and seen them remove all the erroneous EN content from their Chinese, French, German and Spanish sites. They also removed all the interlinks between sites from pages with the same content. The experiment was a disaster for traffic and sales, simply because all of that content and linking had been working. There were no issues with duplicate content filtering.

Something tells me that Google is far from getting its geo search right. However, one thing is also clear to me, and that is that it totally removes the capacity of one site to dominate its category on a multinational basis - and I guess that's what G wants, to improve competitiveness. I'm not sure that it balances with good SERPs though.

lorax
msg:4008656
12:41 pm on Oct 17, 2009 (gmt 0)

I think it's important to bear in mind that there is a difference between a "penalty" for duplicate content and diluting the effectiveness of your content by duplicating it.

AjiNIMC
msg:4008947
4:50 am on Oct 18, 2009 (gmt 0)

When I look at duplicate content, I see the following issues:

1) .fr and .com can be owned by different owners, so there is no way that Google can promote the .fr in the French engine instead of the .com.

2) If .fr and .com contain the same content, why would Google index both and confuse users with the same content under two different domains?

So the purpose is to have a different language, different layout and different content to cater to each segment.

I wish there were a way in robots.txt to tell Google about your preference of sites for different geo locations. Maybe it can be added later.

Example:

Country in 1-example.in,2-example.com,3-example.co.in

I can see a lot of pros and cons.

jkovar
msg:4008969
8:28 am on Oct 18, 2009 (gmt 0)

Are all the sites using a global template? Could you use a canonical meta element?

phranque
msg:4009256
6:58 am on Oct 19, 2009 (gmt 0)

the canonical link element does not support specification of URLs on external domains:
[googlewebmastercentral.blogspot.com...]
[bing.com...]
[ysearchblog.com...]

seo marketing
msg:4016725
10:49 am on Oct 31, 2009 (gmt 0)

I feel that you should produce separate content for different TLD domains. It would be the best option.

If your website has 10-20 pages, then you can try creating unique content for the homepage and using the same content for the inner pages.
