Forum Moderators: Robert Charlton & goodroi
e.g.
French Widgets > Content to appear in .co.uk / .fr / .de /.com / .ca
South African Widgets > Content to appear in .co.za / .de / .com
Any ideas anybody?
Basically - I suspect Google is applying a duplicate content filter that extends to suppressing local results on multiple TLDs when there is no logical reason why they should not be allowed to rank. This is based on some sites I'm observing, which are seemingly "victims" of this behaviour by G.
Chris R said this : [webmasterworld.com...]
Tough question - ..... If you have genuinely unique content on each TLD - then you will probably be OK. The amount that has to vary is hard to say, and it isn't a set amount - like you said - if there is no competition - what can you expect.
I don't think anyone can answer your question exactly, but from a theoretical point of view keep in mind that Google likes to look at things from a user's point of view,
and from a user's point of view many would like to see those results at a local level using a local TLD.
LifeinAsia said : [webmasterworld.com...]
Google limits each site to 2 listings in the SERPs. So if no duplication penalty existed, everyone who had the #1 & #2 slots could just clone their site 4 times and monopolize the entire 1st page of results, with all the links showing the exact same content. This is worthless to users, so would quickly devalue the results from Google.
On the example I'm looking at, I am not seeing one result ranking on local TLDs which is duplicated on another TLD. So I suspect Google is filtering all the other TLDs, and affecting regional results as well.
So, if the content has no competition, and it is of value to the user, it should rank high regardless of whether it duplicates content on another TLD, since only one result will show locally. But I am seeing those results heavily filtered.
Chris R rightly says [ IMO ] that this is a "tough question" that "nobody can answer", but I think Google needs to explain how it applies duplicate content filters to multiple TLDs and regional results, for the benefit of siteowners managing and producing better regionalised results.
Hopefully, this makes my concerns more clear.
[edited by: Whitey at 1:36 am (utc) on May 16, 2007]
Very interesting! Does this mean Google's filtering is broken for this function? It looks like it.
[webmasterworld.com...]
Google is smart enough to sort this out and be visible in the proper respective countries.
I've been present when some of those assurances were offered, and have read the same as well. Unfortunately I have not seen really good solutions to the issue, and certainly 'trusting' the SE algos to get it right with largely undifferentiated content is NOT one of the best choices in this case. Plus, while G has at times indicated as RC says, at other times they've acknowledged issues in this area.
One of the common issues that comes up with the international sites I've been working with lately relates to this question, and the reality is, there are lots of problems with how this issue is getting sorted on the SE side of things. One version ranking and the other not, and duplicate content affecting both sites, are common complaints.
There is no silver bullet in this area AFAIK, but there are some options that can increase the chances of ranking well:
1) Stick with domains that are country specific. Those .com's were interpreted by the U.S. community as being somewhat U.S.-centric, but that is not true. They are international in nature and should be treated as such.
2) The best way to avoid issues -- if your choice is a .com plus one or more country specific domains -- is to offer sites in different languages. One current client of mine has a Spanish language site with a local domain, and a .com in English. They link conservatively between the two, or hide links on subpages where flipping between language pages makes sense, and have no problems.
3) Sites that contain few differences in language and page contents between local country domains and .com's are, like it or not, subject to potential issues. I'll be thrilled when that is no longer the case, but so far I don't see reliable SE interpretation in this area.
4) Certainly, getting local Web hosts helps mitigate the chances of issues. Put the .ca site on a Canadian host. Put the .com in the U.S. or in this case perhaps the UK.
5) If you can add lead-ins to the pages that are local or country specific, that helps too, as does the use of a local country address at the bottom of each page of the site.
6) Any other things you can do to differentiate are also helpful. Language differences; promotional items that can appear sitewide, etc.
Despite the good tips, I think this is basically saying that the handling of duplicate content filters on multiple TLDs is not working.
A .com site in Italian [ pages in Italian ] appears OK on both google.com and google.it, and the same goes for every other language we observe on those 2 search filter options. In our case we were not concerned with a more localised TLD presence at this stage.
But try this with similar / duplicate content on .com / .co.uk / .com.au / .co.za / .ca and there appears to be only one site functioning [ very well ]. And in light of Caveman's comments, I'm highly suspicious that the filters may not be working properly.
The hosting for all is in the US.
This could be one of your problems.
Also, where are the inbound links coming from? You might be sending confusing signals to Google by hosting all of these sites in the US, and then, say, linking to all of them from the same pages... or cross linking them.
In addition to the country-specific domains, in the case of English language sites, hosting locations and separation of inbound link sources can be key.
I suggest that the US site should have US hosting and primarily US inbounds, and that the others have geo appropriate hosting (or as close as you can get) and a predominance of inbounds from local tlds.
I'd also recommend minimal cross-linking between the sites (or, say, javascript linking).
One of the filtered sites has a healthy batch of long-established IBLs from its corresponding country region - but it is filtered out.
None of these sites are interlinked, but Google should be aware that there is a relationship between the sites, due to the same ownership, same Google webmaster account, same domain name + separate TLDs, same C block.
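The "same C block" signal can be illustrated with a small sketch: two IPv4 addresses are in the same C block when they share their first three octets (the same /24). This is only an illustration of the signal being described, the IP addresses below are documentation-range placeholders, not real hosting data.

```python
from ipaddress import ip_address

def same_c_block(ip_a: str, ip_b: str) -> bool:
    """True if two IPv4 addresses share the same /24 ("C block")."""
    a, b = ip_address(ip_a), ip_address(ip_b)
    # Dropping the last octet (8 bits) leaves the /24 network portion.
    return int(a) >> 8 == int(b) >> 8

# Hypothetical addresses for widget.com and widget.co.uk, both US-hosted:
print(same_c_block("203.0.113.10", "203.0.113.77"))   # same /24 -> True
print(same_c_block("203.0.113.10", "198.51.100.10"))  # different /24 -> False
```

Two ccTLD sites resolving to the same C block like this are an easy relationship signal for a search engine to pick up, even with no interlinking.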
I'm basically reiterating that there appear to be problems, which Caveman raised above and which I repeat here:
Caveman - per above quote - One of the common issues that comes up with the international sites I've been working with lately relates to this question, and the reality is, there are lots of problems with how this issue is getting sorted on the SE side of things. One version ranking and the other not, and duplicate content affecting both sites, are common complaints.
Some more voices from affected folks [ happy or unhappy ] would be helpful. Seemingly, this is a common and important problem.
The TLD should be enough IMO....... None of these sites are interlinked, but Google should be aware that there is a relationship between the sites, due to the same ownership, same Google webmaster account, same domain name + separate TLDs, same C block.
I think that, given this setup, Google has no choice but to assume that the "foreign" sites don't reflect a true international presence.
This kind of setup does happen with legit companies (which are prone to use their own hosting) all the time. It's just that it looks so much like an obvious attempt at manipulation that Google would have difficulty differentiating such manipulation, if that's what it was, from a genuine international business arrangement.
Google therefore looks for other evidence of a real international presence. Because there's a language overlap, linking and hosting are the two elements they might look at.
Between this thread and that other thread, tedster has implied, and Robert_Charlton and I have outlined, what we all seem to believe are the appropriate measures you can take. Unfortunately, they are not necessarily enough, and yet again the "do what makes sense for users" credo doesn't apply. Instead, sites are resorting to adding and/or changing content on pages that don't need changing, solely to address an issue that G is having trouble sorting out.
However I'd expect Google to emphasize the .com domain, in the event of this kind of duplicate content problem, rather than one of the country domains.
There may be some historical footprints on the sites which don't rank, but I wouldn't have thought this level of sophistication would be applied by G to choose one site over the other.
I imagine that G just sees two English language sites at different domains as dupes, tries to pick one, and then the normal algo elements apply. Go to G UK and type in a common ecommerce term associated with large sites beginning with "A" and you'll see all .co.uk sites, so clearly G has the ability to "see" both the .com and .co.uk versions and show the right one to the appropriate audience.
My guess, and again it's only a guess, is that G leans toward showing the site with the local TLD, as tedster said. Maybe sites with dominant link juice versus their twin can mess things up, especially if geo hosting is not lined up with the country reflected in the tld. The less authority and/or link juice a site has, the more sensitive it tends to be to certain aspects of the algos, and IMO that applies to how dup content gets treated in some respects.
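The guess above can be sketched as a toy model: among duplicate TLD versions of a site, prefer the one whose ccTLD matches the searcher's country, and fall back to whichever version has the most link authority. This is entirely hypothetical (the mapping, the scores, and the tie-break are invented to illustrate the thread's speculation, not Google's actual algorithm).

```python
# Hypothetical ccTLD -> country mapping; .com is treated as international.
CCTLD_COUNTRY = {".co.uk": "GB", ".com.au": "AU", ".ca": "CA", ".co.za": "ZA"}

def pick_version(duplicates, searcher_country):
    """duplicates: list of (domain, authority_score) for one duplicated site.
    Returns the single domain shown, per the toy model above."""
    def cctld_country(domain):
        for suffix, country in CCTLD_COUNTRY.items():
            if domain.endswith(suffix):
                return country
        return None  # no ccTLD match -> international

    local = [d for d in duplicates if cctld_country(d[0]) == searcher_country]
    pool = local or duplicates          # no local version? fall back to all
    return max(pool, key=lambda d: d[1])[0]  # strongest link authority wins

sites = [("widget.com", 90), ("widget.co.uk", 40), ("widget.ca", 30)]
print(pick_version(sites, "GB"))  # widget.co.uk - local TLD preferred
print(pick_version(sites, "US"))  # widget.com - no local match, strongest wins
```

Under this model, a .com with dominant link juice never outranks its local twin on local searches, which matches the guess that problems arise mainly when other signals (like hosting) contradict the TLD.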
You should try surfing via a proxy server. That's always interesting.
So we've focused on breaking the sites up via content addition, addresses, hosting locations, backlink differentiation, etc.
Was this an existing site with history that you tidied up, or a new site, and did it do the job your client wanted in the end?
If it was a tidy-up job, did Google's filters respond quickly to the changes, as they do with other duplicate content issues?
I wondered also if you were working with the same domain brand name [ as we are ], e.g.
widget.com
widget.co.uk
widget.ca
The content inventory available in local search, and the quality of results, is therefore badly diminished for the user.
Then there are many risks associated with this unresolved issue. For example, if a siteowner puts up several regional subsidiary sites and strengthens them beyond the main site, they run a big risk of tanking the main site.
..... a real burning issue for G to both communicate and fix -IMO
My guess is that Google's algorithm doesn't accept that substantially similar content belongs in country localised searches on more than a few domains.
There may be something in that -- and it matters by what method they determine what "substantially similar" means. Something about that measurement seems to have shifted. Are they possibly looking more at phrase co-occurrence and less at simple text match?
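One common way "substantially similar" gets measured (whether or not it is what Google does, which is unknown) is Jaccard similarity over word n-gram "shingles": two pages whose shingle sets overlap heavily are near-duplicates even if no long text run matches exactly. A minimal sketch, with invented example sentences:

```python
def shingles(text, n=3):
    """Set of word n-grams ("shingles") for a piece of text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity of the two shingle sets: |A & B| / |A | B|."""
    sa, sb = shingles(a, n), shingles(b, n)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Two regional pages differing only in the country name:
uk = "Buy blue widgets online with free delivery across the UK"
au = "Buy blue widgets online with free delivery across Australia"
print(round(similarity(uk, au), 2))  # -> 0.67
```

A threshold on a score like this would explain why swapping a country name and an address block is not enough differentiation, while genuinely rewritten local content is, though where any real cutoff sits is pure speculation.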