| 9:30 pm on Mar 12, 2010 (gmt 0)|
There's one challenge that a website might run into that wasn't addressed in the article -- the difference between targeting a language and targeting a region or country, and how those two goals can get tangled.
I know of websites that have done a good job here by using their URL structure to include two signals at the directory level, one for the language and one for the country.
So for instance, example.com/de/ch/page.html would indicate German language content targeted to Switzerland, and example.com/en/za/page.html would mean English language content targeted to South Africa.
Webmaster Tools allows the site owner to set separate geotargeting for each subdomain or subdirectory (something that many webmasters do not yet appreciate), so this dual directory approach can often be a help in keeping things straight. As Mueller points out in the opening sentence, "...a majority of users surveyed feel that having information in their own language [is] more important than a low price."
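To make the dual-directory idea concrete, here's a rough sketch of what the head of a page like example.com/de/ch/page.html might carry (the title and URL are just placeholders, and the country signal itself would still come from the Webmaster Tools setting):

```html
<head>
  <title>Beispielseite</title>
  <!-- language signal: German-language content -->
  <meta http-equiv="Content-Language" content="de">
  <!-- self-referential canonical for the Swiss version of the page -->
  <link rel="canonical" href="http://example.com/de/ch/page.html">
</head>
```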
| 10:44 pm on Mar 12, 2010 (gmt 0)|
This is a complex subject and I appreciate JohnMu raising it in such detail.
I've wondered why Google restricts the geotargeting of specific URLs to one additional country and doesn't consider opening this up.
Maybe some day Google will consider displaying a ccTLD in the global results, in conjunction with webmasters, with more systematic reliability than it currently does. It seems a bit hit and miss at the moment.
Should Google open this up via WMT and allow webmasters much more flexibility in providing signals via this panel, and/or provide tag recognition to signal the territories to be targeted? Managing duplicate content would seem to be a lot easier with this.
| 7:28 am on Mar 14, 2010 (gmt 0)|
What I do not understand is where, specifically, we set geotargeting in Webmaster Tools for URLs like site.com/de/ mentioned in the article.
I can only see where to change the geotargeting for the entire domain.
| 1:57 pm on Mar 14, 2010 (gmt 0)|
I believe that this is done by making the subfolder a virtual site, with its own verification code. You can then set the geotarget for the virtual site.
| 8:26 pm on Mar 14, 2010 (gmt 0)|
I think easy HTML standards for webmasters are already in place, designed to give all search engines good indicators of when to serve particular page content as regional.
The www.w3.org recommends that each regional page include the following META tags in the head section of the HTML code if you want to serve your content, for example, in Germany:
<META name="keywords" lang="de" content="auto, automagazin, automarkt">
<META name="distribution" content="local">
<META http-equiv="Content-Language" content="de">
Plus you can use the Google-accepted and recommended:
<META name="google" content="notranslate">
<link rel="canonical" href="http://www.website.com/de/">
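Put together, the head of such a German-language page might look something like this (the title is a placeholder; whether search engines actually honor each tag is a separate question, as discussed below):

```html
<head>
  <title>Auto Magazin</title>
  <meta name="keywords" lang="de" content="auto, automagazin, automarkt">
  <meta name="distribution" content="local">
  <meta http-equiv="Content-Language" content="de">
  <meta name="google" content="notranslate">
  <link rel="canonical" href="http://www.website.com/de/">
</head>
```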
All search engines should follow www.w3.org standards and clearly indicate that to webmasters, so we would not have problems like this.
| 8:29 pm on Mar 14, 2010 (gmt 0)|
Along those lines, I was surprised by the following comment in Mueller's article:
|...we do not use locational meta tags (like "geo.position" or "distribution") or HTML attributes for geotargeting. While these may be useful in other regards, we've found that they are generally not reliable enough to use for geotargeting. |
| 11:35 pm on Mar 14, 2010 (gmt 0)|
META tags are an accepted web standard; used as basic head elements of every page, they are a helpful part of creating a long-lasting, searchable, and organized web document structure.
Personally, for the long term, I would not risk a web page not having META tags indicating region, language, author, and copyright.
| 1:50 am on Mar 15, 2010 (gmt 0)|
So why doesn't Google consider them?
Even if they are not reliably used, like anything else on the web, they would be a further "signal", surely no less than anything else out there.
| 5:41 am on Mar 15, 2010 (gmt 0)|
The way I understand it, Google monitors just about every signal we can imagine (and some we can't). That's why they can say whether a signal is "reliable" or "noisy" or whatever.
So in that sense, they do consider everything -- but it only makes sense to build out automated infrastructure that supports a particular signal when it shows a statistical degree of dependability.
In their relevance algo, for example, this process is responsible for the shifting values of the H1 element. So I imagine they will continue to monitor the location meta tags, and if they show more dependability in the future, then they would get folded into the brew.
| 7:57 pm on Mar 15, 2010 (gmt 0)|
This is interesting to me because I've been doing some translations of my site into Japanese and Thai. It's a very expensive endeavour, so it's important to get the technicalities right or you'll be flushing half of your money down the toilet (or worse). I've got server headers giving the right Content-Language and also the declarations at the top of the source set up appropriately. The way I'm going about it is a .jp TLD for the Japanese version etc., and interlinking the individual pages to the different versions... my main worry, however, is an interlinking penalty, even though this is a legitimate use of multiple domains.
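For what it's worth, the setup described -- a Content-Language response header, matching declarations in the source, and interlinked language versions -- might look roughly like this (the domains are hypothetical placeholders; hreflang is a standard HTML 4.01 attribute on links):

```html
<!-- Served from the .jp domain with an HTTP response header of:
     Content-Language: ja -->
<html lang="ja">
<head>
  <meta http-equiv="Content-Language" content="ja">
  <title>ページタイトル</title>
</head>
<body>
  <!-- interlinking to the other language versions of the same page -->
  <a href="http://www.example.com/page.html" hreflang="en">English</a>
  <a href="http://www.example.co.th/page.html" hreflang="th">Thai</a>
</body>
</html>
```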