www.example.com (launched April '08)
fr.example.com (launched June '08)
de.example.com (launched June '08)
jp.example.com (launched June '08)
sg.example.com (launched June '08)
Content is localized dynamically depending on the browser's language setting.
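Browser-based localization of this kind normally keys off the Accept-Language request header. A minimal sketch of such negotiation (illustrative only; the available language set is an assumption based on the subdomains above, and this is not the site's actual code):

```python
# Hypothetical negotiation logic -- the available languages are an
# assumption based on the subdomains listed above.
AVAILABLE = {"en", "fr", "de", "ja", "zh"}
DEFAULT = "en"

def negotiate(accept_language):
    """Pick the best available language for a browser's Accept-Language header."""
    if not accept_language:
        return DEFAULT                     # no stated preference -> English
    weighted = []
    for part in accept_language.split(","):
        lang, _, q = part.strip().partition(";q=")
        try:
            weight = float(q) if q else 1.0
        except ValueError:
            weight = 0.0
        # "fr-FR" and "fr" both map to the primary tag "fr"
        weighted.append((weight, lang.split("-")[0].lower()))
    for _, lang in sorted(weighted, key=lambda w: -w[0]):
        if lang in AVAILABLE:
            return lang
    return DEFAULT

print(negotiate("fr-FR,fr;q=0.9,en;q=0.8"))  # fr
print(negotiate(""))                          # en -- the English fallback
```

Note the fallback on the last line: a request with no Accept-Language header at all gets the English page, which matters later in this thread.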
But our traffic for www.example.com collapsed in Google after the country domains launched.
Is it possible that Googlebot doesn't take the browser's language setting into account and just sees five duplicate sites?
Is there some way to test this?
[edited by: tedster at 8:51 pm (utc) on Sep. 5, 2008]
[edit reason] switch to example.com [/edit]
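One way to test this yourself: request the same URL twice, once with an Accept-Language header and once without (which is roughly how a crawler requests it), and compare the bodies. A hedged sketch using Python's standard library; the URL is a placeholder for one of your own pages:

```python
import urllib.request

def build_request(url, accept_language=None):
    """Build a request, optionally spoofing a browser's Accept-Language."""
    req = urllib.request.Request(url)
    if accept_language:
        req.add_header("Accept-Language", accept_language)
    return req

def fetch(url, accept_language=None):
    """Return the response body for the URL."""
    with urllib.request.urlopen(build_request(url, accept_language)) as resp:
        return resp.read()

if __name__ == "__main__":
    url = "http://fr.example.com/"            # placeholder -- use your own page
    as_browser = fetch(url, accept_language="fr")
    as_crawler = fetch(url)                   # no header, crawler-style
    # Identical bodies would mean the French host serves English to crawlers.
    print("identical" if as_browser == as_crawler else "different")
```

If the two bodies come back identical on fr.example.com, the crawler is almost certainly getting the English version there too.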
I would strongly suspect the type of content negotiation you are implementing.
1. What happens when googlebot requests a page?
2. Is there a natural click path to each language version, or is it all automatic redirects?
3. How much cross-linking is involved? Does each page link to its four counterparts?
1. Can this be viewed in Google Analytics or in some other web log analysis tool?
2. There is a natural click path to fr.example.com from www.example.com (a prominent country flag)
3. Very little cross-linking (apart from point 2), on the basis that English-version readers share so little in common with readers of the other language versions.
I hope I have understood your questions. Thanks again and please feel free to ask more questions. I will respond immediately.
[edited by: tedster at 9:04 pm (utc) on Sep. 5, 2008]
I don't use GA very much, but the accounts that I can see do not have a search engine bots report. As far as I know, you need your raw server logs to see that.
My general recommendation is not to use automated language detection and forced redirects. First, when I travel to other countries, that type of site drives me wild with frustration. I think you're much better off serving users the exact URL that they asked for, and making the available language choices very clear on the page.
The technical "trick" of automated language detection can definitely backfire in many ways.
Here are my findings, for the benefit of others who find this topic:
Googlebot, not being a browser, crawls without a preferred language setting: it sends no Accept-Language header, so this kind of content negotiation never fires and the English page is served for every URL it requests. Googlebot therefore sees all five sites as identical, which may trigger a duplicate content penalty and, at the very least, means fewer pages get indexed by Google.
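The failure mode just described can be sketched in a few lines (hypothetical server logic, assuming every subdomain runs the same negotiation code and the same assumed language set):

```python
def serve(host, accept_language):
    """Language of the page served -- negotiation ignores the host entirely."""
    available = {"en", "fr", "de", "ja", "zh"}   # assumed language set
    if accept_language:
        primary = accept_language.split(",")[0].split("-")[0].lower()
        if primary in available:
            return primary
    return "en"   # Accept-Language absent -> English fallback

# A crawler sends no Accept-Language, so every host serves the same page:
crawl = {host: serve(host, None) for host in ["www", "fr", "de", "jp", "sg"]}
print(crawl)   # {'www': 'en', 'fr': 'en', 'de': 'en', 'jp': 'en', 'sg': 'en'}
```

Because the host name plays no part in the decision, the five subdomains collapse into five copies of the English site from the crawler's point of view.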
Short-term solution:
1. Publish this tag as part of every page: <html lang="XX">. Replace the XX with the appropriate language code (fr, de, es etc.)
2. Have your server send this header: Content-Language: XX (again, replacing the XX with the appropriate language code)
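The two fixes above can be sketched as a tiny WSGI app (illustrative names and a stand-in template, not the site's actual code): the language is declared both in the Content-Language response header and in the <html lang> attribute, so it is explicit even when the requester sends no Accept-Language at all.

```python
# Hypothetical per-language app factory; TEMPLATE is a stand-in page.
TEMPLATE = '<html lang="{lang}"><head><title>{title}</title></head><body>...</body></html>'

def make_app(lang, title):
    def app(environ, start_response):
        body = TEMPLATE.format(lang=lang, title=title).encode("utf-8")
        start_response("200 OK", [
            ("Content-Type", "text/html; charset=utf-8"),
            ("Content-Language", lang),            # fix 2: response header
            ("Content-Length", str(len(body))),
        ])
        return [body]                              # fix 1 is in the template
    return app

# One app per country domain, e.g. the French subdomain:
french_app = make_app("fr", "Exemple")
```

Each country domain would mount its own instance, so the declared language matches the host rather than the visitor's header.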
[edited by: tedster at 7:40 pm (utc) on Sep. 15, 2008]