Forum Moderators: Robert Charlton & goodroi
At the moment I'm thinking of having a directory for each language, e.g. website/en for English. Would Google consider it duplicate content if the site has the same content in different languages under different subdirectories?
Thanks in advance.
What about having one site with a link to switch between languages, so one URL would serve different content depending on the current language? How would Google treat this kind of setup?
I now think it's best to have two domains, but not sure at this stage if I can get this foreign domain.
Thank you.
It's part of the strategy to be in each market with a local version of the site. But the sites have the same content (each in its own language), structure, and so on...
I'm linking between the languages, but I also get local links.
I'm at the top of the SERPS ;-)
Hope this can help you a bit.
Another question that may be raised is whether the language meta tag or the country-specific domain name carries more weight in the eyes of a search engine.
This is really an interesting topic on which I would like to hear more opinions, especially from those with some relevant experience.
The entrance page is created in the site's 3 languages, automatically showing first the content in the user's HTTP Accept-Language if it matches one of the site's languages, and otherwise in English.
So anyone entering www.mysite.com or mysite.com gets this, and robots normally get the English content first.
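A minimal sketch of that kind of Accept-Language negotiation with an English fallback (the supported-language list and the example header values are my own assumptions, not the poster's actual setup, and q-values are ignored for brevity):

```python
# Pick a site language from the browser's Accept-Language header.
# SUPPORTED and DEFAULT are illustrative placeholders.

SUPPORTED = ["en", "de", "sw"]  # hypothetical site languages
DEFAULT = "en"                  # fallback when nothing matches

def pick_language(accept_language: str) -> str:
    """Return the first supported primary language subtag from an
    Accept-Language header, falling back to English otherwise."""
    for part in accept_language.split(","):
        # Each entry looks like "de-DE;q=0.8"; keep only the language tag.
        tag = part.split(";")[0].strip().lower()
        primary = tag.split("-")[0]
        if primary in SUPPORTED:
            return primary
    return DEFAULT

print(pick_language("de-DE,de;q=0.9,en;q=0.8"))  # de
print(pick_language("fr-FR,fr;q=0.9"))           # en (fallback)
```

A production setup would also sort entries by q-value before matching; the first-match loop above is only the simplest version of the idea.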
Inside, the site works with a frameset (navigation bars + a content area/frame); each page shown in the content frame also works as an entrance page (explaining the site, simple on-site navigation, etc.) when not shown in the frame context, and has links to the other language versions of that page.
Next I created subdomains like [firstlanguage.mysite.com,...], [secondlanguage.mysite.com...] etc., pointing at the same physical directory on the server but showing the entrance-page content in one language only, so a bot entering there would still get to the content in the other languages too.
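The subdomain trick above amounts to keying the served language on the Host header while everything lives in one directory. A hedged sketch (the hostnames mirror the post's placeholders and the language codes are invented):

```python
# One physical directory, one language per subdomain, chosen by Host header.
# Hostnames and language codes below are illustrative placeholders.

HOST_TO_LANG = {
    "firstlanguage.mysite.com": "first",
    "secondlanguage.mysite.com": "second",
}

def language_for_host(host: str, default: str = "en") -> str:
    """Map the requested hostname to the single language that
    subdomain serves; unknown hosts get the default language."""
    return HOST_TO_LANG.get(host.strip().lower(), default)

print(language_for_host("FirstLanguage.mysite.com"))  # first
print(language_for_host("www.mysite.com"))            # en
```

In practice this dispatch would usually happen in the web server or application front controller rather than a standalone function, but the mapping is the same.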
And finally I 'located' www.mysite.com and mysite.com by IP in the zone area of the server's home country, [english.mysite.com...] by IP in the zone area of, for example, the USA, and [suaheli.mysite.com...] by IP in the zone area of, for example, Swaziland …
Observations:
The main search engines (Google, for example) spider www.mysite.com and take notice of the existence of the rest; with some special search requests they also show results from the subdomains, but normally everything goes through the main domain.
Less important search engines spider everything (if I allow it) as if they were separate domains and list them all, but nobody uses those search engines.
The site is listed well in all the main search engines, but in reality users arriving from search engines come 92% from Google, 5% from Yahoo, 1% from MSN, and 1% from other engines.
So while the site is listed in much the same way in all search engines (Yahoo may be number 2 and MSN number 3), looking at the absolute numbers they produce, they don't have any real importance yet.
domain1/en/ - English version
domain1/translit/ - Transliteration
domain2/win1251/ - Russian encoding
domain2/koi8/ - Russian encoding
… possibly other encodings
Which brings me to another couple of questions. Would Google consider the transliterated version of the website garbage, since it isn't really a language? Or is it aware of transliterated websites, and what about AdSense on these pages? And wouldn't the different Russian encodings be considered duplicates?
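On the encodings question: the same Russian text serialized as windows-1251 and as KOI8-R produces different bytes on the wire but decodes to an identical string, which is why a crawler that honours the declared charset could plausibly see the win1251 and koi8 directories as duplicates. A small illustration (the sample word is my own, not from the site in question):

```python
# Same Russian word in two legacy encodings: different bytes, identical text.
text = "привет"  # "hello" -- an arbitrary sample word

cp1251_bytes = text.encode("cp1251")  # windows-1251 (the win1251 directory)
koi8_bytes = text.encode("koi8_r")    # KOI8-R (the koi8 directory)

print(cp1251_bytes == koi8_bytes)     # False: the raw bytes differ
print(cp1251_bytes.decode("cp1251") == koi8_bytes.decode("koi8_r"))
                                      # True: the decoded text is the same
```

So whether the two trees count as duplicates likely depends on whether the engine normalizes pages to Unicode before comparing them, not on the bytes themselves.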
Thanks.