Forum Moderators: Robert Charlton & goodroi
Each site works EXACTLY like the .com site. www.____.com/1234.html is the same as www.____.nl/1234.html as long as both are set to the same language.
And you can get the .nl site to show all the languages, like the English site... same as the .de, .fr, etc.
[edited by: dlondon at 9:16 pm (utc) on Mar. 28, 2007]
If I understand you correctly, there should be zero problems. Search engines index text strings, not meanings. So when content is translated, it is no longer "duplicate" in any way that matters to web search.
The only thing I'm not clear about is what happens when the user who starts at example.nl chooses German. Does the German content get served from example.nl (that WOULD be a duplicate issue), or does the user get sent to example.de after their choice?
If www.____.com/1234.html has the same content as www.____.nl/1234.html (in the same language), then yes, that is most definitely duplicate content.
Go to example.nl and it shows Dutch. Now scroll down and select English from the language chooser.
Now go to example.com and do the same thing. So now example.nl and example.com are the same.
That is what I meant.
[edited by: tedster at 1:01 am (utc) on Mar. 29, 2007]
[edit reason] use example.com [/edit]
Are you changing the language without changing the url? That could make things very tricky, because you can't use robots.txt. Perhaps you could include a meta robots noindex in the head whenever the page changes to any non-default language -- but I'd still be concerned that over time the entire domain would just vanish from search.
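As a sketch of that idea (purely illustrative -- the trigger logic depends on how your language switcher works), the page would emit something like this in its head only when a non-default language is being served:

```html
<!-- Emitted only when the visitor has switched away from the
     site's default language, so search engines skip this version -->
<meta name="robots" content="noindex">
```

The default-language version of the page would omit the tag entirely, so it stays indexable.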
I think the best solution is to change the domain when a non-default language is selected. Next best is to introduce language-specific directories on each domain and exclude indexing on all but the default-language pages with robots.txt.
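For example, assuming hypothetical /en/, /de/ and /fr/ directories on example.nl (with Dutch as the default language at the root), the robots.txt there might look like:

```
# robots.txt on example.nl -- Dutch (default) lives at the root;
# block crawling of the non-default language directories
User-agent: *
Disallow: /en/
Disallow: /de/
Disallow: /fr/
```

Each ccTLD would carry its own variant of this file, disallowing every language directory except its own default.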