Forum Moderators: Robert Charlton & goodroi
What is the standard protocol for this? Would this be seen/understood as duplicate content? And if so, should I exclude the foreign language sections with robots.txt? Obviously I would rather not do this.
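For reference, excluding a foreign-language section via robots.txt would look something like this (assuming the translations live under a /de/ subdirectory — the path is just an example):

```
# robots.txt — block crawling of the German section only
User-agent: *
Disallow: /de/
```

Though as I said, I would rather not block the pages at all.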
My preferred approach is to create a subdirectory for each language, using its standard 2-letter abbreviation - /en/ and /de/ and /es/ and so on.
One warning that several people have reported -- whether you use subdirectories, subdomains, or even full-blown country-specific TLDs for each language, do not cross link every page to all its translated versions. Keep the cross linking to a high level only.
What I mean by a unique URL: www.mysitename.fr or www.mysitename.nl
I am not in favour of subdomains... no one really remembers them. A country-specific URL also carries more authority from a customer's point of view.
Good luck
do not cross link every page to all its translated versions. Keep the cross linking to a high level only.
Why do you say this, tedster? I've done this and haven't seen any negative effects from it. I think it's more user friendly too.
I also understood that a link tag could be used to inform bots, e.g. (where hreflang names the language of the linked page):
<link rel="alternate" type="text/html" hreflang="fr" href="/version-in-french.htm" title="Je ne sais quoi" />
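A fuller sketch of how that might look for a bilingual page — assuming the English version lives at /page.htm and the French at /version-in-french.htm (both filenames are just examples), with an x-default fallback for unmatched visitors:

```html
<!-- Goes in the <head> of BOTH language versions;
     each page lists all alternates, including itself -->
<link rel="alternate" hreflang="en" href="https://www.example.com/page.htm" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/version-in-french.htm" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page.htm" />
```

The annotations need to be reciprocal — if the French page doesn't point back at the English one, search engines may ignore the whole set.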