The only thing that springs to my mind was something along the lines of giving everyone the same user experience, and fair enough, there's now less need to add Irish/Ireland to a generic search term to get some Irish results.
BTW, that quote on using ccTLDs is now "official", but in a different context:
Use TLDs: To help us serve the most appropriate version of a document, use top level domains whenever possible to handle country-specific content. We're more likely to know that .de indicates Germany-focused content, for instance, than /de or de.example.com.
Since different language versions are not seen as duplicate content, I can't quite see where that suggestion fits in with content duplication.
To me, multi-language sites can serve one of two purposes:
1. Target visitors in a particular country where people use a particular language.
2. Target visitors from all over the world who use a particular language.
Example: suppose I developed a Spanish site. My target might be:
1. either visitors from Spain,
2. or all Spanish-speaking visitors in the world.
If the first is my target, then three things are very important for ranking:
1. The TLD (extension) of the domain name.
2. Where the server is hosted.
3. Inbound links from sites that are hosted in that particular country and have the country-specific domain extension.
If the second is my target, then the third point above is the important one.
Another thing I would like to mention is keyword selection.
I have experience working on one site in a language other than English, and that language is Spanish. It is a < a gizmo > site.
Before me, another person worked on it. The targeted keywords were "spanish gizmos", "spanish free gizmos", etc.
What I did was research what Spanish speakers would actually type if they were looking for "gizmos".
I found that they would type the same keywords we had targeted, but in the Spanish language.
So I simply changed the targeted keywords to their Spanish-language equivalents.
The result was excellent, and the increase in traffic was very good.
That's all I feel on this; I don't know whether it is worth much or not. :)
[edited by: tedster at 6:30 pm (utc) on Sep. 25, 2007]
[edit reason] change specific keywords [/edit]
The search engines' behavior in this area has remained unchanged for at least two years. They check for a ccTLD first and, failing that, refer to the IP address of the server. It is easy to see why they chose these methods. It's dirt simple to implement because it doesn't require a lot of ongoing processing, and it's reasonably reliable in terms of the Web overall. Routine updates of the generic TLDs' IP addresses are all that's required. No convoluted analysis of <meta> tags, content, or links is involved. Unfortunately, the search engines' policies are essentially invisible to users. I suspect they don't document this well because it's wrapped up in the ranking algorithm. They won't give away any part of the secret sauce recipe, even when it's obviously just Thousand Island dressing. And by keeping it a secret, it bites small websites that rely on inexpensive hosting options in the US and also happen to have chosen a .com or .net domain name without any idea of the impact it may have.
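The check-ccTLD-first, fall-back-to-IP order described above can be sketched roughly as follows. This is only an illustration of the logic as described in this thread, not anyone's actual implementation; `ip_to_country` is a hypothetical lookup (e.g. a GeoIP database) supplied by the caller, and the country tables are trimmed for illustration:

```python
# Trimmed lookup tables, purely for illustration.
CCTLD_COUNTRIES = {"de": "Germany", "es": "Spain", "fr": "France", "ie": "Ireland"}
GENERIC_TLDS = {"com", "net", "org", "info", "biz"}

def infer_country(hostname, server_ip, ip_to_country):
    """Mimic the detection order described above: a ccTLD decides
    outright; a generic TLD falls back to the server's IP location."""
    tld = hostname.rsplit(".", 1)[-1].lower()
    if tld in CCTLD_COUNTRIES:
        return CCTLD_COUNTRIES[tld]
    if tld in GENERIC_TLDS:
        return ip_to_country(server_ip)  # hypothetical GeoIP lookup
    return None

# A .com site on cheap US hosting gets treated as US-focused,
# no matter what audience it actually targets:
us_geoip = lambda ip: "United States"
print(infer_country("example.de", "203.0.113.7", us_geoip))   # Germany
print(infer_country("example.com", "203.0.113.7", us_geoip))  # United States
```

This also shows why the policy "bites" the small sites mentioned above: nothing in the content ever enters the decision.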
But language is a separate issue that I've been curious about but have yet to investigate. Here again, the search engines don't seem to explicitly outline their methods for determining language. It would be interesting to try a few language-restricted searches and analyze the pages that appear in the results. You'd have to record at least the server response header, the <meta> tags, and 'lang' attributes, and see which ones seem to work.
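The record-keeping part of that experiment can be sketched with the standard library alone. A minimal sketch, assuming you already have the response headers and HTML in hand (the fetching step is omitted so the example is self-contained); the three signals collected are exactly the ones listed above, with no claim about which of them any engine actually uses:

```python
from html.parser import HTMLParser

class LangSignalParser(HTMLParser):
    """Collect the <html lang> attribute and any
    <meta http-equiv="content-language"> value from a page."""
    def __init__(self):
        super().__init__()
        self.signals = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "html" and "lang" in attrs:
            self.signals["html_lang"] = attrs["lang"]
        if tag == "meta" and attrs.get("http-equiv", "").lower() == "content-language":
            self.signals["meta_content_language"] = attrs.get("content", "")

def language_signals(response_headers, html):
    """Record the three signals worth comparing across results:
    the server response header, the <meta> tag, and the lang attribute."""
    parser = LangSignalParser()
    parser.feed(html)
    signals = {"header_content_language": response_headers.get("Content-Language")}
    signals.update(parser.signals)
    return signals

# Example: a page served from the US but marked as English throughout.
page = ('<html lang="en"><head>'
        '<meta http-equiv="Content-Language" content="en-US"></head>'
        '<body>Hola / Bonjour</body></html>')
print(language_signals({"Content-Language": "en"}, page))
```

Running this over the top results of a "páginas en español"-style search and tabulating which signals the ranking pages share would be one way to reverse-engineer the filter.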
For competitive terms, in the English-language Gs we'll see far fewer sites/pages able to achieve a "global" top ten, and in the non-English-language Gs far fewer "foreign" multi-language sites/pages, IMO.
It seems to me G is pushing the use of ccTLDs without giving us the main reason: the increasing regionalisation of "web search" results based on TLD/IP.
It's often suggested that local links may help "foreign" pages rank in the regional Gs, but as it's about the only thing we could do to help matters, this may just be wishful thinking.
If local links gave a boost I'd expect to see more "foreign" pages ranking higher in regional "web search" results than their allinanchor positions would suggest they should.
Anyone seen this?
I'm not well-informed where "local" issues are concerned, so I apologize if I'm off on a tangent here. The search engines are doing so much with local search that I haven't been able to keep up with it all. My impression is that "local" issues primarily involve cities or localities, but not country of origin except through the factors I mentioned before.
For one thing, they give you the option of specifying the language OR the specific country. For another, even the global set of results is likely to be quite different depending on the surfer's default language (based on the "hl" parameter in Google's URL).
A data point: I went to Google.es and searched for my own site while specifying "páginas en español", and got 50 results. Google.fr shows 300 "pages francophones". These are pages hosted in the US with a meta tag specifying the content language as English. However the specific pages that come up DO have some Spanish or French on the page, so it seems as if the language filter isn't that restrictive.
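Comparisons like the one above can be reproduced by constructing the query URLs directly. A minimal sketch, assuming the long-standing `hl` (interface language) and `lr` (language-restrict, e.g. `lang_es` for "páginas en español") query parameters on the regional Google hosts; the hostnames and parameter values here are just the ones discussed in this thread:

```python
from urllib.parse import urlencode

def google_search_url(query, host="www.google.com", hl=None, lr=None):
    """Build a Google web-search URL, optionally setting the
    interface language (hl) and restricting result language (lr)."""
    params = {"q": query}
    if hl:
        params["hl"] = hl  # e.g. "es" for a Spanish interface
    if lr:
        params["lr"] = lr  # e.g. "lang_es" restricts to Spanish pages
    return f"https://{host}/search?{urlencode(params)}"

# The two language-restricted searches compared above:
print(google_search_url("example.com", host="www.google.es", hl="es", lr="lang_es"))
print(google_search_url("example.com", host="www.google.fr", hl="fr", lr="lang_fr"))
```

Diffing the result counts and the pages returned for each `lr` value against the unrestricted search is a quick way to gauge how strict the language filter really is.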
A well-known Spanish SEO tested moving his site from a hosting company with a US IP to one with an IP in Spain.
The resulting position changes for searches in Google.es were very significant (positions in Google.com shown for comparison):
From 35 to 11 (in google.com: #2)
From 84 to 7 (in google.com: #3)
From 65 to 10 (in google.com: #12)
From 72 to 21 (in google.com: #1)
< * moderator note: the Charter [webmasterworld.com] only allows links to sites that are
authoritative. In this case, because the content of that study is useful
for our discussion, we're making an exception to the normal policy. >
[edited by: tedster at 5:35 pm (utc) on Oct. 2, 2007]