| 2:23 pm on Sep 24, 2007 (gmt 0)|
A CC TLD will definitively assign a geo-location to a site for the search engines. For generic TLDs, the search engines rely on the IP address to determine geo-location. I really don't know much about language issues except that they're independent of geo-location, and language is apparently assigned at the document level rather than the domain level. I know the server response header can include a language designation, and you can technically use a <meta> http-equiv tag to override the server response data, or use a 'lang' attribute on the <html> tag, but I don't know for sure what the search engines rely on for this information, or the order of priority. That would be good to know.
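For reference, the three declaration points mentioned above look roughly like this (an illustrative sketch only; no claim here about which signal the engines actually honor, and "de" is just a placeholder language code):

```html
<!-- The server would also send an HTTP response header such as:
     Content-Language: de -->
<html lang="de">
<head>
  <!-- the <meta> http-equiv tag that can override the server header -->
  <meta http-equiv="Content-Language" content="de">
</head>
<body>Inhalt auf Deutsch</body>
</html>
```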
| 10:05 am on Sep 25, 2007 (gmt 0)|
You seem to keep an eye on this area RB, have you come across anything "official" from G on the regionalising of results?
Only thing that springs to my mind was something along the lines of giving everyone the same user experience, and fair enough now there's less need to add Irish/Ireland to a generic search term to get some Irish results....
BTW, that quote on using CCTLDs is now "official" but in a different context:
Use TLDs: To help us serve the most appropriate version of a document, use top level domains whenever possible to handle country-specific content. We're more likely to know that .de indicates Germany-focused content, for instance, than /de or de.example.com.
Since different language versions are not seen as duplicate content I can't quite see where that suggestion fits in with content duplication.....
| 11:06 am on Sep 25, 2007 (gmt 0)|
This is my favorite topic and I would love to share my views.
To me, multi-language sites can serve one of two purposes:
1. Target visitors in a particular country where people use a particular language.
2. Target visitors all over the world who use a particular language.
Example: suppose I developed a Spanish site.
My target might be:
1. either visitors from Spain,
2. or all the Spanish-speaking visitors in the world.
For the first target, 3 things are very important for ranking:
1. The TLD (extension) of the domain name.
2. Where the server is hosted.
3. Inbound links from sites that are hosted in that particular country and have the country-specific domain extension.
For the second target, the 3rd point above is the important one.
Another thing I'd like to mention is keyword selection.
I have experience working on one site in a language other than English, and that language is Spanish. It is a < a gizmo > site.
Before me, another person worked on it. The targeted keywords were spanish gizmos, spanish free gizmos, etc.
What I did was research what Spanish people actually type when they are looking for "gizmos".
I found that they type the same keywords we had targeted, but in the Spanish language.
So I simply changed the targeted keywords to their Spanish-language equivalents.
The result was excellent, and the increase in traffic was very good.
That's all I feel; I don't know whether it's worthy or not. :)
[edited by: tedster at 6:30 pm (utc) on Sep. 25, 2007]
[edit reason] change specific keywords [/edit]
| 3:48 pm on Sep 25, 2007 (gmt 0)|
I had to check into geo-location for a client about 2 or 3 years ago. It took me several hours of scouring the online docs for both the US and non-US versions of their websites to find any information at all about geo-location. In recent months, I've tried to find all of that documentation again because of all of the online discussions I've been involved in, and I've been unable to find any definitive statements from any of the search engines. At the moment, the only public declaration I know of is from Google, where they address the issue from a user's standpoint in explaining country-specific searches, which mentions the CC TLD factor and indirectly refers to the IP address factor. Google, for example, used to have advice for webmasters that said they would sometimes refer to the domain name registration data. If that advice is still out there, I can't find it (and I've never seen any evidence of their ever using it, anyway). So I've been relying on what I had found in the past and what I've experienced in the meantime.
The search engines' behavior in this area has remained unchanged for at least 2 years. They check for a CC TLD first, and failing that, refer to the IP address of the server. It's easy to see why they chose these methods. It's dirt simple to implement because it doesn't require a lot of ongoing processing, and it's reasonably reliable in terms of the Web overall. Routine updates of the IP address data for the generic TLDs are all that's required. No convoluted analysis of <meta> tags, content, or links is involved. Unfortunately, the search engines' policies are essentially invisible to users. I suspect that they don't document it well because it's wrapped up in the ranking algorithm. They won't give away any part of the secret sauce recipe even when it's obviously just Thousand Island dressing. And by keeping it a secret, it bites small websites that rely on inexpensive hosting options in the US and happen to have chosen a .com or .net domain name without any idea of the impact it may have.
Language, though, is a separate issue that I've been curious about but have yet to investigate. Here again, the search engines don't seem to explicitly outline their methods for determining language. It would be interesting to try a few language-restricted searches and analyze the pages that appear in the results. You'd have to record at least the server response header, the <meta> tags, and the 'lang' attributes, and see which ones seem to matter.
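The record-keeping step suggested above could be sketched like this: given a page's response headers and HTML, collect every language signal it declares, so the signals can later be compared against which pages actually appear in language-restricted results. This is a minimal illustration with regex parsing, not a claim about how any engine reads these signals:

```python
# Collect the three language signals discussed in this thread:
# the HTTP Content-Language header, the <meta> http-equiv override,
# and the 'lang' attribute on the <html> tag.
import re

def language_signals(headers, html):
    signals = {}
    # 1. HTTP response header sent by the server
    if "Content-Language" in headers:
        signals["http_header"] = headers["Content-Language"]
    # 2. <meta http-equiv="Content-Language"> tag in the document
    m = re.search(
        r'<meta[^>]+http-equiv=["\']Content-Language["\'][^>]*content=["\']([^"\']+)',
        html, re.I)
    if m:
        signals["meta_http_equiv"] = m.group(1)
    # 3. lang attribute on the <html> tag
    m = re.search(r'<html[^>]*\blang=["\']([^"\']+)', html, re.I)
    if m:
        signals["html_lang"] = m.group(1)
    return signals

# A deliberately conflicting example page: all three signals disagree.
page = ('<html lang="en"><head>'
        '<meta http-equiv="Content-Language" content="es">'
        '</head></html>')
print(language_signals({"Content-Language": "fr"}, page))
# → {'http_header': 'fr', 'meta_http_equiv': 'es', 'html_lang': 'en'}
```

Running this against pages that rank in "páginas en español"-style searches would show which of the three signals (if any) the results have in common.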
| 12:22 am on Sep 26, 2007 (gmt 0)|
We are planning on segmenting some of our news content for different countries. What is the best approach for this?
| 2:24 am on Sep 26, 2007 (gmt 0)|
In my opinion, the best answer depends on your plans and how much work you're willing to do. The domain names with the Country Code Top Level Domain Names (.co.uk, .de, etc.) would have several ranking advantages in their target countries, but new domains would be prone to poor rankings in Google until they have existed long enough to earn some trust from Google. It's possible to get new domains to rank well by getting some especially strong links from well-ranking sites, but it's not easy. So, over the short term, subdirectories in an existing well-ranked domain would probably tend to rank better. You might be able to cleverly try both methods at the same time by putting your best content in the subdirectories while you let the new domains with the Country Code Top Level Domain Names mature.
| 10:13 am on Sep 26, 2007 (gmt 0)|
What's new in all this is that for generic terms G is boosting "local" pages in the "web search" results, and how well "foreign" pages feature now depends mostly on the strength of the local competition.
For competitive terms, in the English-language Googles we'll see far fewer sites/pages able to achieve a "global" top ten, and in the non-English-language Googles far fewer "foreign" multi-language sites/pages, IMO.
Seems to me G is pushing the use of CCTLDs without giving us the main reason, the increasing regionalisation of "web search" results based on TLD/IP....
| 2:23 pm on Sep 26, 2007 (gmt 0)|
It's not just Google. Yahoo! and MSN both give geo-location a great deal of weight in their rankings for all web searches.
| 11:36 am on Sep 29, 2007 (gmt 0)|
On a related matter, "local" links.....
It's often suggested that local links may help "foreign" pages rank in the regional Gs, but as it's about the only thing we could do to help matters, this may just be wishful thinking.
If local links gave a boost I'd expect to see more "foreign" pages ranking higher in regional "web search" results than their allinanchor positions would suggest they should.
Anyone seen this?
| 1:59 pm on Sep 29, 2007 (gmt 0)|
I'm not sure what people mean by "local" links or what influence they might have on rankings. All I can say is that I've seen no evidence that links have any effect on geo-location. Links naturally do have an impact on rankings for search terms that include a location name, whether via anchor text or through Google's lexical analysis of the originating document. It only makes sense. Geo-location is an absolute attribute. Each site or domain has a single geo-location. That is, there's no scale of relevance score for geo-location that might be increased through links.
I'm not well-informed where "local" issues are concerned, so I apologize if I'm off on a tangent here. The search engines are doing so much with local search that I haven't been able to keep up with them all. My impression is that "local" issues primarily involve cities or localities, but not country of origin except through the factors I mentioned before.
| 2:27 pm on Sep 29, 2007 (gmt 0)|
I should perhaps have used "regional" rather than "local"...
The theory being postulated is links from Irish sites would help a US hosted .com feature higher in the G.ie "web search" results than it otherwise would.
If anyone has any examples.....
| 3:28 pm on Sep 29, 2007 (gmt 0)|
Probably the best thing is to go to some of the regional Googles and look closely at how they work.
For one thing, they give you the option of specifying the language OR the specific country. For another, even the global set of results is likely to be quite different depending on the surfer's default language (based on the "hl" parameter in Google's URL).
A data point: I went to Google.es and searched for my own site while specifying "páginas en español", and got 50 results. Google.fr shows 300 "pages francophones". These are pages hosted in the US with a meta tag specifying the content language as English. However the specific pages that come up DO have some Spanish or French on the page, so it seems as if the language filter isn't that restrictive.
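The options described above map onto URL parameters, roughly like this (parameter names as I understand Google's URLs of this era; treat them as assumptions, and "widgets" is just a placeholder query):

```
# Interface language only (results not restricted):
http://www.google.es/search?q=widgets&hl=es

# "Páginas en español" (language-restricted results):
http://www.google.es/search?q=widgets&hl=es&lr=lang_es

# "Páginas de España" (country-restricted results):
http://www.google.es/search?q=widgets&hl=es&cr=countryES
```

Comparing the same query across the three URLs makes it easier to see which pages the language filter and the country filter each admit.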
| 3:25 am on Sep 30, 2007 (gmt 0)|
If you want to see a few good tests of how a local IP can influence Google's results, you can see them here [sergioblanco.org] (Spanish) < * see note >.
A well-known Spanish SEO tested moving from a hosting company with a US IP to one with a Spanish IP.
The changes in the rankings on Google.es were very significant:
From #35 to #11 (in Google.com: #2)
From #84 to #7 (in Google.com: #3)
From #65 to #10 (in Google.com: #12)
From #72 to #21 (in Google.com: #1)
< * moderator note: the Charter [webmasterworld.com] only allows links to sites that are
authoritative. In this case, because the content of that study is useful
for our discussion, we're making an exception to the normal policy. >
[edited by: tedster at 5:35 pm (utc) on Oct. 2, 2007]
| 9:54 am on Sep 30, 2007 (gmt 0)|
If I read that correctly, his sites were .coms/.orgs, so no great surprise there.
He mentions benefits in the "web search" results, any idea if it was similar in "Paginas en Espanol"?
I'd have thought PeE results would be less regionally skewed...
| 10:28 am on Sep 30, 2007 (gmt 0)|
Those are the rankings in Google.es without selecting the "páginas en español" option,
and in Google.com without selecting that option either.
The sites are .com and .org.
Keywords 2 and 3 belong to the same site.
| 11:00 am on Sep 30, 2007 (gmt 0)|
As a matter of interest, Errioxa: of the three search options on G.es, which is the most popular?
| 11:11 am on Sep 30, 2007 (gmt 0)|
"la web" is the most popular, default.
Very few people use other two options