We own a significant information portal in a niche academic area. It has some worldwide appeal, but it is far and away focused upon the UK perspective.
The content is UK centric, the words are UK English and the contributing staff/students are all based in the UK.
It appears very well indeed in Google.Com, as it should given its prominence and content, but does not appear at all in Google.Co.Uk! And to make matters worse of course, many surfers in the UK are being automatically diverted from Google.Com to Google.Co.Uk (even though they don't want that).
The site itself is hosted in Canada, but the WHOIS is clearly UK. Naturally, it is a .org domain.
So what is going on here? Why can't the people who most want to see the content actually see it? Is this a Google bug? Or a 'feature'?
More importantly, does anyone have any experience here on what can be done to correct it? Or is it just a case of writing Google off as a source of sensible traffic? We don't lose money by doing that, but it would certainly inconvenience those seeking our site from this country (it is a well known site and many people search on the site name, and even for that search term it doesn't appear on .Co.Uk!).
Any advice would be welcome.
Google weights each keyword such that rarer words score higher than more common ones. The weighting can be as simple as the reciprocal of the total number of occurrences of that keyword in the index.
Each language will have a different weighting dictionary. Thus a search using German keywords, but forced to use Google's UK dictionary, will push those sites up the SERPs, because German words are rare in English.
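The reciprocal rule described above can be sketched in a few lines. This is a toy illustration of inverse-frequency weighting, not Google's actual formula:

```python
from collections import Counter

def keyword_weights(index_terms):
    """Weight each keyword as the reciprocal of its total count in the index."""
    counts = Counter(index_terms)
    return {term: 1.0 / count for term, count in counts.items()}

# A rare word scores far higher than a common one:
index = ["the"] * 1000 + ["quango"] * 2
weights = keyword_weights(index)
print(weights["quango"])  # 0.5
print(weights["the"])     # 0.001
```

A real engine would normalise per language, which is exactly why a per-country dictionary changes the rankings.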
The problem that Philosophy5 has is that Google has no way of knowing that it is a UK site. People have already ascertained that Google does not use the WHOIS info, probably because it can be faked. Nor can it, in this case, use language as a clue: Canadian English is closer to British English than it is to American English.
Google needs a clue, and that clue must come from the server, I guess, otherwise G has to fall back on I.P. location.
I have seen sites with location (latitude and longitude) in their meta tags in the past, and have often wondered why. Perhaps something somewhere reads them, but I doubt Google does at the moment.
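For reference, the location meta tags I have seen follow the long-standing ICBM / geo.* conventions. The coordinates below are illustrative (central London), and I know of no evidence that Google reads any of them:

```html
<meta name="ICBM" content="51.5074, -0.1278">
<meta name="geo.position" content="51.5074;-0.1278">
<meta name="geo.region" content="GB">
```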
You sound like a really nice guy Highman: I'm glad we will never need your help and I'm glad that the others here aren't so rude. Some advice for you - if you don't like a question go play somewhere else, preferably on the motorway.
"Your site should be a .org.uk if you are targeting the UK"
We have been online almost as long as Google has, and this has always been our domain name, and always will be. Someone else has the .org.uk in any case.
"Google needs a clue"
How about the metatag as mentioned above? Or better still, how about defaulting Google.co.uk to the full database, giving people the OPTION of a subset search?
The people I've told about this have been amazed that they are only seeing a subset of websites when they search here. Some are re-appraising whether to use Google at all.
"Just host in the UK and change the nameservers."
We have actually decided to take the other route, and block Google completely by using robots.txt. We may look at it again a couple of years downstream, but for now we just don't need the 'foreign' traffic they send over.
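For anyone wanting to do the same, the robots.txt rule that blocks Google's crawler (and only Google's) from the whole site is simply:

```
User-agent: Googlebot
Disallow: /
```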
This seems the simplest solution, as we are not prepared to uproot or make fundamental changes for one search engine's flawed (in our opinion) approach.
We are comfortable with this now, but sincere thanks to everyone who made positive suggestions.
Yeah, I'm all sweetness and light ;-)
I like the question; not sure you like the answers. Sorry, I'm straight to the point, but if you want to fix it...
The way to do it has been pointed out. Simply cancelling all Google traffic seems pure madness, but then there is obviously no commercial side to your site. Even so...
Fully agree with you on the advice if people want to show up on "uk only" searches and also agree that most people will just search the "web" and not check uk.
But lots of UK-hosted, UK-focused sites are being filtered from "THE WEB" searches on google.co.uk, as I first posted in Feb.
Simply saying go host in the uk or get a uk domain isn't the answer to the issues I am seeing - as described above.
Yet somehow WE are wrong? Somehow WE have to take difficult steps to rectify something? I don't think so!
We were prepared to make minor adjustments, or apply a quick and simple fix, but it is obviously not that simple.
Hence our decision just to move on, and cut out Google to reduce wastage of bandwidth and people's time. I would suggest that we are far from unique.
At least we have a final outcome, which is all I really wanted (so we have found the "wisdom" we needed, Judge).
But people are failing to recognise that UK-hosted sites are NOT appearing in google.co.uk "the web" searches.
The fact GoogleGuy has been so quiet on this subject, now and back in Feb, makes me think this has something to do with one of their numerous filters that they call their algorithm...
"We have actually decided to take the other route, and block Google completely by using robots.txt."
It was only 3 days. Google is--most of the time--very reasonable in the number of pages that it takes. My experience from a plethora of url-only listings [site:my-site.com] also suggests that it takes a minimum of 3 bot-visits before there is a change (see msg#60 [webmasterworld.com]). You would have had to plot specific URLs against the bot visiting those URLs and the position in the SERPs. Time-consuming, but valuable.
Having said all of the above, I fully understand the utter frustration at what seems like an intractable, stupid action from an all-powerful, irresponsible company.
"I fully understand the utter frustration at what seems like an intractable, stupid action from an all-powerful, irresponsible company"
We entered the debate from the standpoint of perhaps a very minor change, fully expecting to discover some error on our part. We quickly learned that we had done nothing wrong at all, and that this search engine had some technical issues to overcome: it is hardly intelligent to cut out a site as influential as ours from its core constituency based upon the location of its server, bearing in mind that hosting is a GLOBAL marketplace.
We also quickly understood the 'traffic chase' on here, of which we are not part. If our actions seem strange, perhaps the reality is that we are not seeking search engine traffic as keenly as others seem to be. If a search engine doesn't present our site, that really is a problem for that search engine. I am sure we are in good company, and far from unique.
Also, banning this search engine via robots.txt doesn't actually change our position at all. We are essentially banned in any case, from our constituency. All we are doing is 'self-banning' from the rest, which is inappropriate anyway. Net result: no real change.
Finally, sometimes people can be too close to situations. I think perhaps that Google may be too close to its technology to actually see that normal people want to search the whole WWW when they search the WWW.
They don't want an unreliable, ill-defined 'filter' in place, (incorrectly) restricting what they are able to see. Why this filter cannot just be an option, which I doubt anyone would ever use, is beyond me. But there we are; back to Alex's point above, I suppose.
Google is perfectly free to act in this bewildering and peculiar manner, and we are perfectly free to walk away, which is what we have done.
4. I'd like my site to return for pages from a specific country.
While all sites in our index return for searches restricted to "the web," we draw on a relevant subset of sites for each country restrict. Our crawlers may identify the country for a site by factors such as the physical location at which the site is hosted, the site's IP address, the WHOIS information for a domain, and its top-level domain.
That said, your site's top-level domain doesn't need to match the country domain for which you'd like it to return. It's also important to keep in mind that our crawlers don't index duplicate content, so creating identical sites at several domains will likely not result in their returning for many country restricts. If you do create duplicate domains, we suggest using a robots.txt file to block our crawler from accessing all but your preferred one.
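The factors Google lists read like a fallback chain. As a purely hypothetical sketch (the signal names, their ordering, and the TLD table are my assumptions, not anything Google has documented), the situation in this thread might play out like this:

```python
# Hypothetical priority order for the country signals Google lists above.
# This is NOT Google's documented logic; it only illustrates why a
# Canadian-hosted .org with UK WHOIS data could be classed as non-UK.
COUNTRY_TLDS = {"uk": "GB", "de": "DE", "fr": "FR"}  # illustrative subset

def country_for_site(hosting_country=None, whois_country=None, tld=None):
    if hosting_country:        # physical hosting location / IP address
        return hosting_country
    if whois_country:          # WHOIS registrant country
        return whois_country
    if tld in COUNTRY_TLDS:    # country-code top-level domain
        return COUNTRY_TLDS[tld]
    return None                # generic TLD, no geo signal at all

# The thread's scenario: Canadian hosting, UK WHOIS, generic .org domain.
print(country_for_site(hosting_country="CA", whois_country="GB", tld="org"))  # -> CA
```

If hosting really does outrank WHOIS, that would explain exactly the behaviour Philosophy5 describes: the one signal that says "UK" is the one that loses.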