Msg#: 4260631 posted 4:55 pm on Jan 31, 2011 (gmt 0)
This is a bit of an old question from an old timer, but I'm wondering if anything has changed...
I used to run different versions of my site, keeping .co.uk and .com separate. I don't remember why, but some time ago I merged the two into .com and now show different things to visitors depending on their IP address.
As I understand it, Google will mostly (only?) look at what's served to the U.S. IP version.
For the most part that's not a big deal, but there are advantages to getting them spidered individually again, such as showing the best-selling products by country. There are also some language issues. We've recently launched a gift voucher, but that's a gift 'certificate' in the U.S. As it stands I can't optimise for both, so I'm limited to one (the U.S.).
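For anyone curious what the IP-based approach looks like in practice, here is a minimal sketch. The networks and wording table below are purely illustrative placeholders (a real site would query a GeoIP database rather than hard-code ranges), and the function names are my own invention, not anything from the original poster's setup:

```python
import ipaddress

# Illustrative placeholder networks only -- these are documentation
# ranges (RFC 5737), NOT real country allocations. A production site
# would use a GeoIP database lookup here instead.
COUNTRY_NETWORKS = {
    "GB": [ipaddress.ip_network("203.0.113.0/24")],
    "US": [ipaddress.ip_network("198.51.100.0/24")],
}

def country_for_ip(ip_str, default="US"):
    """Return the country code whose network contains the visitor's IP."""
    ip = ipaddress.ip_address(ip_str)
    for country, networks in COUNTRY_NETWORKS.items():
        if any(ip in net for net in networks):
            return country
    return default

def product_label(country):
    """Pick locale-specific wording, e.g. the voucher/certificate split."""
    return "gift voucher" if country == "GB" else "gift certificate"
```

The catch the poster describes follows directly from this design: Googlebot crawls from U.S. IPs, so it only ever sees the `default` branch, and the UK wording never gets indexed.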
If I do split the sites, how different do they need to be for Google not to mark them down as duplicates and bar one of them?
Msg#: 4260631 posted 6:08 pm on Jan 31, 2011 (gmt 0)
From what I know, the two sites don't "need" to be different at all. Google's intention is to return .co.uk results to their users in the UK and .com results to their users elsewhere. It's still a good idea to make sure the .co.uk site is "localized" in spelling, idiom, currency and so on.
Msg#: 4260631 posted 6:42 pm on Jan 31, 2011 (gmt 0)
I have experience with a co.uk site that is identical to the .com site (same URIs, same content) with the exception that some words have been spelled with the preferred British spelling. For the first few years, the co.uk site didn't get much traction. UK visitors (by IP address and from google.co.uk) tended to get sent to the .com site. But then in early 2009 Google appeared to change their algorithm. Now the appropriate site shows up in the SERPs for the appropriate users. The UK site gets a substantial amount of traffic and UK visitors feel more at home.
Msg#: 4260631 posted 1:10 pm on Feb 1, 2011 (gmt 0)
Just remember that the Google algo changes a lot.
I'm a bit paranoid. I also think it's best to make things as simple as possible for Googlebot. I would work up a back-up plan that introduces unique content and code. This is likely overkill, but it could be helpful in case the algo changes or there is a glitch with duplicate content filters.