Duplicate content overload: a well-known .uk review site at #1, then at #8, #9, and #10 their .com and .in domains with the same content...
courier - I'd like to avoid opening this up to a discussion of a specific domain or keywords, but the site you're describing sounds very much like TripAdvisor, and we did discuss what appeared to be multiple duplicate TA pages ranking here...
How Google is Showing the Results http://www.webmasterworld.com/google/4381552.htm [webmasterworld.com]
As I mentioned in that thread, most of the pages that were ranking were different paginated reviews, image pages, etc., and as such were not duplicates. For the most part, Google appeared to be showing different pages on the different ccTLDs.
I've since noticed, though, that on some searches several of the results are the main TA listing pages for a particular query, which are essentially identical across the different ccTLDs. So yes, Google is ranking some duplicate content on different ccTLDs, but not quite as much as it might appear if you just look at the brand name. This is consistent with how Google handles ccTLDs: when the inbound links are sufficiently localized and independent, and other conditions are satisfied, Google tries to allow ccTLDs to appear in the SERPs. I haven't checked the details of the TA sites in terms of hosting, on-page localization, etc. PageRank and authority are likely also factors.
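To check for yourself whether two ccTLD listing pages are "essentially identical," one rough approach is to strip the pages down to their text and compare the word sequences. This is only an illustrative sketch: the sample snippets and the 0.9 cutoff are made up here, and nothing about it reflects Google's actual duplicate detection.

```python
# Sketch: gauge near-duplication between the "same" page on two ccTLDs
# by comparing their visible text word-by-word. Thresholds are arbitrary.
from difflib import SequenceMatcher

def text_similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two text bodies."""
    return SequenceMatcher(None, a.split(), b.split()).ratio()

# Hypothetical extracted page text (in practice you'd fetch and strip HTML):
page_uk = "Top 10 hotels in London reviews photos and prices updated daily"
page_in = "Top 10 hotels in London reviews photos and prices updated daily"
page_other = "Completely different article about travel insurance tips"

print(text_similarity(page_uk, page_in))       # identical bodies
print(text_similarity(page_uk, page_other))    # clearly different bodies
```

In practice you would pull the main content of each ccTLD's page and flag pairs above some similarity cutoff (say 0.9) as effective duplicates.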
Does Google have some kind of threshold for duplicate content? Some is OK, but more is not?
I've noticed over the years... even way before the Vince algo update [webmasterworld.com...], which many describe as being all about branding... that if a site had extremely good inbound links, Google would allow significantly more internal duplication than if a site was badly linked. I've never seen highly templated pages in, e.g., a geo-directory type site cause nearly the same problem on a high-PageRank, high-authority site as they do on a low-PR, low-authority site. I can't give you percentages, as there are a lot of variables.
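One common way to put a number on how much two templated pages overlap (the kind of internal duplication described above) is word-shingle Jaccard similarity. The sketch below is purely illustrative: the sample geo-directory pages, the shingle size, and the whole approach are assumptions for demonstration, not Google's actual method.

```python
# Sketch: quantify template-driven overlap between two pages using
# k-word shingles and Jaccard similarity (|A∩B| / |A∪B|).

def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str, k: int = 3) -> float:
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Two hypothetical geo-directory pages sharing the same template text,
# differing only in the city name:
city_a = "Find the best plumbers in Austin with reviews ratings and contact details"
city_b = "Find the best plumbers in Boston with reviews ratings and contact details"

print(round(jaccard(city_a, city_b), 2))
```

A score near 1.0 would mean the pages are almost entirely shared template; a score near 0.0 would mean mostly unique content. Where any threshold sits, and how it interacts with a site's authority, is exactly the open question in this thread.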
While I don't have a good before-and-after Panda comparison to say whether this treatment of internal duplication has changed, my guess is that Google did not intend for Panda to reduce rankings for sites with high "trust, reputation, authority, and PageRank," which is how Matt Cutts, during discussions about branding, characterized the pages Google wanted to rank.