brinked - 10:29 pm on Dec 5, 2010 (gmt 0)
scottsonline, I completely agree that it can be disastrous. I am not talking about making it a major signal. I am talking about something more along the lines of a system where, if over 90% of searchers click on a result and then end up clicking another result, that raises a flag to Google that this site is perhaps not suited for the placement it has. Google could then do a manual review to see how to "algorithmically" correct the problem, or move the site down a few places to see if another site proves more helpful. I know this raises a lot of alarms and people would be very wary of supporting it, but I am talking about something very advanced, where it would not be as simple as just a back click, and even then it would not be a major ranking factor. Still, it could bring to Google's attention some websites that should not be ranking as high as they are. A great example: if I am searching for widgets and the result is a website about medicine, the click-backs would be very high, and that should send a signal to Google.
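Just to make the idea concrete, here is a rough sketch of what that click-back flag could look like. To be clear, the function name, the inputs, and the 90% cutoff are all my own illustrative assumptions; nothing here reflects how Google actually works.

```python
# Hypothetical "click-back" flag, as described above.
# All names and the 90% threshold are illustrative assumptions,
# not anything Google has confirmed.

def should_flag_for_review(clicks: int, click_backs: int,
                           threshold: float = 0.9) -> bool:
    """Flag a result for review when the share of searchers who
    clicked it and then returned to click another result exceeds
    the threshold."""
    if clicks == 0:
        return False  # no data, nothing to flag
    return (click_backs / clicks) > threshold

# 95 of 100 searchers bounced back and clicked another result:
print(should_flag_for_review(100, 95))   # True: candidate for review
print(should_flag_for_review(100, 50))   # False: normal behavior
```

Again, a real system would need to be far more sophisticated than this (filtering out quick comparison clicks, bots, and so on), which is why I say it should only flag sites for review rather than act as a ranking factor on its own.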
Google does have measures in place to put a stop to duplicate sites, but like anything else, people find a way around them. I am actually dealing with this issue right now: a low-quality site has 3 other almost identical copies ranking in the top 20 for all related search terms.