<a href="http://www.domain.tld/" class=l>Link text</a>
I must have missed something. So it's now known for a fact that G tracks SERP clicks? Before, this was just speculation.
The top 10 for my industry changes, from what I notice, about three times a day, with my site going from 1 to 3 and giving other sites a chance.
They do measure the SERPs: they look at bounce rates via people clicking the back button and choosing a different result.
I'm getting a lot of back-button hits, but then additional hits, because Google shows multiple search results for my site. Hope this helps.
Google should be able to check the time differential between the click on result 1 and the click on result 2, etc., and then infer value based on time spent on a site. (Except when the surfer breaks for a cup of coffee. ;)
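For what it's worth, the time-differential idea sketches out easily. Assuming a log of click timestamps on one results page (the field names below are made up for illustration, not anything Google actually exposes), dwell time per result is just the gap until the next click back on the same SERP:

```javascript
// Hypothetical click log for one searcher on one results page.
// resultUrl/clickedAt are invented field names; times are ms since the SERP loaded.
const clicks = [
  { resultUrl: 'http://example-a.tld/', clickedAt: 0 },      // clicked result 1
  { resultUrl: 'http://example-b.tld/', clickedAt: 8000 },   // back after 8s, tried result 2
  { resultUrl: 'http://example-c.tld/', clickedAt: 190000 }, // back after ~3 minutes
];

// Dwell time on each result = gap until the next click on the same SERP.
// The last click has unknown dwell: the searcher may simply have stayed.
function dwellTimes(clicks) {
  return clicks.map((c, i) => ({
    resultUrl: c.resultUrl,
    dwellMs: i + 1 < clicks.length ? clicks[i + 1].clickedAt - c.clickedAt : null,
  }));
}
```

Which also shows the coffee-break problem: a long gap and a genuinely useful page look identical in the log.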
In addition to measuring bounces via the back button, as above, they also look at IP information so they can "localize" results.
<a href="/url?sa=t&ct=res&cd=1&url=http%3A%2F%2Fdomainname.com%2Fpage.html&ei=VaRdSdSdZbU-QLH3ODhAg&usg=AFQjCNGcAFRqsJNadwApSkSkDlOWcsSnImnZA&sig2=utyK1qk9YJxTKcg9AtPwaA" class="l" onmousedown="return rwt(this,'','','res','1','AFQjCNGcAFRSdlSkfghFpBilLdiGbnImnZA','&sig2=utyK1qk9YJxTPKdlUdvtPwaA')">Anchor link text</a>
After I signed out of my G Account, the SERPs are clean.
<a href="http://www.bupa.co.uk/" target=nw class=l onmousedown="return clk(this.href,'','','res','2','')">
but I also see the same code when I'm logged out.
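For anyone curious what handlers like the rwt()/clk() calls quoted above are doing mechanically: an onmousedown handler can swap the clean href for a tracking redirect at the moment of the click. A minimal sketch of that pattern (parameter names invented for illustration; it operates on any object with an href, such as a DOM anchor):

```javascript
// Rewrite a result link to go through a logging redirect, in the spirit of
// the quoted Google markup. 'url' and 'cd' (result position) mirror parameters
// visible in the pasted /url?... href; everything else here is an assumption.
function rewriteForTracking(link, position) {
  const params = new URLSearchParams({
    url: link.href,        // the real destination
    cd: String(position),  // rank of the result that was clicked
  });
  link.href = '/url?' + params.toString();
  return true; // an onmousedown handler returns true so the click proceeds
}
```

The server behind /url logs the click and position, then redirects to the real destination, so the link still looks clean on hover and in the status bar until it is clicked.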
I have to say it sounds to me (so far) like pure speculation.
Trying to measure page usefulness through user behaviour would be so unreliable. The amount of time a user spends on a page is no indication of its quality. Depending on the market, the search term, and the user's ability to use an SE properly, there would be SO many different variables to consider.
This forum is filled day after day by people complaining that Google can't determine quality based on content and IBLs. That is so simple compared to determining quality based on user behaviour as to be laughable.
One site in particular with Analytics has had no link-building to speak of. It's a real 'if you build it they will come' site where all we've done is request a few links from forums and authority sites years ago and the site itself has done the work. We make tweaks often from a feedback form.
We have a growing and loyal core user group and over the past 18 months as this has grown so have our rankings. This would completely tie in with the findings from the experiment.
However, it also made me think: by starting to use Analytics on a high-ranking site that has a high bounce rate (i.e. suddenly letting Google know that it is not performing well), could you actually harm its rankings?
What is an acceptable bounce rate?
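There's no official threshold, but the metric itself is simple: the share of sessions that viewed exactly one page. A sketch with fabricated session data, just to pin down the definition:

```javascript
// Bounce rate as commonly defined in web analytics:
// single-page sessions divided by all sessions.
function bounceRate(sessions) {
  const bounces = sessions.filter(s => s.pageviews === 1).length;
  return bounces / sessions.length;
}

// Fabricated example data: two of four sessions bounced.
const sessions = [
  { pageviews: 1 },
  { pageviews: 5 },
  { pageviews: 1 },
  { pageviews: 3 },
];
// bounceRate(sessions) → 0.5
```

What counts as "acceptable" depends heavily on the kind of site: a one-answer reference page can bounce most visitors while still satisfying them.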