smithaa02 - 2:02 pm on Sep 11, 2011 (gmt 0)
While it would not surprise me if the relationship between bounces/time on site and ranking changes were not causal but merely spurious, I can't think of a better explanation. More specific user actions (like printing or bookmarking), while probably measured, would not occur in large enough numbers in small cases like mine and the author's to account exclusively for the ranking bump.
I don't see visits to other pages on the site being a factor, because my test case is a one-page website (yet GA still records a full range of bounce rates and time-on-site values for it).
Visitors returning to your page is, I believe, a strong possibility (if nothing else, GA tracks this per keyword, so we know Google considers it important data).
My best guess is that Google's various tracking services are working together to build a big picture of what users and websites are doing. I suspect that if a user goes from my GA-tracked site to another GA-tracked site, Google could, for example, calculate time on site from those two sets of hits. Other sources would be returns to the SERPs (which Google obviously tracks), navigating elsewhere with the Google Toolbar installed, or switching to another Google service like Gmail (which may signal the user is done with your website).
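To illustrate the idea (purely hypothetical, since nobody outside Google can see such cross-site data), time on the first site could be estimated from timestamped hits tied to one visitor ID across two GA-tracked sites:

```python
from datetime import datetime

# Hypothetical merged hit log for a single visitor ID across two sites.
# The site names and timestamps are invented for illustration only.
hits = [
    ("2011-09-11 14:00:05", "site-a.example"),
    ("2011-09-11 14:00:40", "site-a.example"),
    ("2011-09-11 14:03:10", "site-b.example"),  # visitor moves on
]

def time_on_first_site(hits):
    """Dwell time on the first site = timestamp of the first hit on a
    *different* site minus the timestamp of the very first hit."""
    parsed = [(datetime.strptime(ts, "%Y-%m-%d %H:%M:%S"), site)
              for ts, site in hits]
    first_time, first_site = parsed[0]
    for ts, site in parsed[1:]:
        if site != first_site:
            return (ts - first_time).total_seconds()
    return None  # visitor never seen elsewhere: the classic bounce blind spot

print(time_on_first_site(hits))  # → 185.0 (about 3 minutes on site-a.example)
```

Note this is exactly the measurement a single-page site can't provide on its own, which is why a second GA-tracked site (or a return to the SERPs) would be needed to close the timer.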
The big question is how any of this is usable for SEO. As Tedster said, this would not be easy to manipulate by simply spam-clicking your own listing in the SERPs. Google probably identifies unique users by IP and/or Google account, and probably runs a filter that discards click spikes with statistically too much volume to be organic.
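As a rough sketch of the kind of filter I mean (the threshold and approach are my own invention, not anything Google has disclosed), daily clicks could be compared against a median baseline and days that spike too far above it thrown out:

```python
from statistics import median

def filter_click_spikes(daily_clicks, factor=5):
    """Drop days whose click count exceeds factor x the median click
    count -- a crude stand-in for the sort of anomaly filter a search
    engine might run. Threshold is arbitrary for illustration."""
    baseline = median(daily_clicks)
    return [c for c in daily_clicks if c <= factor * baseline]

# Six normal days, then one day of coordinated "friend" clicking.
clicks = [40, 38, 45, 41, 39, 42, 900]
print(filter_click_spikes(clicks))  # → [40, 38, 45, 41, 39, 42]
```

A median baseline is used here rather than a mean, since one huge spike drags the mean (and standard deviation) up enough to hide itself; any real system would surely be far more sophisticated.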
I do think, however, that if one could pony up enough friends (for competitive terms this would probably require many thousands, IMO), it could work in theory. It might be interesting to test: GA displays the recorded bounce rate and time on site right next to each keyword, so you might be able to calibrate what Google treats as spam clicks from that data.
Probably the most practical solution, however, is to make sure users are actually clicking your link from the SERPs. If this is the case, meta descriptions would no longer need to be written for Googlebot but for humans, with very catchy phrases ('free widget', 'top secret information', 'earth-shattering exclusive report'... that type of thing).
Once on the site, you then need to persuade users to stay longer than they normally would. I don't know if splash pages, glorified menu pages, or teasers would be the key, as I suspect page clicks aren't what Google tracks anymore and that it can calculate time on site regardless. Perhaps an online game section stuffed with free Flash games to pad out time on site? Freebies, but behind bureaucratic, time-consuming forms and processes? This is a toughy...
I definitely think it would be unwise for Google to weight this kind of metric heavily: often I get what I want from a well-designed website right away and then go back to Google for a new search, while on a badly designed website I can spend a long time looking for my needle in the haystack and never find it... yet in Google's eyes I could effectively be voting for that type of site.