potentialgeek - 2:26 am on Sep 30, 2012 (gmt 0)
I find the longer the page, the longer the average time on page.
I agree the signal can be noisy, but it could still be possible to get some decent relative data.
And any click back that happens within 1 second, consistently, isn't a good sign for the page, but it probably is a valid signal.
If I were programming the algo, I'd set the signal based on:
Avg. Time on Page/Page Length
You'd want to estimate the percentage of the page the user actually read. I'm assuming most users read an entire page if it's quality content from top to bottom. Me, I bounce back to Google as soon as I see junk or lose trust in the page.
I think you can cut down on the noise of the signal based on the ratio of time on page to page length. That way those sites that are concise aren't penalized for being concise.
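Something like this, maybe (just a toy sketch in Python; the function names, the reading-speed constant, and the cap are all my own guesses, not anything Google has published):

```python
# Hypothetical sketch: normalize dwell time by page length so concise
# pages aren't penalized for being concise. The assumed reading speed
# (words_per_second) is a rough guess for illustration only.

def read_ratio(seconds_on_page: float, word_count: int,
               words_per_second: float = 4.0) -> float:
    """Estimate what fraction of the page the visitor read.

    Capped at 1.0 -- lingering past the time needed to read the whole
    page shouldn't count as "more than fully read".
    """
    if word_count <= 0:
        return 0.0
    expected_seconds = word_count / words_per_second
    return min(seconds_on_page / expected_seconds, 1.0)

print(read_ratio(30, 600))   # quick bounce off a long page -> 0.2
print(read_ratio(120, 400))  # thorough visit to a short page -> 1.0
```

The point of the cap is exactly the "don't penalize concise pages" idea: a 400-word page read in full scores the same as a 2,000-word page read in full.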
I'm sure that if Google collected enough data for the result set of any keyword, it could find a typical range of bounce averages, and anything outside that range could be treated as a red flag, or at least subject to more scrutiny.
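The red-flag idea could look roughly like this (again just a sketch; the data, the URLs, and the 1.5-standard-deviation cutoff are arbitrary assumptions on my part):

```python
# Hypothetical sketch: flag results whose bounce rate sits far outside
# the typical range for one keyword's result set. The cutoff k is an
# arbitrary choice for illustration.

from statistics import mean, stdev

def bounce_outliers(bounce_rates: dict, k: float = 1.5) -> list:
    """Return URLs whose bounce rate is more than k std devs from the mean."""
    rates = list(bounce_rates.values())
    if len(rates) < 2:
        return []
    mu, sigma = mean(rates), stdev(rates)
    return [url for url, r in bounce_rates.items()
            if abs(r - mu) > k * sigma]

# Made-up bounce rates for one keyword's results:
serp = {"a.example": 0.42, "b.example": 0.45, "c.example": 0.40,
        "d.example": 0.44, "e.example": 0.95}
print(bounce_outliers(serp))  # only the 95% bouncer gets flagged
```

A plain standard-deviation cutoff is crude (one extreme outlier inflates the spread), so a real system would presumably use something more robust, but it shows the "outside the normal range" idea.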