Forum Moderators: Robert Charlton & goodroi
Any ideas? Things like how much time a user spends on any given document etc.
Thanks
Mike
Google mentions traffic factors in many patents at this point. It's clear that they've got something to work with.
An obvious source of traffic data would be their occasional tracking of clicks on various search results. They certainly know the average number of clicks each position gets. So if a #1 falls well below the expected number, that's a bad signal. If a #2 result gets a very high number of clicks compared to the #1, that's a very positive signal... and so on.
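The comparison described above could be sketched roughly as follows. This is purely illustrative: the baseline CTR figures and the tolerance threshold are invented numbers, not anything Google has published.

```python
# Hypothetical illustration of the click-signal idea described above.
# The per-position CTR baselines and the tolerance are assumptions.

# Assumed average click-through rate for each SERP position
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def click_signal(position, clicks, impressions, tolerance=0.5):
    """Compare a result's observed CTR with the assumed average for
    its position and return 'positive', 'negative', or 'neutral'."""
    observed = clicks / impressions
    expected = EXPECTED_CTR[position]
    if observed < expected * (1 - tolerance):
        return "negative"   # e.g. a #1 falling well below the norm
    if observed > expected * (1 + tolerance):
        return "positive"   # e.g. a #2 out-clicking what a #2 usually gets
    return "neutral"
```

So a #1 getting only 10 clicks per 100 impressions would flag as "negative", while a #2 getting 25 per 100 would flag as "positive".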
If the same person comes back to the same set of results and rapidly clicks on another position, that's another kind of signal, though not an unambiguous one. And if the same person clicks on one result, then rapidly makes a revised query, that's yet another kind of signal.
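Those two "pogo-sticking" behaviors could be distinguished along these lines. Again a sketch only: the event names and the 30-second definition of "rapidly" are assumptions made for illustration.

```python
# Hypothetical sketch of the quick-bounce signals described above.
# The 30-second threshold for "rapidly" is an assumed value.

QUICK_RETURN_SECS = 30

def session_signal(events):
    """events: chronological list of (seconds, action) tuples, where
    action is 'click' (on a result) or 'requery' (a revised search).
    Returns a rough label for the session's behavior."""
    for (t1, a1), (t2, a2) in zip(events, events[1:]):
        if a1 == "click" and t2 - t1 <= QUICK_RETURN_SECS:
            if a2 == "click":
                return "bounced-to-another-result"   # weaker negative signal
            if a2 == "requery":
                return "abandoned-for-new-query"     # a different signal
    return "no-quick-bounce"
```

A click followed 10 seconds later by another click would label as "bounced-to-another-result", while a click followed by a revised query inside the window would label as "abandoned-for-new-query".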
My thinking is that pages and domains are rarely measured in isolation, whether it's traffic data or human editorial input. Rather, they are measured in relation to their position on a particular SERP.
Mod's note: Moved from another location. Not exactly asking the same question, but close enough that I thought it was worth bumping this thread up to see if there's additional discussion on the topic.
[edited by: Robert_Charlton at 12:58 am (utc) on April 24, 2008]