When I check inbound links, sometimes sites with very few inbounds are near the top of the SERPs, above pages that have many more. I realize on-page considerations could be part of this, but the difference was too great for that alone.
It hit me that maybe a new factor is at work. There has been talk in this forum for a while that Google might start considering the time spent on a page. I'm not sure they can do that if the visitor moves on without returning to Google, but they certainly would have data on how quickly the searcher returns to the SERPs to try another result.
A simpler possibility is that Google is recording whether people click on a given result or not. If a site is in the top 10 and doesn't get the clicks that would be expected at that position, could Google be pushing the page down in the results?
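Just to make the idea concrete, here's a rough sketch of how such a position-adjusted CTR check might work. To be clear, this is my own toy model, not anything Google has confirmed; the expected-CTR figures and the tolerance are invented.

# Toy model only -- not Google's actual algorithm.
# Flag a result whose observed CTR falls well below what its SERP
# position would normally earn.

# Invented expected CTR per top-10 position, for illustration.
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.06,
                6: 0.05, 7: 0.04, 8: 0.03, 9: 0.025, 10: 0.02}

def underperforms(position, impressions, clicks, tolerance=0.5):
    """True if observed CTR is far below the position's expected CTR."""
    if impressions == 0 or position not in EXPECTED_CTR:
        return False  # not enough data to judge
    observed = clicks / impressions
    return observed < EXPECTED_CTR[position] * tolerance

# A page at #3 with 1000 impressions but only 20 clicks (2% observed
# v. 10% expected) would be flagged as a demotion candidate.
print(underperforms(position=3, impressions=1000, clicks=20))  # True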
Now it's true that just because some factor appears in a patent doesn't mean it's really in use. But in this case, I think clickthrough is in play to some degree. Since the snippet is an important factor in getting the click, I try to write enticing meta descriptions for just this reason.
As a URL finds its way onto the first page of results, it needs to perform to a higher standard. I've often noticed that result #11 can seem to be a devil to move onto the first page - as though there's some kind of extra hurdle in place beyond moving up one spot.
I've wondered if being "at the top" of the second page (#11) can be better than #10. As I scan results, if there's nothing particularly enticing at #10, I'll 'flip' the page (when my preferences are set to 10/page, which isn't often).
To the TS, I doubt Google would do much with bounce rate/CTR on its own unless it's unusually extreme. I don't recall Cutts ever discussing it at length.
He might have said it's a noisy signal, which suggests it would have to be weighed against a lot of other data - acting as "the final straw" that trips a filter after various other red flags.
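A toy sketch of that "final straw" notion: on its own the noisy CTR signal does nothing, but once enough independent red flags pile up, a weak CTR tips the result into a filter. All the flags and thresholds here are invented.

# Toy "final straw" model -- invented flags and thresholds.
# A noisy signal (weak CTR) only matters once other red flags exist.

def trips_filter(red_flags, ctr, min_flags=3, weak_ctr=0.01):
    suspicious = len(red_flags) >= min_flags  # other evidence comes first
    return suspicious and ctr < weak_ctr      # weak CTR is the tiebreaker

flags = {"thin content", "aggressive anchor text", "link velocity spike"}
print(trips_filter(flags, ctr=0.004))   # True: red flags plus weak CTR
print(trips_filter(set(), ctr=0.004))   # False: weak CTR alone is ignored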
When you think about it, what in a SERP listing makes one result get better attention than others? A strong title, a nice description, a good-looking domain? Is that why Google wants to use CTR in its algo?
Seriously, I really don't know how much that data sways searchers one way or the other. I doubt much, but I'd like to see research into this.
Bounce rate would likely be a more significant piece of data than CTR, for obvious reasons--the user rejecting a page after seeing it v. guessing its value based on SERP info.
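One way to picture the difference: the "bounce" Google could actually observe is the searcher pogo-sticking back to the SERP shortly after clicking. A hypothetical sketch, where the 10-second cutoff is purely my own guess:

# Hypothetical pogo-stick detector. A click counts as a bounce if the
# searcher returns to the SERP within a short window; the 10-second
# threshold is invented for illustration.

BOUNCE_WINDOW_SECONDS = 10.0

def bounce_rate(visits):
    """visits: list of (click_time, return_time or None if no return)."""
    if not visits:
        return 0.0
    bounces = sum(1 for clicked, returned in visits
                  if returned is not None
                  and returned - clicked < BOUNCE_WINDOW_SECONDS)
    return bounces / len(visits)

# Back after 4s (bounce), back after 90s (kept reading), never returned.
print(bounce_rate([(0.0, 4.0), (100.0, 190.0), (300.0, None)]))  # ~0.33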
Google Groups has message rating. I'd like to see this on SERPs, too. It would affect CTR, obviously, and spare us from spam sites, just the same way useless/offensive Usenet posts are avoided by Google's very simple existing system.
I wouldn't necessarily agree with moving a site up the SERPs just because it got five stars, but I don't see anything inherently worse about rating sites v. rating messages.
So many sites already let you rate news articles, and many directories let you rate other sites. If Google is smart enough to implement the idea with abuse-prevention engineering, it should make it happen as soon as possible.
Combining the data streams of user ratings with bounce rate could certainly give the SERP algo useful data for superior results.
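As a back-of-the-envelope picture of what combining the two streams might look like (the weights and scaling are made up, of course):

# Invented composite blending a 1-5 star rating with a 0-1 bounce rate.
# Weights are arbitrary -- a sketch, not a real ranking input.

def quality_score(avg_stars, bounce_rate, rating_weight=0.6):
    stars_norm = (avg_stars - 1.0) / 4.0   # map 1..5 stars onto 0..1
    stickiness = 1.0 - bounce_rate         # lower bounce is better
    return rating_weight * stars_norm + (1.0 - rating_weight) * stickiness

print(quality_score(avg_stars=4.5, bounce_rate=0.2))  # ~0.85
print(quality_score(avg_stars=2.0, bounce_rate=0.8))  # 0.23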
p/g
For certain sites which ranked high for completely off-topic (accidental) associations, I sensed two kinds of movement in the SERPs:
If they had any clicks, nothing special happened.
If their CTR was near 0% for a month or two (0% means 0 clicks, btw), they moved down the ladder. Not far, though; they usually stopped on the next page.
...
The reason I say I can't back these things up is that there were other things which could have caused them to rank 'higher than they should have' in the first place: the honeymoon period, indexing of all the site's pages not yet done, a still-fresh link profile (a still-positive trend), etc.
But if I look at CTR *only*, then near 0% seemed like an early warning of ranking drops.
It could be engineers fine-tuning relevancy calculations for problematic SERPs, though, since it always took months.
Besides, a listing has to be really irrelevant and/or seldom displayed to get 0% CTR on the first page, let alone at #1, right?
Lemme give you an example:
(this really happened. Uhm, I mean *is happening*)
A search for 'widget specifications' returned a 'services and facilities' page for a listing on an accommodation site at #1. That's right, this hotel had widgets at your service. I was really proud of its success *smirk*
Some clicks, some preload-robot activity... and it remained there for months. Then, once it slipped to #2 and the automatic inflation of its CTR stopped, it slid to the second page within a week.
Oh, btw, it's still there. (#14, I think?)