I think it's a result of this issue: [webmasterworld.com...]
I've been studying page 2 and page 3 for a while - and the rankings are incredibly volatile. It's just not like the "good old days" anymore. You used to watch a URL slowly climb (or fall) from one spot to the next, as the algorithm rated all the factors and changes.
But today, Google apparently runs statistical tests on user response to various types of results - especially in the deeper pages where impressions and clicks are very low. In fact, two computers side by side at the same location can get different results, especially in the deep pages and on long-tail query terms.
Google is trying to learn something from these changes, and I'm sure there are complex patterns that are hard for us to see. But there is one thing that we should learn. We can no longer take any meaning from position changes in the deeper pages of the search results, and it's not worth worrying about. Until your URL climbs pretty high - and how high seems to depend on how competitive the term is - unstable rankings seem to be the way it works today.
I think that Tedster is probably right. But all of this shifting could also be a way for Google to distribute what little traffic these lower-ranked pages get more equally.
There's probably no better example of how Google has changed and evolved. We started with once-a-month updates - you got your new position and there it was for 30 days. Then we went to fluid updates, all the time. But it still seemed that you had a position that sort of stuck, and you could watch changes happen and know that either your competition changed something, or you changed something, or the algo changed.
But now, everything is so fluid that it's very hard to judge why any change happens at all. It sure isn't always links, or titles, or copy, or your competitors. The algorithm itself seems to act differently for different markets and even sometimes for different query terms in the same market.
No matter how Google is or isn't taking user data into account, it seems to me that a lot of a site owner's energy should be freed up to "chase their user rather than the algorithm." That actually feels better to me most of the time - except of course when I don't rank well ;)
Or throttle traffic to a site?
|But all of this shifting could also be a way for Google to distribute what little traffic these lower-ranked pages get more equally. |
A question for all
Is this volatility in lower-ranking URLs present for both informational and commercial queries?
Viewing some stats seems to show certain sites converging in their levels of Google traffic.
If this is a test of sorts, and I've assumed it has been, then the throttling makes sense to me. As to why the throttling: I've been thinking that among pages with not much else to choose between them, Google might want to assess searcher behavior, and the throttling would serve to even out exposure and/or visitors enough to bring all the data into the same range.
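To make that idea concrete, here is a minimal sketch - purely hypothetical, not Google's actual mechanism - of how randomly rotating a set of roughly interchangeable deep-page URLs would equalize their exposure over many queries, so every candidate accumulates comparable click data. The function name and URLs are invented for illustration.

```python
import random

def simulate_rotation(urls, num_queries, seed=42):
    """Count how often each URL lands in the top slot when
    the result order is shuffled on every query."""
    rng = random.Random(seed)
    top_slot_counts = {url: 0 for url in urls}
    for _ in range(num_queries):
        order = urls[:]
        rng.shuffle(order)  # randomize the presentation order
        top_slot_counts[order[0]] += 1
    return top_slot_counts

counts = simulate_rotation(["a.example", "b.example", "c.example"], 9000)
# Each URL takes the top slot roughly a third of the time, so each
# candidate gets a similar amount of exposure - and similar click data.
```

The point of the sketch is just the statistics: without rotation, the URL stuck at position 30 gets almost no impressions, and there is nothing to learn from its click-through behavior.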
One thing I believe for sure: don't make changes to your site in response to lower-ranking SERP position changes. It seems G understands when a ranking change occurs and connects it in the future to changes on the page. If your rank changes, stick it out for a month and then, if things are still the same, make a single change and see what happens. Don't be reactive.
Off-topic question: are there any stats showing that more and more users are searching deeper into the SERPs, i.e. pages 2 and 3?
Offhand, no - only what you see in traffic to your own page 2 or 3 ranked pages. Indeed, the worst thing to do is freak out and run off into the dark.
|are there any stats showing that more and more users are searching deeper into the SERPs |
|But today, Google apparently runs statistical tests on user response to various types of results - especially in the deeper pages where impressions and clicks are very low. |
I think this is going to show where title tag writing skills are key.