|Objectively Measuring Quality of Google SERP Results|
Like many, I've been frustrated at the quality of results, so I was curious whether visitors Google sends to my site were only finding me after going through a number of pages of results.
It looks to me like I'm getting more and more situations where I'm being found on page 5-10, suggesting that people are not finding what they want in the earlier pages.
Wondering if there's some way to use that information to get a general trend of what searchers are doing, and whether they're finding the first pages of results worse.
You can't measure it, and truthfully neither can Google, since many searches are brand new and there's no way to gauge, say, people's reaction to them. Google can have their own criteria and show Google as #1 in quality; Bing can have their own criteria and show Bing as #1.
Then SERPs change every half an hour or whatever. The idea, presumably, is that between ads and organic results users find what they're looking for, more or less.
Furthermore... what appears on page 5 to you, may show up on page 2 to me today, or page 10 tomorrow... with personalized search the way it is.
I really don't think there is any way to measure efforts in Google SERPs anymore. Instead, we try to focus on how to form a LASTING relationship with each visitor they DO send! :-)
Here's a goal we strive for every day: convince people to come to us FIRST next time, instead of finding us on Google or any other search engine.
Old skool crap like -
- Bookmark our site
- Bookmark this page
- Personalized welcome messages
- Share on Facebook or ANY other social network
- Join our free newsletter
- Subscribe to updates on this article
- etc etc
We treat EVERY visitor from Google as if they are the last visitor from Google.
|Furthermore... what appears on page 5 to you, may show up on page 2 to me today, or page 10 tomorrow... with personalized search the way it is. |
Golly, don't you have access, in whatever analytics you use, to the rank of the link within the Google SERP that the visitor clicked on?
You don't need analytics. It's right in your raw logs. But it's expressed as "page=" rather than "number=" which raises the interesting question of what happens if the visitor's search prefs are set for more than 10 items per page. Does "page 2" mean their page 2, or "between 11 and 20"?
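If your logs still record the full Google referrer, a rough sketch of pulling the SERP page out of it might look like this. This assumes the referrer carries Google's `start=` result-offset parameter (rather than a literal page number), which is exactly why the "their page 2 or results 11-20?" question matters: the offset only maps to a page once you pick a results-per-page setting.

```python
from urllib.parse import urlparse, parse_qs

def serp_page(referrer, results_per_page=10):
    """Estimate which SERP page a Google referrer came from.

    Assumes the referrer still carries a 'start=' offset parameter,
    e.g. http://www.google.com/search?q=foo&start=40. Returns a
    1-based page number, or None for non-Google referrers.
    """
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None
    qs = parse_qs(parsed.query)
    # 'start' is the zero-based result offset; missing means page 1.
    start = int(qs.get("start", ["0"])[0])
    return start // results_per_page + 1

# start=40 is page 5 at the default 10 results per page...
print(serp_page("http://www.google.com/search?q=widgets&start=40"))  # 5
# ...but the same offset is page 3 for a visitor set to 20 per page.
print(serp_page("http://www.google.com/search?q=widgets&start=40",
                results_per_page=20))  # 3
```

The ambiguity the post describes lives entirely in that `results_per_page` argument: the log tells you the offset, not the visitor's preference.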
And forget trying to make sense of page numbers in an image search. The layout seems to change every other week.
If you do show up on page 5, it would be interesting to know whether the user patiently clicked one item after another through those intervening pages -- meaning either that the snippet text doesn't convey the right information, or :: ahem :: that the visitor is stupid -- or whether they kept scrolling through the SERPs until they got to your snippet and said "Aha! That's what I'm looking for!"
And then you'd want to know whether said visitor entered an anomalous search phrase, or whether most people using the same terms and getting the same results were likewise looking for something buried on page 5. Did the others just give up, or did they glance at page 1 and try a different search phrase?
I think you have to look at the whole picture.
If you don't have quality click/conversion tracking setup then you really have no way of knowing.
From my experience in the past, most traffic coming from deeper in the search is low quality/spammers. There is a big difference between #1 and #7, for example. Not only do you get more traffic with #1, but you get a much higher percentage of buyers/joiners. Each query is different, but this is a common thread among most keywords.
These days, Google has made it impossible to really determine your exact organic rank, and I'm sure that was by design. If you want to play, you have to pay.
User search behavior could be changing, but I doubt it's changed that much. I still think the majority of buyers click only the top few links and are too impatient to dig much deeper.
|Wondering if there's some way to use that information to get a general trend of what searchers are doing, and whether they find the first pages of results worse. |
I don't have any good ideas about how to measure overall SERP "quality" with access only to the level of data that I have. At a mass scale we'd probably need to combine data about traffic coming only from Google Search (who makes that available?) with SERP data of the kind that Searchmetrics and Sistrix collect.
Other than a huge data mining project, we only have anecdotal measures from our individual sites. I've been noticing those "deep clicks" for several years and they really puzzle me. It doesn't seem like they could be real user behavior. Over the past couple years I've been suspecting some kind of automated activity - or possibly someone's MTurk job.
|I've been noticing those "deep clicks" for several years and they really puzzle me. It doesn't seem like they could be real user behavior. Over the past couple years I've been suspecting some kind of automated activity - or possibly someone's MTurk job. |
You have seen the quality rater handbook; it's essentially rigged:
"This would be a good match for this query. Now, based on our guidelines, rank the results for this query."
It shows whether Google reached its stated goal, not whether SERPs are better for average users.
Maybe they use the number of clicks on ads -- more ad clicks normally might mean worse SERP quality (with some caveats) -- and they seek to improve, so that users get their answers in the SERPs, based on that.
Or count the number of people hitting the back button, coupled with time spent on the result, and compare that?