
Google SEO News and Discussion Forum

    
Google and User Metrics - is that why the SERPs are so poor?
coachm - 5:22 pm on Jun 28, 2012 (gmt 0)

Lots of discussion on whether and how google might be using user metrics as part of their ranking algos. Been driving me nuts.

The short: IF Google is using visitor metrics as a significant part of their algo, it might explain why the serps seem to have gotten worse, because visitor metrics are an exceedingly poor and circular way of evaluating sites.

The long:
User behavior on a site is complex. But basically, it comes down to whether the page yields what the user is looking for. In effect, it's a result of the match between want and find.

Simple.

But here's the thing. What affects that match? Certainly there's on-page stuff -- junk is not going to fit a match. BUT the major determinant of match is up to Google.

Google chooses both the title and the SERP description, so if it changes things (which it has started doing, and does a poor job of), the match gets worse.

More important, the degree to which Google can get inside the searcher's head will determine whether the results are a match for the user's intentions. To the degree Google succeeds, there's a match (all things being equal). If it doesn't get it right, then the sites it shows (and that get clicked) are going to get high bounce/exit rates.

In effect, GOOGLE determines the fit.

Other factors also enter into it, of course, for various types of visitors. Inbound links, for example, affect the match (even if Google is left out of the equation). Good descriptions and links allow good matches, and then good user metrics. Bad descriptions and you get more unmatched traffic.

The point being that while user behavior certainly has something to do with onpage factors, it's NOT under the control of the webmaster. You can have an amazingly engaging high quality website, but if Google pushes the wrong visitors to it, you'll get poor user metrics.

So user metrics have much less to do with quality, or even with whether people like a page, and everything to do with the match, which is heavily controlled by Google.

So, if google relies on user metrics, while it may be evaluating some onpage aspects, what it's really doing is actually evaluating ITS OWN ABILITY TO MATCH USER INTENT to page content.

So it goes round and round. Google has trouble with certain pages and sends irrelevant traffic. High bounce rates. Google sees high bounce rates and then drops the page in the SERPS REGARDLESS of quality.

You can't have an algorithm assess something when, in fact, you are evaluating the success of the algorithm itself. It doesn't work. It's a logical error. It's, in effect, a weird loop.
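To make the loop concrete, here's a toy sketch in Python (all numbers invented, and obviously not Google's actual algorithm): a genuinely good page gets mostly irrelevant traffic because the query match is poor, users bounce, and the engine then demotes the page based on the very bounce rate it caused.

import random

random.seed(1)

# Hypothetical numbers, purely for illustration:
PAGE_QUALITY = 0.9      # 90% of *relevant* visitors are satisfied by the page
MATCH_ACCURACY = 0.3    # the engine sends mostly irrelevant traffic
EXPECTED_BOUNCE = 0.3   # bounce rate the engine treats as "normal"
rank_score = 1.0        # score the engine assigns to the page

for week in range(1, 6):
    visits, bounces = 1000, 0
    for _ in range(visits):
        relevant = random.random() < MATCH_ACCURACY
        # Irrelevant visitors bounce no matter how good the page is;
        # relevant visitors bounce only if the page actually fails them.
        satisfied = relevant and random.random() < PAGE_QUALITY
        bounces += 0 if satisfied else 1
    bounce_rate = bounces / visits
    # The engine reads the bounce rate as a page-quality signal and demotes
    # the page, even though the bounces mostly reflect its own poor matching.
    rank_score *= (1 - bounce_rate) / (1 - EXPECTED_BOUNCE)
    print(f"week {week}: bounce rate {bounce_rate:.0%}, rank score {rank_score:.3f}")

Run it and the rank score collapses week after week even though PAGE_QUALITY never changes -- the metric ends up measuring the matching, not the page.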

PS. One could argue that it's the webmaster's fault if Google can't get what the page is about, or the searcher's fault if s/he can't search well. Both could be true, but for the first, given the last 18 months, I don't believe anyone who says they know how to tell Google what their pages are about so they get the "right" visitors.

Ok. Oversimplified a bit. And if Google is NOT using user metrics much (and it shouldn't), then all bets are off. But if it IS, it might explain why absolutely dreadful sites are showing up.

 

tedster - 5:43 pm on Jun 28, 2012 (gmt 0)

I'd say Google and Bing are both working with user metrics. Or more precisely, I'd say they are both LEARNING how to use this data, as well as what data to collect. In Google's case at least, I think they are intentionally sacrificing SERP quality in the short term so they can learn what they need to learn for the long term. They need to replace dependence on all the ranking factors that can be easily manipulated.

In fact I think this machine-learning has been going on for several years already and it's now accelerating. Someone here recently posted that "search is still in its infancy" - and it is. So we shouldn't hope for the old days, just because we like the familiar feeling it gave us. Search didn't work all that well back then, either ;)

coachm - 5:48 pm on Jun 28, 2012 (gmt 0)

Ted, do you have any thoughts on how Google can use user metrics without ending up actually evaluating itself circularly? Since behavior is so linked to whom Google sends, I'm not sure how you'd get around that.

tedster - 5:57 pm on Jun 28, 2012 (gmt 0)

Machine learning is all about circularity of many kinds - and Google PREFERS machine learning over any other approach, it seems. Yes, these days Google can unintentionally sabotage a site with the kind of thing you describe. Over time, they will learn how not to do that. I'd say it's already happening less than it did.

And in the meantime, I think we all need a hard-nosed acceptance of what Google's job really is. They're not trying to rank ALL the best pages - just some good pages that give their users what they are looking for. Some great pages definitely slip through the cracks. They always did.

aristotle - 6:13 pm on Jun 28, 2012 (gmt 0)

Yes, Google sometimes mis-matches a page to what the searcher really wanted. In such cases the user metrics will tend to show that it is a mis-match. So the Google algorithm will learn that it's probably a mis-match and drop the ranking of the page for that particular search term.

On the other hand, sometimes the page will be a good match for what the searcher wanted, and the user metrics will so indicate. In this case, the algorithm will improve the ranking of the page for that term.

So over time, as more user data is collected, the number of mis-matches will be reduced and the SERPs should improve.
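If it works roughly the way aristotle describes, the key detail is that the learning happens per search term, not per site. A minimal sketch of that idea in Python (hypothetical names and numbers, not any real ranking system):

from collections import defaultdict

# Satisfaction stats tracked per (query, page) pair.
stats = defaultdict(lambda: {"visits": 0, "satisfied": 0})

def record_visit(query, page, satisfied):
    s = stats[(query, page)]
    s["visits"] += 1
    s["satisfied"] += int(satisfied)

def match_score(query, page, prior=0.5, weight=20):
    # Smoothed satisfaction rate: with little data it stays near the prior
    # and only drifts as real user metrics accumulate.
    s = stats[(query, page)]
    return (s["satisfied"] + prior * weight) / (s["visits"] + weight)

# The same page is a mis-match for one term and a good match for another.
for _ in range(200):
    record_visit("gourmet dining", "burgerjoint.example", satisfied=False)
    record_visit("cheap fast burgers", "burgerjoint.example", satisfied=True)

print(match_score("gourmet dining", "burgerjoint.example"))     # ~0.05: demoted for this term only
print(match_score("cheap fast burgers", "burgerjoint.example")) # ~0.95: promoted for this term

The page loses visibility only where the metrics say it mis-matched, which is the optimistic reading of how more data would improve the SERPs.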

santapaws - 6:35 pm on Jun 28, 2012 (gmt 0)

It's simply not true that search is currently in its infancy. It's actually already past maturity and gone to the old folks' home. There is simply no way Google will ever have an unbiased organic search again. The model is clear: wean users off the idea of what they expect from a search engine so they can make more and more money off the back of that. Going forward you will not find a giant leap in AI giving you amazing results from the organics. What you will find is more and more intelligently focused results for their own products, and less and less choice. And of course more and more ever more loosely related results.
Information will be served from Google-approved sources such as Wikipedia and Google circles. You will find all the old mom-and-pop info sites gone. They will not be found by anyone who doesn't already know they exist. Don't forget search will move away from the desktop and onto life devices such as TVs and phones, where most results will just be served on a spoon. Like the often-repeated Google mantra of the right answer. This will be what people in the future will expect, they ask a question and get one result served, most probably an answer with a commercial affiliation/benefit to the supplier of that result. So the idea of search being in its infancy simply doesn't hold water; for Google, search means something totally different now.
Yes, search did indeed work in the old days. You searched a phrase and got 1000 results focused on what you searched for. Yes, lots of spam along with it, but lots of right results too. Much like cameras at high ISOs: if you want a detailed image you have to live with some noise; turn on noise reduction and while your image might look a little prettier, you have lost all the detail along with the noise.

Leosghost - 6:48 pm on Jun 28, 2012 (gmt 0)

I think a large part of their problem is that they have forgotten the balance in the equation*..

They are too busy trying to track the visitor and what the visitor is doing ..and have taken their eye off the quality of the websites that are actually being put to the top of some of their serps..

Or as I said in another thread, they are giving ehow a pass for some reason, which IMO is either that it makes them shedloads of money ..or that they are one day going to absorb it and use it to make them even more..

Or it may well be as santapaws says and they have the serps they want, and are already moving to tell us what they want us to know, and are "window dressing" the "telling", whilst monetising it as fast as they possibly can..

Panthro - 7:55 pm on Jun 28, 2012 (gmt 0)

@santapaws - I understand your point, but if search is in the old folks' home then we should be having a funeral soon. That may be the case, but I don't personally believe search is dead. It may become something unrecognizable from what it was at its birth, and in many ways it already is, but that does not make it dead.

Google search needs to keep bringing in loads of money to GOOG through advertising or whatever other ways they come up with to monetize it. To bring in more big money, it needs to get more serious and that means looking more like TV, imo, with the big boys dominating the space.

This will be what people in the future will expect, they ask a question and get one result served, most probably an answer with a commercial affiliation/benefit to the supplier of that result.


And I think you're right on with this. I always imagine the future to look a little something like this - [youtube.com...] (can't find the full clip from this episode!)

londrum - 8:04 pm on Jun 28, 2012 (gmt 0)

if we relied on user metrics in other walks of life, like rating restaurants, then the best restaurant in town would be McDonalds, because they have the biggest brand, highest footfall and return visits -- all things which google presumably uses in their algo.

McDonalds is probably the most popular restaurant, for sure, but that doesn't mean that it's the best. maybe google is confusing the two words. or maybe they have made a deliberate choice to return the most popular sites, instead of the best (which is much more likely, in my opinion). that's probably why big brands are getting a boost.

tedster - 8:28 pm on Jun 28, 2012 (gmt 0)

That seems over-simplified to me. Clearly we're not just talking about the volume of users. User data for the "best restaurant" would include more than volume. And I'm sure Google and Bing would both use a lot of user data signals, learning over time what they indicate in real-life terms.

At the same time, I do get your point - that certainly would be one type of trap that search engines need to avoid. The potential error seems so obvious that I'm sure they are taking the possibility into account.

londrum - 8:32 pm on Jun 28, 2012 (gmt 0)

what other metrics are there... backlinks? McDonalds have the most column inches in the newspapers and on TV too. that is the real-world equivalent of backlinks.

They make the most sales, the most profit (probably).

The only thing that McDonalds couldn't win is a human review.

onepointone - 8:42 pm on Jun 28, 2012 (gmt 0)

Personally, I don't mind the McDonalds of the world being listed 1st that much.

What I don't like is the same site listed over and over on the same page (the fancy name is host crowding, I guess..).

Can you imagine a travel guide, restaurant directory, or any kind of directory, 'real' yellow pages, any of us 'peons' websites, etc.,... any of those doing that? They'd be laughed at.

G can put up dancing hamsters and it's all good....

aristotle - 8:46 pm on Jun 28, 2012 (gmt 0)

To some people, "best" could mean fast service, low prices, tasty food, relaxed environment, and playgrounds for kids. If they keep coming back (repeat visits), that would be a signal of their satisfaction.

Leosghost - 8:48 pm on Jun 28, 2012 (gmt 0)

Dancing hamsters ..mmmmmmmm ..tasty..grilled or fried ? :) it may be "all good" , but some cuisine and recipes are better..than others ..

santapaws - 8:32 am on Jun 29, 2012 (gmt 0)

leosghost, now you're talking variety, and I think it ties in nicely with the host crowding thread. Focused variety is what the mature Google algo gave, and it has now sadly been retired. The smoke-and-mirrors rhetoric we hear constantly now, about it all being rosy in the future for a little pain in the short term (and yeah, that's the oldest political trick in the book), is another way of saying the future means looser and looser related results with less and less variety.

tedster - 11:46 am on Jun 29, 2012 (gmt 0)

It's going to come down to what users respond best to. I'll be personally upset if it stays like this for too long and Google still maintains market share. I know I'm not the "average user", but I don't think I'm different by THAT much.

santapaws - 6:17 pm on Jun 29, 2012 (gmt 0)

Can I ask what you would expect to see next if things are said to have improved?
We have been searching for life across the universe for decades yet don't even know what it really means. It becomes quite convenient to have a moving target to aim at: wherever you shoot, move the target there. What puzzles me is at what point, at what tweak, Google decided they had just made a backwards step for the sake of the future. I don't recall any announcements that this or that algo push would see things appear worse for now. I only recall each tweak being followed by an announcement of improvements.

coachm - 8:18 pm on Jun 29, 2012 (gmt 0)

Aristotle:
So over time, as more user data is collected, the number of mis-matches will be reduced and the SERPs should improve.


Theoretically, but how does Google know whether poor user metrics are a result of faults in the algo or, in fact, because the page/site is poor?

Identifying why user metrics are poor for a site doesn't result in better serps, unless the cause IS the algo.

coachm - 8:25 pm on Jun 29, 2012 (gmt 0)

Let me see if I can clarify. I own superyelp, the largest restaurant review site, with 70% of the market. Most users don't know there are other review sites.

In a drunken stupor, one of our professional reviewers lists the McDonald's as offering gourmet food and a romantic atmosphere.

Now, as a result, 100 people (who don't know what a McDonald's is) head over. Some of them will be disappointed just at seeing the place, and leave (bad user metrics). Others order, taste the food, and leave after a minute or two....

So, Superyelp says: we want to improve our site, so let's look at the user metrics. But they don't know whether the poor metrics are a result of a drunken fellah, or that the McDonald's really sucks. If they misread it, they have problems.

But more... let's say they look at the metrics, assume that it's the fault of the McDonald's, and downgrade its rating so it appears rarely. They don't just penalize it for those terms but overall. So, although the fault is with the drunken reviewer (their own system), they effectively remove visibility for the McDonald's in its entirety.

If they downgraded the resto just for the terms gourmet or romantic, that makes sense (almost like downgrading a single page), but if they globally reduce visibility for any search term (a la downgrading an entire web domain), then they will end up with worse SERPs overall.

?
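Put in code, the difference coachm describes looks something like this toy sketch (made-up names and scores, just to show why a per-term demotion and a domain-wide one behave so differently):

# Query/page scores before any demotion (hypothetical):
scores = {
    ("gourmet food", "mcdonalds.example"):       0.8,  # inflated by the drunken review
    ("romantic dinner", "mcdonalds.example"):    0.7,  # also inflated
    ("fast cheap burgers", "mcdonalds.example"): 0.9,  # genuinely a good match
}

def per_term_demotion(scores, bad_terms, factor=0.2):
    # Demote only the query/page pairs whose user metrics showed a mis-match.
    return {k: v * factor if k[0] in bad_terms else v for k, v in scores.items()}

def sitewide_demotion(scores, bad_domain, factor=0.2):
    # Apply one global multiplier to every query the domain appears for.
    return {k: v * factor if k[1] == bad_domain else v for k, v in scores.items()}

print(per_term_demotion(scores, {"gourmet food", "romantic dinner"}))
# "fast cheap burgers" keeps its 0.9 -- the site stays visible where it belongs.
print(sitewide_demotion(scores, "mcdonalds.example"))
# every score drops to a fifth -- even the well-matched query vanishes from view.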

aristotle - 8:45 pm on Jun 29, 2012 (gmt 0)

Theoretically, but how does Google know whether poor user metrics are a result of faults in the algo or, in fact, because the page/site is poor?


If you read my whole post, you will see that I was talking about mis-matches. And yes, you're right -- many of the mis-matches are Google's fault. But it's a mis-match between the page and a specific search term. For a different search term, the page might be a good match, and in that case the user metrics should usually be better.

lucy24 - 9:41 pm on Jun 29, 2012 (gmt 0)

One could argue that it's the webmaster's fault if Google can't get what the page is about, or the searcher's fault if s/he can't search well.

The first argument is easier to support if the webmaster knows what people are searching for. If all you know is that the visitor searched for, uh, something ... then you still don't know if the searcher is an idiot* or if there's some aspect of the page that needs a simple tweak.

Oops. Different thread.


* The visitors to one of my e-book pages are idiots. The thing's got a description, fer ### sake. But I note with interest that google has apparently taken a dislike to me, because an exact-title search really should crop up a lot higher. :( Maybe they don't realize that all those other pages are the identical book and will get equally swift bounce rates. Or am I running afoul of filters by saying that the book doesn't contain {what the searchers think it contains}?

santapaws - 1:01 pm on Jun 30, 2012 (gmt 0)

Am I being old-fashioned to think it would be helpful for Google to analyse a page for ranking benefit and not just for penalty trippers? What happened to all those clever patents that looked at the actual content of a page and could tell if it was strong for a term, like the infant Google did? I mean, it would bin all those ridiculous Facebook results for a start. They wouldn't need a special algo just to find parked pages either.

gehrlekrona - 2:14 pm on Jun 30, 2012 (gmt 0)

What I tried to say in my last post was that I am sure Google is using user metrics, and more than we think they are. I think this is one of the reasons the SERPs are changing all the time and that Google Suggest keeps changing all the time. Google can steer you into whatever they want to get metrics to evaluate, and that in real time. They don't need a Like button or G+ to see which pages people like or not; they can just use user metrics. Like someone said, there is a lot of room for error because they don't really know why someone didn't like a page, so the results could be off.
In fact, I think that this is why Amazon, Wiki and other brands stick: because they are "liked" by people. They go there and stay there.

diberry - 3:30 pm on Jun 30, 2012 (gmt 0)

So, if google relies on user metrics, while it may be evaluating some onpage aspects, what it's really doing is actually evaluating ITS OWN ABILITY TO MATCH USER INTENT to page content.


Very well put.

In Google's case at least, I think they are intentionally sacrificing SERP quality in the short term so they can learn what they need to learn for the long term. They need to replace dependence on all the ranking factors that can be easily manipulated.


And this makes good sense. But to make this a success, isn't Google eventually going to need a lot more UMs than they can collect from Chrome or buy from ISPs? I mean, if they had access to full, un-fudged analytics from every site, they could easily tell when the bounce rate is their own fault or the site's. But even if they used the data from Google Analytics, that would be incomplete and subject to manipulation. So how do they reckon they'll ever have a complete enough set of UMs on enough sites to produce quality SERPs this way? Forgive me if this has been discussed ad nauseam elsewhere and I just didn't see it.

tedster - 5:04 pm on Jun 30, 2012 (gmt 0)

As I see it, they have great confidence in their machine learning over large data sets to eventually reach a high level of accuracy. I think both Panda and Penguin were created via machine learning, and even earlier, lots of smaller bits of the algo were too.

In fact, one company I work with spent the last three years training their own machine learning algorithm to clean up large masses of data. The success has been fantastic, though accuracy is not 100% - nothing ever is, after all. It's just that machine made errors are often quite laughable compared to human error. Remember IBM's Watson on Jeopardy? It trounced the human players, but when it made a mistake it was a doozy!

indyank - 5:38 pm on Jun 30, 2012 (gmt 0)

In Google's case at least, I think they are intentionally sacrificing SERP quality in the short term so they can learn what they need to learn for the long term.


But they never claim that. They actually claim the quality of SERPS is far better now and their users are loving it. Are they lying when they make such claims while, as you say, they intentionally sacrifice SERP quality in the short term so they can learn what they need to learn for the long term?

gehrlekrona - 5:56 pm on Jun 30, 2012 (gmt 0)

@indyank, that would be a loud "YES" to your question. I do not trust anything that comes out of Google anymore. They skew reality, and I am not sure how they can measure users' "love" of SERPs.
I am sure, like I said before, that Google is using user metrics for their machine learning. Like someone else said, they need to move away from anything that can be manipulated by links and other old-fashioned ways of measuring popularity. Now all they need is themselves.

tedster - 7:15 pm on Jun 30, 2012 (gmt 0)

But they never claim that. They actually claim the quality of SERPS is far better now and their users are loving it. Are they lying when they make such claims

Actually, I think it was Amit Singhal who did say something like that last year. Unfortunately, the search results are so rough right now I can't find the link to it ;)

I doubt that they're lying - I've never heard an outright lie, just spin. So I'm sure they have SOME data that shows Panda and Penguin changes in particular have helped their user stats.

There isn't much information around to indicate some great mass of user dissatisfaction with Google, after all - just webmaster dissatisfaction.

santapaws - 6:26 am on Jul 1, 2012 (gmt 0)

A lot of people coming online now have nothing else to compare the SERPs to. A lot of the people dissatisfied right now are webmasters, yes, but that's because these are the very people who have good points of reference.
