
Google SEO News and Discussion Forum

    
Does the User Experience Affect Rankings?
kidder
msg:3860305 - 10:31 am on Mar 1, 2009 (gmt 0)

I wanted to get some feedback on where the current thinking is on how the "user experience" affects current Google rankings. As things stand, Google must have a lot of detailed information on how users behave on a variety of different sites: they have AdSense, the toolbar, Webmaster Tools and Analytics working for them. Here is my question: if I have a site that is "full house" Google (WMT, Analytics, sitemap etc.) and Google can clearly see that my user experience measures well, does this offer us some advantage in the rankings?

 

aristotle
msg:3860561 - 8:30 pm on Mar 1, 2009 (gmt 0)

There's been a lot of speculation about Google using visitor behaviors as ranking factors, and it seems like a logical step. But it could be tricky to implement, so they might just be in the testing stage. I don't think there's much hard information about exactly what they're doing.

kidder
msg:3860635 - 10:54 pm on Mar 1, 2009 (gmt 0)

All things being equal, it makes sense that if a site has a low bounce rate and a higher rate of returning visitors, it should outrank a site that does not "appear" to deliver the same quality. I think Google needs to back webmasters who build their sites for users and can demonstrate real quality. They have been telling us this is what they want, so it's about time they supported it via the rankings, if they don't already. Scoring a site in this manner pretty much takes the game away from the spammers.

signor_john
msg:3860649 - 11:36 pm on Mar 1, 2009 (gmt 0)

All things being equal, it makes sense that if a site has a low bounce rate and a higher rate of returning visitors, it should outrank a site that does not "appear" to deliver the same quality.

It's the "all things being equal" part that's hard. To use bounce rate as a metric, Google would have to do an apples-to-apples comparison, or the results of the comparison would be worthless (or worse).

tedster
msg:3860650 - 11:39 pm on Mar 1, 2009 (gmt 0)

Bounce rates and other user action metrics have most definitely been studied by Google. For the past few months there's been quite a bit of webmaster and SEO buzz around the topic.

However, several Google spokespeople have said they found these signals too noisy to be effective in their algos. In addition, these numbers would be very vulnerable to spamming. Even within the environment of a single website, web management usually follows trends rather than the "hard numbers".

My prediction is that data generated from user clicks will not ever be part of any automated relevance algorithm. Many times, user clicks do not represent actual engagement. Such data can be one tool for improving satisfaction with your site -- but that satisfaction is better measured in other ways by the search engines.

kidder
msg:3860662 - 11:48 pm on Mar 1, 2009 (gmt 0)

User clicks and user experience are not the same thing - are you also ruling out user driven "quality" based results?

tedster
msg:3860668 - 11:55 pm on Mar 1, 2009 (gmt 0)

How are you proposing to measure user experience then? Doesn't a metric require a click?

kidder
msg:3860674 - 12:05 am on Mar 2, 2009 (gmt 0)

At some point it probably does, yes. It may be in the form of a "vote", but that is probably never going to happen. The point I was trying to make here is that Google wants us to build good sites, yet in so many ways the algo allows a fair bit of junk to slip through. If the end user experience is so important, then it makes a lot of sense to me that the maths should support it.

buckworks
msg:3860707 - 12:38 am on Mar 2, 2009 (gmt 0)

You could expect your user experience to affect your rankings eventually, even if the search engines couldn't assess it directly. How? Because the quality of your user experience would affect the number and kind of sites that started linking to you .... or not.

That wouldn't make an immediate difference, but the indirect effects would accumulate over time and either work for you or against you.

aristotle
msg:3860711 - 12:41 am on Mar 2, 2009 (gmt 0)

It seems to me that some types of user behavior, such as spending significant time on a site, then bookmarking it as a favorite and returning to it later, could be a useful indicator. Some people say that Google developed its Chrome browser to be able to collect this type of information.
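Just to illustrate the kind of signals I mean, here's a toy sketch (the log format and numbers are invented; nobody outside Google knows what the toolbar or Chrome actually reports) that turns visit records into average dwell time and repeat-visit counts per site:

from collections import defaultdict

# hypothetical toolbar-style log: (visitor_id, site, seconds_on_site)
log = [
    ("u1", "goodsite.example", 240), ("u1", "goodsite.example", 180),
    ("u2", "goodsite.example", 300),
    ("u1", "thinsite.example", 8), ("u2", "thinsite.example", 5),
]

seconds = defaultdict(list)     # dwell times per site
visitors = defaultdict(set)     # distinct visitors per site
repeat_visits = defaultdict(int)
seen = set()

for visitor, site, secs in log:
    seconds[site].append(secs)
    visitors[site].add(visitor)
    if (visitor, site) in seen:     # this visitor has been here before
        repeat_visits[site] += 1
    seen.add((visitor, site))

for site in seconds:
    avg = sum(seconds[site]) / len(seconds[site])
    print(f"{site}: avg dwell {avg:.0f}s, "
          f"{repeat_visits[site]} repeat visits from {len(visitors[site])} visitors")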

pavlovapete
msg:3860726 - 1:29 am on Mar 2, 2009 (gmt 0)

I'm at a loss to understand how Google could measure relevance of the SERPs without any user experience feedback.

kidder
msg:3860729 - 1:44 am on Mar 2, 2009 (gmt 0)

Right now it's pretty much measured by links, which is a very gameable, messy, user-based "method" if you like. Google should be in the business of getting the best, most relevant sites into the top positions. As things stand, I can write a page of content on buying a blue widget, put in some AdSense code, buy some links to the page, and outrank the guy who actually sells the blue widgets... If the user had his say, my page would in most cases be quickly moved out of the equation unless it added some very real value.

Robert Charlton
msg:3860738 - 2:02 am on Mar 2, 2009 (gmt 0)

Links and PageRank are in effect user experience feedback. As such indicators get skewed by attempts at manipulation, Google tries to refine the feedback.

For a while I felt that the Yo-Yo rankings we were seeing were based on some measure of user satisfaction, but bounce rates, e.g., would be an extremely ambiguous signal at best. In some cases, a short time on the page might not be a suggestion of dissatisfaction at all. It might suggest you got what you wanted quickly and left... or that you liked what you saw and bookmarked it.

tedster made a fascinating suggestion in the Traffic Throttling [webmasterworld.com] thread, to the effect that some form of traffic throttling (manifested in Yo-Yo rankings) may be a way of essentially normalizing traffic so Google can assess the rate of backlink growth and see if it looks manipulative.

In my wilder moments, I see this as a human editorial review factor. "Yes, the algo says this url should jump up. But when I eyeball the backlinks, something smells funny. Let's cap traffic (put the site on a yo-yo ranking) and see if the backlink growth sustains itself. Remind me to check it again in 3 months." Or so goes my fantasy.

This would be an attempt ultimately to tie link growth more closely to user experience. Google probably tries to correlate all factors it can measure, looking for patterns that hold up under different scenarios.
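To put rough numbers on that fantasy (purely my illustration, with invented figures and an arbitrary threshold, not a known Google mechanism): hold a site's traffic roughly flat and ask whether new backlinks keep arriving at a rate its visitors could plausibly explain.

# traffic held flat by the hypothetical "throttle"; backlink counts are made up
weekly_visits = [1000, 1000, 1000, 1000]
weekly_new_links = [5, 40, 160, 600]

# naive assumption: only a small fraction of visitors ever link to a page
expected_links_per_visit = 0.01

for week, (visits, links) in enumerate(zip(weekly_visits, weekly_new_links), start=1):
    expected = visits * expected_links_per_visit
    verdict = "looks manipulative?" if links > 5 * expected else "plausible"
    print(f"week {week}: {links} new links vs ~{expected:.0f} expected -> {verdict}")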

signor_john
msg:3861015 - 3:28 pm on Mar 2, 2009 (gmt 0)

If the end user experience is so important, then it makes a lot of sense to me that the maths should support it.

Google does use human evaluators for benchmarking purposes, and maybe to provide input for an automated "black box" that looks for commonalities between great/so-so/spammy pages. And wasn't there a Google patent application or paper a few years ago that talked about being able to use a few hundred hand-picked "seed" sites to project trust (via six-degrees-of-separation links) onto sites across the Web? Concepts like those are far more sophisticated--and far less subject to manipulation--than brute-force "time on site" or clickthrough metrics.
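As I understand it, that seed idea works roughly like the sketch below (a toy version of the concept from the TrustRank paper, with an invented five-page graph and an arbitrary decay factor; whether or how Google actually implements it is unknown): trust starts at a few hand-picked seeds and decays as it flows along outbound links, so pages many hops from any seed, or linked only from junk, end up with very little of it.

# hypothetical tiny web graph: page -> pages it links to
links = {
    "seed-news.example": ["blog-a.example", "blog-b.example"],
    "blog-a.example": ["shop-c.example"],
    "blog-b.example": ["shop-c.example", "spam-d.example"],
    "shop-c.example": [],
    "spam-d.example": ["spam-d.example"],
}
seeds = {"seed-news.example"}
decay = 0.85  # how much trust survives each hop

trust = {page: (1.0 if page in seeds else 0.0) for page in links}
for _ in range(20):  # iterate until roughly stable
    new_trust = {page: (1.0 if page in seeds else 0.0) for page in links}
    for page, outlinks in links.items():
        if not outlinks:
            continue
        share = decay * trust[page] / len(outlinks)
        for target in outlinks:
            new_trust[target] += share
    trust = new_trust

for page, score in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")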

londrum
msg:3861038 - 3:48 pm on Mar 2, 2009 (gmt 0)

we can imagine the kinds of things that will mess up their user experience stats. a user might open five sites in his browser's tabs, and have four of them sitting there for twenty minutes before he even looks at them. how are google going to work out the bounce rate from that?

someone might also go off to the toilet for ten minutes after loading a page, or get called away to answer the phone before he's finished reading it.

the stats that google get about stuff like that can never be trusted, i don't think, unless the traffic is so high that all the peculiarities get evened out. so i don't see how they can trust it in the algo.
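as a quick illustration (made-up numbers): a couple of tabs left open for twenty minutes drag the average time-on-page way up, and only a huge sample, or a robust statistic like the median, smooths that back out.

from statistics import mean, median

# seconds on page; the 1200s entries are tabs left open, not real reading
dwell_times = [35, 42, 28, 51, 1200, 38, 47, 1200, 33]

print(f"mean:   {mean(dwell_times):.0f}s  <- dragged up by the idle tabs")
print(f"median: {median(dwell_times):.0f}s   <- closer to real behaviour")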

julinho
msg:3861048 - 4:05 pm on Mar 2, 2009 (gmt 0)

Google clearly expressed that on-page factors and PageRank had shortcomings, and there was a need to develop other techniques.

Methods and apparatus for employing usage statistics in document retrieval [appft1.uspto.gov]

[0008] Each of these conventional methods has shortcomings, however. Term-based methods are biased towards pages whose content or display is carefully chosen towards the given term-based method. Thus, they can be easily manipulated by the designers of the web page. Link-based methods have the problem that relatively new pages have usually fewer hyperlinks pointing to them than older pages, which tends to give a lower score to newer pages.

[0009] There exists, therefore, a need to develop other techniques for determining the importance of documents.

How are you proposing to measure user experience then? Doesn't a metric require a click?

From the patent: "those skilled in the art will recognize that there exist other such type of information and techniques consistent with the invention"

Google certainly have lots of people skilled in the art (as well as an Army of Statisticians) to implement the patent.
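As a guess at the general shape of it (only my sketch of how usage statistics might supplement link-based scores; the weights and numbers are invented, and the patent doesn't spell out a formula like this):

def combined_score(link_score, usage_score, usage_weight=0.3):
    # blend a link-based score with a usage-based score (both in the range 0..1)
    return (1 - usage_weight) * link_score + usage_weight * usage_score

# hypothetical pages: an old well-linked page vs. a new page that users love
pages = {
    "established-page.example": {"link_score": 0.9, "usage_score": 0.4},
    "new-but-popular.example":  {"link_score": 0.2, "usage_score": 0.9},
}

for page, s in pages.items():
    print(page, round(combined_score(s["link_score"], s["usage_score"]), 2))

The usage term lets a newer page with few links close some of the gap, which is exactly the shortcoming of link-based methods that paragraph [0008] describes.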
