Here's a link to the article - which is now four years old. [seomoz.org...]
By now, I'd say it's a foregone conclusion for many that Google is using behavior data - just where the data comes from is the question.
I suspect that it's harder to manipulate these days than it was in 2007. The safeguards against false clicks on AdSense, for instance, have taught Google a lot about how to find manipulative patterns in any click stream.
If users were spending just 30 seconds on the site, I'd say that would have been disastrous for it. (I read it as 30-60 minutes.)
GOOGLE DOESN'T USE ANALYTICS AND CANNOT RELY ON THAT DATA - but they can surely read this behaviour in a number of other ways.
Bing does it too. I tested it by accidentally ruining the template of a very popular page (popular on Bing, that is). Within a day or two the visits to that page stopped. I fixed it, got a single visit (testing the waters?), and then a day or two later they came back.
The problem is that my BR is much worse with Google, due to the crappier, third-tier keywords they send. Direct visits bounce about 15% less than Google traffic. Last time I checked, Ask.com sent the best traffic, then AOL.
Can't speak for "users", but I rarely spend more than tens of seconds on any page.
|If users were spending just 30 seconds on the site, I do say that it would have been disastrous for it. (I read it as 30-60 minutes) |
|The problem is that my BR is much worst with Google, due to the crappier, third tier keywords they send |
walkman, that is one major issue with Google right now (especially for pandalized sites). Since they are moving away from matching results to the words the searcher uses - guessing intentions, synonyms, etc. - they do send a lot of those crap visits for third-tier keywords. Earlier, I used to see popular pages getting visits from several combinations of keywords present in the page title or content.
Now they have deliberately cut down this traffic and are sending a number of visits for words that aren't present on the page, resulting in bad user experience and bounce rates, as you mention.
Their 'guessing game' is surely playing a big role in killing pages on pandalized sites.
[edited by: indyank at 5:48 am (utc) on Sep 11, 2011]
|Can't speak for "users" but I rarely spend more then tens of seconds in any page. |
So you are also contributing towards the success of Panda :)
Coming back to that article, I strongly disagree with its claim that Google Analytics is being used.
But FeedBurner might have a small role to play, as it can give them signals about whether the site is posting stuff relevant and useful to its loyal subscriber base... I'd say they would have "profiled" those subscribers, if they happen to use Google Reader or Gmail.
New sites get a baseline that's probably significantly lower than the long-term baseline will be. User metrics can 'level up' the rank of your page, but there is a glass ceiling in place, and many other metrics also have their own bonus scales. You can't ride just one to the top; it seems you need to be better than the competition in several.
sgt, I am seeing this on 11-year-old sites too. I'm not sure how long Google has been collecting or using those metrics, but the effect is similar on sites regardless of their age.
I'm also not sure how they collect this data. Some claim it's through the Google Toolbar, but just as people argue that one dept. in Google might not have access to data mined by another dept., we can make a similar case for this one too. They could even be deriving these metrics from the content itself, but who knows.
Google's main source of data for user metrics is obvious, and I don't understand why people on this forum keep ignoring it. It's the Chrome browser. It has tens of millions of regular users, easily enough to allow it to collect statistically-significant data. Unlike Analytics, it can collect user metrics for ANY SITE. I'M NOT TALKING ABOUT BOUNCE RATE! Instead, I'm talking about truly useful metrics such as:
-- A user bookmarks a page as a favorite
-- A user saves a page on their hard drive
-- A user prints out a copy of a page
-- A user returns to the page later
-- A user visits other pages on the same site
I don't understand why people keep talking about bounce rate when these other signals are obviously much more reliable.
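To make the idea above concrete, here is a purely hypothetical sketch of how such browser-level signals might be combined into a per-page score. All the signal names and weights below are invented for illustration; nobody outside Google knows how (or whether) these signals are actually weighted.

```python
# Hypothetical sketch: weighting browser-level engagement signals
# (bookmark, save, print, return visit, extra pageviews) into one
# per-page score. Names and weights are invented for illustration.

SIGNAL_WEIGHTS = {
    "bookmarked": 5.0,      # user saved the page as a favorite
    "saved_to_disk": 4.0,   # user saved a local copy
    "printed": 3.0,         # user printed the page
    "return_visit": 2.0,    # user came back to the page later
    "extra_pageview": 1.0,  # user browsed other pages on the site
}

def engagement_score(events):
    """Sum weighted signal counts for one page.

    `events` maps signal name -> count observed across all users
    of the (hypothetical) browser telemetry.
    """
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * count
               for name, count in events.items())

page_events = {"bookmarked": 2, "return_visit": 10, "extra_pageview": 40}
print(engagement_score(page_events))  # 2*5 + 10*2 + 40*1 = 70.0
```

The point of the sketch is only that rare-but-deliberate actions (bookmarking, printing) would have to be weighted far above cheap ones (an extra pageview) to matter at all.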
To indyank... yeah, I made a typo - 30-60 seconds should be minutes.
To aristotle, I definitely think Chrome is a factor in all this, but in my case few visitors use Chrome, and I can definitely see this effect for non-Chrome users.
|-- A user saves a page on their hard drive |
-- A user prints out a copy of a page
I would think the above would be statistically insignificant data to influence or play a role in Google's algos. How many users would you expect to save a webpage to their hard drive or print one out? This data would only make sense in situations such as when you offer downloadable forms or templates...
Why should saving a webpage to a hard drive be considered as a positive signal?
While it wouldn't surprise me if the correlation between bounces/time on site and ranking changes were spurious rather than causal, I can't think of a better source... More specific user actions (like printing or bookmarking), while probably measured, wouldn't occur in large enough numbers in small cases like mine and the author's to account exclusively for Google's behavior bump.
I don't see visits to other pages on the site being a factor, because my test case is a one-page website (yet I receive all kinds of bounces/time-on-site figures in GA).
Visitors returning to your page is, I believe, a strong possibility (if nothing else, GA tracks this per keyword, so we know Google thinks it's very important data).
My best guess is that Google's various tracking services work together to give the big picture of what users/websites are doing. I suspect that if a user goes from my site, which uses GA, to another site that uses GA, they could, for example, calculate the time on site from those two services. Other sources would be returning to the SERPs, which Google obviously tracks... going elsewhere while having the Google Toolbar, or perhaps using another service like Gmail (which may signify the user is done with your website).
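The cross-service "stitching" idea described above can be sketched in a few lines. This is pure speculation about mechanism, not anything Google has documented: given one user's timestamped hits across properties Google can observe, the gap between consecutive hits bounds the time spent on the earlier site, even for a single-page visit that GA alone would record as a zero-duration bounce. Site names and timestamps are made up.

```python
# Toy sketch of cross-service session stitching: infer time-on-site for
# site N from the timestamp of the user's next observed hit (another
# GA-tagged site, an AdSense page, or a return to the SERP).

def stitch_time_on_site(events):
    """events: list of (timestamp_seconds, site) for one user, in order.

    Returns {site: total inferred seconds}. The final event gets no
    duration, since nothing follows it to bound the visit.
    """
    totals = {}
    for (t0, site), (t1, _next_site) in zip(events, events[1:]):
        totals[site] = totals.get(site, 0) + (t1 - t0)
    return totals

clickstream = [
    (0, "serp"),        # user on the results page
    (10, "example-a"),  # clicks through to a one-page site
    (190, "serp"),      # back to Google 3 minutes later
    (200, "example-b"),
]
print(stitch_time_on_site(clickstream))
# {'serp': 20, 'example-a': 180}
```

Note that the 180 seconds attributed to the one-page site comes entirely from the *next* observation, which is exactly why a single-page site could still accumulate time-on-site data.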
The big question is how all this is usable for SEO. I do think, as Tedster said, this would not be easy to manipulate by, say, spam-clicking your page from the SERPs. Google probably detects unique users based on IP and/or your Google account... and probably has an organic filter that eliminates unnaturally high click spikes with statistically implausible volume.
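The kind of spike filter speculated about here could be as simple as the following sketch: collapse repeat clicks from the same IP/account pair, then flag any day whose unique-click count is a statistical outlier against the page's own history. The 3-sigma threshold is an arbitrary illustration, not anything Google has published.

```python
# Minimal sketch of a click-spike filter: dedupe by (ip, account),
# then flag days that are outliers versus the page's own history.

import statistics

def unique_clicks(day_clicks):
    """day_clicks: list of (ip, account) tuples seen in one day."""
    return len(set(day_clicks))

def is_spam_spike(history, today, sigmas=3.0):
    """history: unique-click counts for prior days; today: today's count."""
    if len(history) < 2:
        return False  # not enough data to judge
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    # Floor the deviation so a perfectly flat history doesn't flag noise.
    return today > mean + sigmas * max(stdev, 1.0)

past = [40, 45, 38, 42, 44, 41, 39]
print(is_spam_spike(past, 43))   # a normal day -> False
print(is_spam_spike(past, 400))  # a 10x spike -> True
```

Under a model like this, the "pony up thousands of friends" scheme discussed below would fail unless the extra clicks arrived slowly enough to blend into the baseline.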
I do think, however, that if one could pony up enough friends (for competitive terms this would probably have to be many thousands, IMO), in theory this could work. It might be interesting to test: in GA, right next to each keyword, it displays the recorded bounce rate and time on site for each search term, so you might be able to calibrate what Google determines to be spam clicks from this.
Probably the most practical approach, however, would be to make sure that (a) users are clicking your link from the SERPs. If that's the case, meta descriptions would no longer have to be written for Googlebot but for humans, and would need very catchy phrases ('free widget', 'top secret information', 'earth-shattering exclusive report'... that type of thing).
Once users are on the site, you then need to manipulate them into staying longer than they normally would. I don't know if splash pages/glorified menu pages/teasers would be the key, as I suspect page clicks aren't what Google tracks anymore and that they have a way of calculating time on site regardless. Perhaps an online game section on your website, stuffed with free Flash games, to up the time spent on the site? Freebies, but with bureaucratic, time-consuming forms/processes to obtain them? This is a toughy...
I definitely think it would be stupid for Google to measure this type of thing, as I often get what I want from a well-designed website right away and then go back to Google for a new search... while on badly designed websites I can spend a lot of time looking for my needle in the haystack and never find it... yet in Google's eyes I could perhaps be voting for that type of site.
|It's the Chrome browser. It has tens of millions of regular users, easily enough to allow it to collect statistically-significant data |
Surely the demographic of Chrome users is too narrow for its usage stats to be widely relied upon? Until you can include the popular "everyday" browsers, the data would be skewed in many sectors/verticals, IMO.
[edited by: Simsi at 2:12 pm (utc) on Sep 11, 2011]
indyank - You need to realize that Chrome has tens of millions of regular users. Just because you may not save pages or print them out doesn't mean that nobody else does. In fact, I do it all the time.
Simsi - Data collected by Chrome could be slightly skewed (or non-representative), but if you're suggesting that most Chrome users behave quite differently from everyone else, I seriously doubt it.
It was just an observation really - thinking maybe that webmasters (potential competitors or co-operatives) are more likely to use Chrome, and their behaviour paths might, as a result, be very different from those of typical users on some sites.
aristotle, I believe there wouldn't be many who perform those actions (saving web pages to the hard disk and printing a page). Since you are one who does, is there any reason why you save a webpage to the hard disk? And why do you think it will emit a positive signal? I just copy and save URLs to a desktop tool I maintain. There may be other ways people store a page for future reference. Would all those actions emit the same positive signal?
|Once on the site you then need to manipulate users into staying longer than they normally would.....Perhaps a online game section on your website which you stuff with free flash games to up that time spent on website? Freebies but with bureaucratic time consuming forms/processes to obtain? |
smithaa02, you are essentially talking about manipulation there, which might not help in the long run... :)
Forget the browsers and the toolbars - they wouldn't be truly representative of the average surfer...
But they can still read user behavior from actions performed on their SERPs. Their cookies are working hard to collect all the user behavior data. One of the best interviews I have read recently on this is here, and it will give you an idea of how they do it - [stonetemple.com...]
And I must say this guy Duane Forrester is far more transparent about sharing things, and the stuff he has shared is more useful... I am sure many of those points are applicable to Google as well.
[edited by: indyank at 3:42 pm (utc) on Sep 11, 2011]
|but they can still read user behavior from actions performed on their SERPS |
That's a fair observation - there are really two types of user behaviour, I guess... one relating to the searches performed and one whilst "on-site". The former has to be the easiest for Google to analyse, I would have thought, and arguably the clearer signal.
"Of course Google uses user behavior" - said to me by a Googler at the Google Dance at PubCon / New Orleans in 2005. He went on to say that they don't use it to directly affect the search results because that's too easy to manipulate, and besides, they don't own the patent on that.
Google has been yanking our chains for years. They tell us that they have a new magical algo that detects when a backlink is on an off-topic page, and they devalue it. Webmasters go crazy trying to figure out how they could possibly get this right.
Wake up. It's not magic. They know when we click on links because AdSense is present on 70% of the destination pages, or the user is using the Toolbar or Chrome. They don't devalue these links; it's just that people rarely click on them.
Site-wide links are devalued? Nah, they are only valued if users click on them. Users rarely click on them.
In-network links sometimes are valued? In-network links are often clicked on in certain networks. Abracadabra. They have value.
The real mystery is why more webmasters don't already understand this. I guess it's because we are in awe of the hocus-pocus Google says they are performing.
Oh yeah, I left out page load time.
Google has been saying for years that to rank well, we need to provide a better end-user experience. AND they've been saying that page load time has a profound effect on end-user experience.
We webmasters have mostly been ignoring this. Finally Google announces that they will eventually begin using page load time in their ranking algo. I bet they don't add it at all - it's already there as part of their many other benchmarks that test for end-user experience. Page load time already affects many of them. They've been telling us this all along. We just haven't been paying attention.
What's even worse, when Google comes right out and says it will affect ranking, instead of doing something about it, many webmasters just sit there and yell, "that's not fair!"
Site speed certainly isn't a killer. HuffPost and a few other 'new media' sites take over 10 seconds to load (mostly due to the gazillion outside JS ads) and in most cases outrank the site they took the paragraph from. When I search for a news story, HuffPost is in most cases at the top of Google's SERPs. Not sure what that means given the personalization skewing the SERPs, since I rarely visit them. I don't advise trying to imitate them, since more than speed is at play, but my site was always at 1-3 seconds. Most people should pay attention to it, Google or no Google.
You're coming at it from the wrong angle there. I think what dataguy is saying, and what I've been thinking for a while, is that Google measures the effect certain metrics have on user behaviour, not the metrics themselves. So you won't get penalised for slow page speed in itself, but you will if it increases your bounce rate. It doesn't affect Huffington Post's ranking because, being Huffington Post, surfers are prepared to wait for their pages to load, so their bounce rate doesn't suffer. But if it were some unknown blog with the same slow page speed, surfers would be much less likely to wait around, and that is what Google measures.
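The distinction being drawn here - that the measured quantity would be the behavioural outcome (bounce rate), not the raw speed number - can be shown with a toy example. The session data below is entirely fabricated: two hypothetical sites with identical 10-second load times produce very different signals because their audiences behave differently.

```python
# Illustration: identical page speed, different behavioural signal.
# A "session" is the number of pages viewed in one visit; a 1-page
# visit counts as a bounce.

def bounce_rate(sessions):
    """Fraction of visits that viewed exactly one page."""
    return sum(1 for pages in sessions if pages == 1) / len(sessions)

# Both sites load in ~10s; visitors wait for the big brand but
# abandon the unknown blog (made-up numbers):
big_brand_sessions = [3, 5, 1, 4, 2, 6, 1, 3]
unknown_blog_sessions = [1, 1, 1, 2, 1, 1, 1, 3]

print(round(bounce_rate(big_brand_sessions), 2))    # 0.25
print(round(bounce_rate(unknown_blog_sessions), 2))  # 0.75
```

Same load time, a threefold difference in the metric that (on this theory) actually feeds the ranking - which is consistent with HuffPost ranking well despite being slow.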
I understand what you're saying, and it has been clear that outside signals are being used and that big, popular sites are exempted from them. What this has done is kill the 'middle class' businesses with straight-to-the-point info and rewarded those with pop-ups and sign-here-and-click-there, irrespective of content. And we're talking about killing sites, not pages - only in the niches that Panda ran over, of course.
This also lets 'good' sites get away with lots of junk in the SERPs, since their average is still what Google considers good, while good pages from 'bad' sites are pushed down.
Lastly, the junk traffic G now sends sure isn't going to help anyone improve their metrics. But Google is happy.
Ah, the benefits of essentially having one monopoly controlling everything and changing whatever they want, whenever they want to whatever they want. As long as it works for them...
As big as all our sites might be, and however immense and clever our internal stats, we ain't but a gnat bump on Bing's a$$, much less Google's, as regards user data. And Chrome is bigger than most of us, too (and so is that GTB everyone (not me) is using for that hoary old deprecated PR)...
Google has left us behind... Bing might do that too, sometime in the future. After all, the WEB is COMMERCIAL these days - think Madison Avenue (back in the day)...