


Click-through Rate, Pageviews, and Time on Site

     
5:31 pm on Nov 5, 2015 (gmt 0)

Preferred Member

10+ Year Member

joined:Aug 30, 2002
posts:522
votes: 0


It's pretty well accepted that how a user interacts with your page will impact how Google ranks it in the future.

Lots of bounces could lead to lower rankings.


If I have one page on a site that receives a lot of traffic, but nearly all of that traffic bounces, does that have the potential to impact the entire domain, or just that one page?

The page is about how to tell a real widget from a fake one. By its nature, people will search, get their answer, and leave.

We sell widgets, but this traffic is shoppers who already have a widget in hand and want to find out whether it's fake or not.

Should I just get rid of the page?
10:01 pm on Nov 5, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2630
votes: 191


It is not the bounce by itself. I have a page with a very, very high bounce rate (because of the nature of the information it provides), but it has not harmed the site.

Personally, I would not get rid of the page - have you considered that it may support the site, perhaps contributing to it being seen as an authority on that particular widget?
Also - if visitors who read the page bought a fake widget, maybe there is a chance they will buy a real one from you?
10:35 pm on Nov 5, 2015 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14871
votes: 478


It's pretty well accepted that how a user interacts with your page will impact how Google ranks it in the future.


The herd is wrong on that point.

All the so-called "experiments" are flawed. The experimenters admit as much in their blog posts, with disclosures of how links could have affected the outcome and a CYA statement that correlation is not causation. But those are usually two or three sentences in a fifteen-paragraph post. It's a deceptive practice to claim one thing while conceding, in a short aside, that the claim may be incorrect.
10:53 pm on Nov 5, 2015 (gmt 0)

Preferred Member

10+ Year Member

joined:Aug 30, 2002
posts:522
votes: 0


aakk9999 - good post, with good points to consider.


Martinibuster - are you saying that user behavior isn't considered by Google? Wouldn't that be a key method for determining what is quality content vs. what is junk?
11:57 pm on Nov 5, 2015 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14871
votes: 478


...are you saying that user behavior isn't considered by Google? Wouldn't that be a key method for determining what is quality content vs. what is junk?


No. That's not what I'm saying. I apologize, I don't mean to be opaque.

User behavior and CTR are generally used for training the algorithm and for quality control. They are useful for determining user satisfaction with the SERPs (and for identifying how to improve relevance).

This is important to understand: traditional information retrieval technologies like PageRank don't address the issue of user intent. They look at links and web pages, not at people. That's where click log mining comes in: to understand what a user means when they type a phrase into a search box. There is a thing called the Rank Modifier. The Rank Modifier can use information such as your geographic area or past click log data to understand user intent and modify the SERPs to provide a more relevant response.
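To make that concrete, here is a toy sketch of the evaluation idea - my own illustration, with invented data shapes and position-bias numbers, not anything from Google's codebase: aggregate the click log per query, compare the top result's click share against what its position would normally earn, and flag queries where users consistently skip the top results. That flags a failed SERP for retraining; it doesn't move any page up or down.

```typescript
// Toy illustration: mine aggregated click logs to flag SERPs where users
// consistently skip the top results. All data shapes and the position-bias
// numbers are invented for this example.
interface ClickRecord {
  query: string;
  position: number; // 1-based rank of the result that was clicked
}

// Rough prior: the share of clicks each position tends to get on a good SERP.
const EXPECTED_SHARE: number[] = [0.30, 0.15, 0.10, 0.07, 0.05];

function flagSuspectQueries(log: ClickRecord[], minClicks = 100): string[] {
  const byQuery = new Map<string, number[]>();
  for (const rec of log) {
    const counts = byQuery.get(rec.query) ?? new Array(EXPECTED_SHARE.length).fill(0);
    if (rec.position <= EXPECTED_SHARE.length) counts[rec.position - 1]++;
    byQuery.set(rec.query, counts);
  }
  const suspect: string[] = [];
  for (const [query, counts] of byQuery) {
    const total = counts.reduce((a: number, b: number) => a + b, 0);
    if (total < minClicks) continue; // too little data; too noisy to judge
    // If the #1 result draws far fewer clicks than position bias predicts,
    // the SERP (not necessarily any single page) may be failing this intent.
    if (counts[0] / total < EXPECTED_SHARE[0] / 2) suspect.push(query);
  }
  return suspect;
}
```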

Just last month Gary Illyes was asked at the SMX East Meet the Search Engines panel if CTR is a ranking factor and his response was simply to roll his eyes. That was his answer. Why do you think he rolled his eyes?
3:21 pm on Nov 6, 2015 (gmt 0)

Full Member

10+ Year Member

joined:May 3, 2003
posts:278
votes: 22


Why do you think he rolled his eyes?


Hard to say. Arrogance? Contempt for the person asking the question?
3:46 pm on Nov 6, 2015 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14871
votes: 478


Hard to say.

Easy to say. :)

Didn't think I had to spell it out, but here it is: he was weary of answering that question yet again. How many times does it have to be explained before it is understood?

The CTR ranking factor rumor just won't die. Those who promote the idea are either cynically doing it for traffic, lack the capacity for critical evaluation, or are simply ignorant. Take your pick. One day we will laugh at the naivety, born of a lack of understanding, that gave rise to the notion that CTR directly influenced rankings and was a ranking factor. Read what I wrote above; it's the short version of a longer explanation.

My understanding of CTR's role in the SERPs is based on reading the scientific research. I don't depend on what Googlers say, because it's always lacking details (for good reason). That necessary lack of detail tends to create the impression that they're not telling the truth, and perhaps that's why these rumors get started. So I'm filling in those details for you, right here, right now, so that their statements make more sense. What I wrote above covers the bits the Googlers aren't telling you directly, only indirectly.

Earlier this year [searchengineland.com]:

...at SMX Advanced, Gary Illyes from Google confirmed that “Google uses clicks made in the search results in two different ways — for evaluation and for experimentation — but not for ranking.”

[edited by: aakk9999 at 5:27 pm (utc) on Nov 6, 2015]
[edit reason] Edited as per member request [/edit]

2:16 am on Nov 7, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8627
votes: 274


I thought John Mueller made some comment about this earlier too (though it might have been with respect to sharing on social media, which is sort of a similar thing - I keep reading how Google uses it and Google keeps saying "not really").

Does anyone recall? Did John Mu say that those factors were not direct factors, though they might be in the future?

However, they are useful metrics from the perspective of a site owner, in that they can tell you what is and isn't working with site visitors.

- High CTR + TOS indicates that your title entices clicks and the content satisfies visitors.
- A great number of social shares indicates your content is shareworthy.

Those are both things you might be able to use as proxy signals: if they are making visitors happy, they will attract links and other signals that make Google happy.

That said, I recently read an article where, looking over analytics with heatmapping, someone (Buffer? SumoMe?) found that the most-shared content was not the most-read content.

Most of these signals are really, really noisy.

That said, if they are shared a bunch and read a lot, there's a decent chance those things will have secondary effects.

You can optimize CTR (though not as easily as when you could actually see which keywords brought traffic to which page): go into Webmaster Tools, look for your highest-CTR phrases, and make sure to get those into your titles.

And you can game TOS in many ways, the most common being those stupid and annoying slideshows. "Ten reasons why slideshows suck. The tenth reason will AMAZE you." And then you load the intro, and each reason is 2-3 slides, and you have 10 minutes on site and, of course, get served a bunch of ad impressions. If you can manage bounce, you might even increase TOS by making the font harder to read (likely to increase abandonment, but it depends on how badly people need the info you have).

Between trying to increase ad impressions, pageviews, and TOS, some people are making the web unusable, and I think that, in the long run, is actually likely to hurt their rankings. I'm sure the increased ad revenue is worth it, and most of these places aren't depending on Google anyway - they are overwhelmingly using social shares and purchased traffic.

Anyway, I guess what I'm trying to say is that I see a couple of ways of increasing CTR and TOS. One is good, snappy headlines connected to good content. The other is crappy clickbait and annoying multi-page content. I'm very skeptical that method 2 leads to better rankings in the long run.
3:40 am on Nov 7, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:9037
votes: 752


Against the grain, another man's thought: where in all the G stuff does it say that ads are part of page rank/SERP rank? Ads are NOT content. Never have been. Bounce rate is based on content, not ads, though too many ads will scare folks off (or force the use of an ad blocker). If you are trying to rank your site by CTR, you are living in a pipe dream. Ad CTR does not equal PR. Just one man's thoughts. No eye-rolling involved.

CTR will increase if the Content Is King philosophy is entertained. (And that's hard work.)
6:53 pm on Nov 8, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8627
votes: 274


>>Bounce rate is based on content, not ads

Meaning that if the ads are reasonable, it's good/bad content that drives bounce?
7:00 pm on Nov 8, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:9037
votes: 752


If the eyeballs stay on the page for the content, there's a good chance the ads just might be seen. Sorry, didn't mean to make such a dogmatic statement.

A bounce is just that... and we don't always know why.
7:52 pm on Nov 8, 2015 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14871
votes: 478


...and we don't always know why.


We do know why. And it's not because content is king. Content is most definitely not king. Not everyone wants content. You think Facebook, Twitter, eBay, Spotify (yeah, really! :)), Macy's, ToysRUs are in the content business? You know what's really king? Scratch that. There are no kings. Just focus.

You know what explains the why? User experience. User experience explains the why.

User experience
I don't believe in kings or royalty. I'd rather be pragmatic. Focusing on user experience opens up a variety of opportunities for earning more money. Content is simply one of the ways to serve user experience on the way to making money. Focusing solely on content leads to a business myopia where it becomes difficult to understand how to monetize a site. User experience is integral to understanding what people want and how best to monetize that (or even whether it's worth monetizing).
10:10 pm on Nov 8, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:9037
votes: 752


Okay, I take back the apology. :)

Someone more dogmatic than I. (no smile attached)

User experience is based on... what? Clicking a button? Clicking an ad? Why would anyone do that if they weren't seeking some kind of content? Content can be anything, but not all content is for everyone, and focus merely means the content is not general but specific - hence the terms "niche," "silo," "vertical," etc. Bounce merely means the user doesn't stay around very long... and unless there is eye candy or mind candy for the user to linger over, no surprise there. Even less surprise if that "content" can be obtained, enjoyed, and found satisfying in 10 seconds or less. Without content, there is no user experience.
5:41 pm on Nov 9, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8627
votes: 274


We do know why.


OK "user experience" explains behavior, based on bounce rate alone we have no idea if it's a good user experience or a bad one. Imagine two pages:

1. The query is [where to buy groceries in Podunk]. I have the most amazing, complete and perfect page on buying groceries in Podunk. Once people see my amazing page, they have no need to read further so they don't. Bounce rate is very high.

2. The query is [where to buy groceries in Podunk]. My page is a piece of crap, plastered with ads, minuscule fonts and and a terrible UX, but I've managed to get it to rank and I manage to get traffic there. Bounce rate is very high.

Based on bounce rate, what have I learned?

Now, you can set up your analytics to only count a bounce if the user spends less than X seconds on the page. If you are tracking that way, and you know that a bounce is someone who was there two seconds and gone, then yes, bounce tells you that something is wrong with the page. Without some sort of additional metric beyond pure bounce, you still know almost nothing.
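For anyone who hasn't set that up: the usual "adjusted bounce rate" trick in Google Analytics is to fire an event after N seconds, so a visitor who reads one page for a while and leaves no longer counts as a bounce. A minimal sketch using the analytics.js event syntax (the 30-second threshold is arbitrary - tune it to your content):

```typescript
// Adjusted bounce rate: fire a GA event after 30 seconds so an engaged
// single-page visit is no longer counted as a bounce.
// Assumes the standard analytics.js snippet has already defined ga().
declare function ga(command: string, ...fields: unknown[]): void;

const ENGAGEMENT_THRESHOLD_MS = 30 * 1000; // arbitrary; pick what "engaged" means for you

window.setTimeout(() => {
  // An interaction event (GA's default) clears the bounce flag for the session.
  ga('send', 'event', 'engagement', 'time-on-page', '30s');
}, ENGAGEMENT_THRESHOLD_MS);
```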

Other metrics can help you understand it:
- time on page
- scroll percentage (see the sketch after this list)
- long bounces vs. short bounces
- goal tracking - if the visitor isn't going where you hope she'll go, then you have a problem, whatever the user experience on that page. But that could be true even if she looks at 200 pages, just none of them being your conversion pages.
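Scroll percentage is one of the easier ones to instrument yourself. A rough sketch (the 25% buckets are arbitrary): record the deepest scroll point reached and report it as a GA event when the visitor leaves.

```typescript
// Rough scroll-depth tracking: remember the deepest point reached, report it
// on unload. Assumes the same analytics.js ga() global as the snippet above.
declare function ga(command: string, ...fields: unknown[]): void;

let maxDepthPct = 0;

window.addEventListener('scroll', () => {
  const scrollable = document.documentElement.scrollHeight - window.innerHeight;
  if (scrollable <= 0) return; // page shorter than the viewport
  const pct = Math.round((window.scrollY / scrollable) * 100);
  if (pct > maxDepthPct) maxDepthPct = pct;
});

window.addEventListener('beforeunload', () => {
  const bucket = Math.floor(maxDepthPct / 25) * 25; // 0, 25, 50, 75, 100
  // nonInteraction keeps this report from skewing the bounce rate itself.
  ga('send', 'event', 'engagement', 'scroll-depth', bucket + '%',
     { nonInteraction: true, transport: 'beacon' });
});
```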

But if all you know is that someone came to your site, viewed one page, and left, you don't necessarily know why.

Now, Google is another matter. They have a piece of data I don't, and they could use it. They can see not just who bounces, but what people who bounce straight back to Google search on next.

So in example #1 above, the first search will be [where to buy groceries in Podunk] and the second search will be [directions to Podunk General Store].

But in example #2, the first search will be [where to buy groceries in Podunk] and the second search will be [where to buy groceries in Podunk] again.

I could see Google using that data to identify bad pages. I'm not saying they do, just that they have data points that allow them to use bounce rate in ways the typical site owner can't.
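To illustrate the distinction with a toy sketch (the data shapes are invented - nothing Google has published): compare the query a bouncing visitor runs next with the one that brought them in. The same query repeated suggests the result failed; a follow-on query like [directions to Podunk General Store] suggests it succeeded.

```typescript
// Toy sketch: classify a bounce by what the user searched for next.
interface BounceEvent {
  query: string;       // the query that led to the click
  clickedUrl: string;
  nextQuery?: string;  // what the user searched after bouncing back, if anything
}

type Verdict = 'likely-satisfied' | 'likely-failed' | 'unclear';

function classifyBounce(ev: BounceEvent): Verdict {
  if (!ev.nextQuery) return 'likely-satisfied';  // never came back to the SERP
  const before = ev.query.trim().toLowerCase();
  const after = ev.nextQuery.trim().toLowerCase();
  if (before === after) return 'likely-failed';  // same need, still unmet
  if (after.includes(before)) return 'unclear';  // a refinement; ambiguous
  return 'likely-satisfied';                     // moved on to a follow-up task
}

// Example #1 above: a follow-up task, so the page probably did its job.
classifyBounce({
  query: 'where to buy groceries in Podunk',
  clickedUrl: 'https://example.com/podunk-groceries',
  nextQuery: 'directions to Podunk General Store',
}); // -> 'likely-satisfied'
```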
5:46 pm on Nov 9, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8627
votes: 274


PS, I've been spending a fair bit of time lately doing heatmapping (both clicks and scrolls), first-click testing, tree map testing and some live user testing.

These "user experience" signals are strong when the problems are big and obvious. They are incredibly noisy otherwise which I think that explains the reticence to see them play too large a role in ranking algorithms.
6:09 pm on Nov 9, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8627
votes: 274


Gary Illyes, paraphrased by Jen Slegg (the original source for the article martinibuster quoted above):
He does say they see those who are trying to induce noise into the clicks and for this reason they know using those types of clicks for ranking would not be good. In other words, CTR would be too easily manipulated for it to be used for ranking purposes. -- [thesempost.com...]


Talking heads and tapping fingers around the web

Rand Fishkin presents an "inconsistent" test and, to be frank, what appears to be wishful thinking in his Whiteboard Friday asserting that CTR affects ranking (emphasis mine):

Maybe that the click-through rate is a signal to Google of, "Gosh, people are deeply interested in this. It's more interesting than the average result of that position. Let's move them up." This is something I've tested, that IMEC Labs have tested and seen results. At least when it's done with real searchers and enough of them to have an impact, you can kind of observe this. -- [moz.com...]


Not resounding proof. I'm not totally sure it's inconsistent with Gary's comments... I guess it would depend on time scales and how quickly that data feeds back in.

Bill Slawski, of course, reviews the patent evidence and is ambivalent:
[seobythesea.com...]

AJ Kohn has an extensive roundup and is ambivalent as well, but thinks click data is being used in some way, which again is not inconsistent with Gary's comments. There's a lot there, going back to comments from Marissa Mayer and random Google engineers who may or may not have any idea what they're talking about (I'll say they likely do NOT - unless an engineer says he specifically worked on a given problem, I regard all such comments as gossip only).

Google uses click data as an implicit form of feedback to re-rank and improve search results.
--http://www.blindfiveyearold.com/is-click-through-rate-a-ranking-signal


On the search quality side, Google has said that they do use click data. That would be similar to my example above and not exactly the same as ranking based on high CTR - more like filtering based on low CTR.
Via Danny Sullivan: [twitter.com...]

Meanwhile, over on the bounce rate side, Matt Cutts back in 2012 distinguished between GA bounce and other forms, and refused to comment on the other types:

Danny: "I know you have said that Google does not use Google Analytics bounce rate. What about bounce back to SERP and behavior on the return to SERP?"
Matt: "GA bounce rate is a very noisy signal. We never use it." (He spends about 5 minutes not answering the original question - the question was about bounce to SERP, not Google Analytics bounce rate.)
Danny: "What about bounce to the SERP?"
Matt: "We do not use GA data; it's a very noisy signal."
Danny: "No, I am talking about the SERP and bounce to SERP."
Matt: "Well, ummmm... we don't like to rule certain things out for the future... ummm... ummm."
-- [afterpanda.com...]
6:17 pm on Nov 9, 2015 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14871
votes: 478


I could see Google using that data to identify bad pages.


Yes, exactly. :)

And even more to the point, to identify bad SERPs. To my understanding, the point of CTR click log mining is to help the machine learn user intent and see how the ranking or scoring engine may have failed. The click log data is a rich source of information that can be used to improve the algorithm in a scaled manner. Google's all about scale.

Why bother solving individual SERP problems when it's not the individual SERP that's the problem, but the algorithm itself when confronted with specific kinds of queries? It's silly to adjust the SERPs one query at a time when the problem isn't just the one query, but the underlying circumstances that produce a great many failed SERPs.

That would be like fixing a failing house one nail at a time while building more houses with the same faulty nails, then fixing those houses one nail at a time. Doesn't it make more sense to identify the underlying reason for the failure (the faulty nails) than to go on fixing cracked sheetrock here and misaligned edges there, over and over?

The scenario ergophobe mentioned, wherein the click log mining checks whether the user returns to the SERPs to click on another link, is a means of identifying whether the process that created that SERP is flawed and needs to be modified - particularly if it's happening repeatedly across a range of similar queries. That's quality control.
7:41 pm on Nov 11, 2015 (gmt 0)

New User

joined:Mar 8, 2015
posts: 11
votes: 3


@martinibuster

Nearly, but not quite. The issue is much more complex.

It comes down to forced disambiguation by Google. I have said the only democratic route for search is the original dumb index with optimizations. If you have a query with intent and G serves up a page of results, that is fine. The user then (let's say) clicks through all ten results, and there are differing times of return to the SERPs. Then this scenario would judge the page quality FOR THAT SEARCHER. Now, if the intention of the searcher was slightly askew from 10,000 others asking mainly the same thing, but differently - [widget blue large style] vs. [widget blue] - the results are going to change, but so might the intent: the searcher with the shorter query might be less sure of what they want. Once you get into the nuances of meaning in a language - IOW, when the search term used has a triple meaning - it adds another dimension of permutations and combinations.

The result is Google's need to create their own filter funnel to control the boundaries of endless answers.

Therefore, it's not quality control - it's Google control. The AI is the tail wagging the dog. Google has no choice but to work on confining and corralling a subset, rather than sampling the great outside world. The sites it displays fall into an ever-smaller defined mould, rather than being chosen for virtue.
9:19 pm on Nov 11, 2015 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14871
votes: 478


I am aware of that scenario, and so is the science of IR. The classic question cited in research is: what do users mean when they type the word "jaguar"? The car, the sports team, or the animal? Multiple meanings have been studied for well over ten years. That's been done and understood.

Then this scenario would judge the page quality FOR THAT SEARCHER.


The majority wins. When there are multiple meanings, they show all of them in the SERPs. Type "jaguar" into the search box and look at the results.
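One toy way to picture "show all of them" - my own illustration, not a published Google method: allocate the ten SERP slots across the known meanings roughly in proportion to how often each meaning wins in the click logs, while guaranteeing every meaning at least one slot.

```typescript
// Toy sketch: spread 10 SERP slots across the meanings of an ambiguous query
// in proportion to (invented) historical click share.
const intentShare: Record<string, number> = {
  'jaguar (car)': 0.55,
  'jaguar (animal)': 0.30,
  'jaguar (sports team)': 0.15,
};

function allocateSlots(shares: Record<string, number>, slots = 10): Record<string, number> {
  const alloc: Record<string, number> = {};
  let used = 0;
  for (const [intent, share] of Object.entries(shares)) {
    alloc[intent] = Math.max(1, Math.round(share * slots)); // every meaning gets a slot
    used += alloc[intent];
  }
  // Rounding can overshoot; trim the largest bucket until the total fits.
  while (used > slots) {
    const biggest = Object.keys(alloc).reduce((a, b) => (alloc[a] >= alloc[b] ? a : b));
    alloc[biggest]--;
    used--;
  }
  return alloc;
}

allocateSlots(intentShare);
// -> { 'jaguar (car)': 5, 'jaguar (animal)': 3, 'jaguar (sports team)': 2 }
```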

I'm not saying Google is perfect. Far from it. What I AM saying is that CTR and other user behavior metrics are not used in real time to update the SERPs. They are used in the manner I've already described. I am aware of a recent article on this topic that cites a Google patent. However, the patent they cite does not support their opinion of it. In fact, several paragraphs further down from what is quoted, the patent mentions that the process for data mining CTR is used to update a FUTURE SERP, not the SERP in real time.
9:36 pm on Nov 11, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8627
votes: 274


not used in real-time to update the SERPs. They are used in the manner I've already described.


At the end of the day, though, what's the difference if I'm thinking about how my site will rank one year from now?
2:10 pm on Nov 15, 2015 (gmt 0)

Junior Member from US 

10+ Year Member

joined:Oct 31, 2005
posts: 196
votes: 11


Martinibuster: can you explain to me why the supposed 'fix' for Panda-affected sites is to cull low-quality, 'bad' pages, i.e. pages that exhibit poor user metrics? This is something that Glenn Gabe (and others) have suggested doing to 'get out of Panda'.

They specifically say: find pages on your site that have poor user metrics and get rid of them by noindexing/301ing/404ing. Then, with the remaining 'good' user-behavior pages, you'll slowly see traffic increase because your Panda demotion has been lifted.

If user behavior (CTR, bounce rate, TOS, pageviews, etc.) isn't used in rankings, why is this advice so widely given out and accepted?

Are you saying that Panda is not at all about user experience? Will causing your site visitors to stay on your site longer not help your rankings, or not positively affect your 'Panda score'?

I just want to get a handle on this because my strategy has been exactly what I outlined above, but it has been to no avail, so it seems I am not fully understanding things.
3:12 pm on Nov 15, 2015 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14871
votes: 478


Martinibuster: can you explain to me why the supposed 'fix' for Panda-affected sites is to cull low-quality, 'bad' pages, i.e. pages that exhibit poor user metrics?


How about you tell me. Answer this:

Do you have knowledge of the many kinds of signals Panda would be looking for?

Are you aware of all the different things CTR can mean? (Hint: a short dwell time can mean user satisfaction.)

Do the people you follow cite research or patents that show that Panda is explicitly tied to CTR performance?

Even if Panda were related to CTR, how would an SEO know the CTR performance of a web page, since that information is hidden in Google's click logs?

CTR data is only useful when it's combined with the activity of the user AFTER they leave a web page. That activity may indicate that the algorithm itself, not the web page, is the problem. The user may have typed an ambiguous query, then returned and typed a more specific one. In that instance - and this happens a lot - there's nothing wrong with the page with the high bounce rate; the problem was the algorithm, for not being a mind reader. So if that page's publisher sees the high bounce rate, do you think it's good for that publisher to remove the page?

What does a high bounce rate really mean? It can mean a lot of things, some of which can be fixed. Some of it, like crap cookie-cutter content, can't be fixed.

But the engines don't need CTR to identify cookie-cutter content. There are other ways to find it. In October, at an SEO meetup in Boston, I mentioned in passing how search engines find it, and only one person in the room nodded to acknowledge they'd heard of the technique. The funny thing is that the search engines have been applying it for over ten years, and the SEO community doesn't even know about it. It's not in any SEO diagnostic tool.

Beyond that, it is well known that most signals are noisy. That's why they work best when compared with other signals. Relying on a single signal is dubious.
8:17 pm on Nov 15, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:June 6, 2006
posts:1191
votes: 41



But the engines don't need CTR to identify cookie-cutter content. There are other ways to find it. In October, at an SEO meetup in Boston, I mentioned in passing how search engines find it, and only one person in the room nodded to acknowledge they'd heard of the technique.


Care to expand on what that method is?
2:00 pm on Nov 16, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 16, 2009
posts:1082
votes: 79


not used in real-time to update the SERPs. They are used in the manner I've already described.

At the end of the day, though, what's the difference if I'm thinking about how my site will rank one year from now?

None, assuming that:
(a) first, Google correctly divines what it is that people like about pages that both get more click-throughs AND give user satisfaction;
(b) next, Google is able to divine a way to measure and score these factors without using click data within the algorithm;
(c) finally, an algorithm using these factors works across all types of searches.

I'm not a search engineer, but to me that seems like a lot more work, with a lot more things that could go wrong, than detecting and filtering bots and click farms.

Having said that, if they can do it, it IS a fairer system, because it would allow new arrivals to break their way into a SERP if they possess the same qualities as the sites already ranking.
5:24 pm on Nov 16, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8627
votes: 274


None, assuming that:


Actually, all these assumptions are the same whether the data is used to teach the algo or to affect search results in real time. The only difference, ultimately, is time horizon.

Martinibuster.... thoughts?

>>lot more work

We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one which we intend to win, and the others, too.


>> a lot more things that could go wrong

I think we agree with each other and martinibuster there. These are noisy signals. Is Google good enough at sorting the signal from the noise? There's still more signal than noise surrounding that question :-)
5:35 pm on Nov 16, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 16, 2009
posts:1082
votes: 79


all these assumptions are the same whether the data is used to teach the algo or to affect search results in real time

I can't agree.

Using click data in real time, to my mind, means you allow the actual user interaction to determine the sites that rise or fall. There is no 'learning' beyond things like:
- what results get clicked; and
- whether someone returns to the SERP or not.

These are the two most basic things I can think of but also the most important.

Using that data to (a) give you a seed set of sites that you then (b) analyse using other methods to (c) develop a different scoring method that aims to reward those sites in other ways is completely different, surely - or am I missing something?
12:50 am on Nov 17, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8627
votes: 274


No, I don't think you're missing something. I think you're right about that.

I would say I meant my comment on a more general level - if the signal is irretrievably noisy, you can't do anything. But I think you're right that real-time ranking feedback would require a much cleaner signal than using that data to gain insights that feed back into refining your algo.
12:38 am on Nov 22, 2015 (gmt 0)

Junior Member

joined:Jan 13, 2014
posts:115
votes: 23


UI data, I think, is over-emphasised, and its meaning very much depends on the search term itself. For instance, if I search [widgets], then the time spent on a site could be an indicator of the quality of the site; however, if I search [brandname widgets], my time on the site would reasonably be expected to be longer, as I had already chosen the website and was searching Google only for results from that website. But here's the thing: Google has pretty much said they are using the [brandname widgets] UI data to determine the results for [widgets]. This, I believe, is stifling competition and leaving the results stagnant. Quite simply, they are measuring the wrong UI signals, in my opinion.
3:53 am on Nov 22, 2015 (gmt 0)

Junior Member from AU 

5+ Year Member Top Contributors Of The Month

joined:Oct 28, 2012
posts: 98
votes: 30


As martinibuster says, CTR alone is way too noisy as a signal of quality or relevance - there are many, many research papers by Google, Bing, and Yahoo engineers on this topic. Using CTR and dwell time in combination has been shown (in those papers) to be far more accurate in judging both. The research papers mention this data being used to train algorithms.

Do Google use CTR + dwell time in their algorithm? They deny using it for ranking directly, but training their algorithm on what good websites "behave" like is most likely. Will improving CTR + dwell time hurt? No - these are great elements to focus on to improve user engagement with your website and increase sales!
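The combination those papers describe is often summarised as a "satisfied click": a click that is followed by a long enough dwell time. A minimal sketch of the idea (the 30-second cutoff is a figure commonly cited in the research; the data shapes here are invented):

```typescript
// Toy sketch of the "satisfied click" idea: a click only counts as a success
// if the visitor stayed long enough before returning to the SERP.
interface Click {
  url: string;
  dwellSeconds: number; // time before returning to the SERP (Infinity if never)
}

const SAT_DWELL_SECONDS = 30; // commonly cited research threshold

function satisfactionRate(clicks: Click[]): number {
  if (clicks.length === 0) return 0;
  const satisfied = clicks.filter((c) => c.dwellSeconds >= SAT_DWELL_SECONDS);
  return satisfied.length / clicks.length;
}

// Raw CTR would score these two clicks identically; dwell time separates them.
satisfactionRate([
  { url: '/widgets-real-vs-fake', dwellSeconds: 95 },
  { url: '/widgets-real-vs-fake', dwellSeconds: 4 },
]); // -> 0.5
```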

Not sure why there is so much angst over something that seems so obvious: work to improve your CTR + dwell time anyway. Does it really matter whether they use it, if you improve user engagement, sales, TOS, etc.? And if it helps you rank better - even better.
4:50 am on Nov 23, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2006
posts:1574
votes: 119


It's pretty well accepted that how a user interacts with your page will impact how Google ranks it in the future.

"Interacts" is a broad term, or at least not the best one in this case, IMHO. There is no way for G or other SEs to know how people interact with your site except via the click on the back button and return to the results - something more difficult these days, when people open multiple tabs. The other way could be via JS code and click analysis in GA or others; several stats tools can "map" (or so they say) where users go after entering your website. While this is possible, I don't think it carries much weight for ranking at this point.

Lots of bounces could lead to lower rankings.

There are many factors, but I think what you say is valid.

If I have one page on a site that receives a lot of traffic, but nearly all of that traffic bounces, does that have the potential to impact the entire domain, or just that one page?

This part is very important. Traffic means visitors, but page X could be an entry page, and usually you don't want an entry page causing bounces; you want people staying. You could also have traffic on that page via internal clicks, not as an entry page. Going back in time, there were good threads here on WebmasterWorld about ads being exit points: it's not bouncing, but it also means losing that visitor to somewhere else. USUALLY this matters in different ways depending on your objectives - and no, wanting as many visitors as possible to click on ads is not OK if it means they just go away; whether they buy or don't buy is another discussion.

I've seen more and more people wanting others to click on ads, even "demanding" that to be the case. Being more conservative (I guess), and recalling older threads and my own experience, it's not OK to push that hard in that direction; it matters to keep people in.

The page is about how to tell a real widget from a fake one. By its nature, people will search, get their answer, and leave. We sell widgets, but this traffic is shoppers who already have a widget in hand and want to find out whether it's fake or not. Should I just get rid of the page?

That depends more on your goals and strategy; it's more a matter of personal opinion.