Forum Moderators: Robert Charlton & goodroi

Will more clicks get me better SERPs?

         

Tearabite

11:40 pm on Mar 16, 2006 (gmt 0)

10+ Year Member



For some reason I've been getting a lot of traffic from a few particular keyword combinations, even though I'm listed between 20 and 40 in the SERPs for those search phrases. This is a VERY competitive area, and I'm surprised I'm getting anything at all.

If lots of people click on MY link in the search results, will this tell Google that MY site must be good, and increase my spot in the results?

Stefan

3:28 am on Mar 17, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If lots of people click on MY link in the search results, will this tell Google that MY site must be good, and increase my spot in the results?

I've seen it suggested in the past that users with the G toolbar installed, who click through to your site, might influence G, but it's pure speculation. Other than the toolbar feedback, it shouldn't have any effect (as far as I know, etc., and if others know differently, please correct me).

tedster

4:05 am on Mar 17, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There certainly is conjecture (without official confirmation) around this topic of traffic's effect on the SERPs. Some feel that traffic is one of the factors that will help end the sandbox effect on a new site. And the toolbar would be only one potential source of data. Google is involved in a lot of Wi-Fi service, for example, and some kinds of traffic data can also be purchased, or even seen directly on Alexa.

And the obvious point, as mentioned in the opening post: they certainly do measure clicks on search results from time to time. They need to in order to gauge user satisfaction with their search results, and then they have this data lying around. I'm sure it can be mined for more than one purpose.

[edited by: tedster at 4:15 am (utc) on Mar. 17, 2006]
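The mining tedster alludes to could be as simple as comparing a listing's observed click-through rate against whatever is typical for its position: a result sitting at #25 that keeps drawing clicks is, on that reading, outperforming its rank. A toy sketch of the idea in TypeScript; the baseline figures, field names, and numbers are all invented for illustration:

```typescript
// Hypothetical average CTR by result position (figures invented).
const baselineCtr: Record<number, number> = { 1: 0.30, 2: 0.15, 3: 0.10, 25: 0.005 };

interface ResultStats {
  url: string;
  position: number;
  impressions: number;
  clicks: number;
}

// Ratio > 1 means the listing attracts more clicks than its position predicts.
function ctrLift(r: ResultStats): number {
  const expected = baselineCtr[r.position] ?? 0.01; // fallback for unlisted positions
  return r.clicks / r.impressions / expected;
}

const deepResult: ResultStats = { url: "example.com/widgets", position: 25, impressions: 2000, clicks: 40 };
console.log(ctrLift(deepResult).toFixed(1)); // "4.0": four times the position baseline
```

Whether Google computes anything like this is, of course, exactly the speculation this thread is about.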

minnapple

4:14 am on Mar 17, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Déjà vu. We used to discuss this same topic 5+ years ago, obviously before Google's toolbar.

This does open up a very interesting question: what data does the toolbar feed into Google's database, how do they use it today, and how might they use it later?

otech

5:44 am on Mar 17, 2006 (gmt 0)

10+ Year Member



You only need to use Sitemaps to know they definitely record how many times you are shown in the SERPs for each keyword, and how many clicks you get versus how many times you are shown for each keyword. If they capture and store that information, it's most likely, in my opinion, that they use it too.

Just like every part in a car has its purpose, I would believe every piece of data they collect gets blended into the rankings; of course, how much is just speculation ;-)
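As a rough picture of the kind of per-keyword data otech is describing, computing click-through rate from impression and click counts is trivial. A minimal sketch; the field names and numbers are invented, not anything Sitemaps actually exposes in this form:

```typescript
// Hypothetical per-keyword stats: how often a page was shown for a query
// versus how often it was clicked.
interface KeywordStats {
  keyword: string;
  impressions: number;
  clicks: number;
}

// Example figures, entirely made up for illustration.
const stats: KeywordStats[] = [
  { keyword: "blue widgets", impressions: 1200, clicks: 18 },
  { keyword: "widget repair", impressions: 300, clicks: 27 },
  { keyword: "widgets", impressions: 5000, clicks: 15 },
];

// CTR = clicks / impressions; sort to see which listings "earn" their showings.
const byCtr = stats
  .map((s) => ({ ...s, ctr: s.clicks / s.impressions }))
  .sort((a, b) => b.ctr - a.ctr);

for (const s of byCtr) {
  console.log(`${s.keyword}: ${(s.ctr * 100).toFixed(1)}% CTR`);
}
```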

PhraSEOlogy

5:59 am on Mar 17, 2006 (gmt 0)

10+ Year Member



If this is true (improving rank by click-through), then what is to stop me from using anonymous proxy servers and clicking on my own links all day long?

Tearabite

6:40 am on Mar 17, 2006 (gmt 0)

10+ Year Member



What might stop you is the amount of time it would take to connect to each one of those proxies, hit Google, click, move on to the next, and start over.

forzatio

7:04 am on Mar 17, 2006 (gmt 0)

10+ Year Member



I asked this question on another forum some time ago; in the end, most thought that the SERPs would never change, i.e. that a site would not get ranked higher because of its clicks.

PhraSEOlogy

7:08 am on Mar 17, 2006 (gmt 0)

10+ Year Member



What might stop you is the amount of time it would take to connect to each one of those proxies

What if I use a Perl script to grab a list of available proxies and pass them to a bot to do the dirty work?

tedster

7:14 am on Mar 17, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



With the technology Google has in place to detect fraudulent AdWords clicks, I'll bet they can filter out the bulk of automated or fraudulent SERP clicks too.
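tedster's point is easy to picture: even a crude pass over click logs catches naive automation. A minimal sketch of that kind of filter, with invented log fields and thresholds; real fraud detection is obviously far more involved:

```typescript
// One click event as it might appear in a (hypothetical) SERP click log.
interface ClickEvent {
  ip: string;
  query: string;
  timestamp: number; // seconds since epoch
}

// Flag IPs whose clicks are suspiciously frequent or suspiciously regular.
function suspiciousIps(events: ClickEvent[], maxClicksPerHour = 30): Set<string> {
  const byIp = new Map<string, number[]>();
  for (const e of events) {
    const times = byIp.get(e.ip) ?? [];
    times.push(e.timestamp);
    byIp.set(e.ip, times);
  }

  const flagged = new Set<string>();
  for (const [ip, times] of byIp) {
    times.sort((a, b) => a - b);
    const spanHours = Math.max((times[times.length - 1] - times[0]) / 3600, 1 / 60);
    if (times.length / spanHours > maxClicksPerHour) flagged.add(ip); // too many clicks

    // Clicks arriving at near-identical intervals look scripted.
    const gaps = times.slice(1).map((t, i) => t - times[i]);
    if (gaps.length >= 5 && new Set(gaps).size <= 2) flagged.add(ip);
  }
  return flagged;
}
```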

PhraSEOlogy

7:16 am on Mar 17, 2006 (gmt 0)

10+ Year Member



Damn, all that hard work gone to waste!

<added>Perhaps their technology is not that good - Google Agrees To Pay $90 Million In Click Fraud Suit</added>

ronburk

7:54 am on Mar 17, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



will this tell Google that MY site must be good

I personally believe this is true in my experience, but I cannot figure out an experiment that will eliminate enough confounding factors to prove it. Google just has too many algorithm factors that I cannot control or eliminate.

  • I believe there's a non-trivial time lag between the start of people "reaching down" in the SERPs and getting the rewarding boost. More than a few days.
  • I suspect it's not just users "reaching down", but may also rely on the surrounding behavior. For example, I suspect that someone who clicks on one or two SERPs above me, is immediately dissatisfied, and only eventually locates me (and then doesn't go back and search further for the same term) is best for me.
  • People mention Google getting info from the toolbar, but of course they can always see whether the same IP address came back to search for the same term during a short time period, and which page(s) of the SERPs it was looking at. AFAIK, they also still periodically sample actual click info by inserting JavaScript intercepts (see the sketch after this post).
  • When someone is complaining that website X is beating them in the SERPs despite being newer, lower PR, etc., I always take a hard look at the interloper's SERP listing to see if it's possible that it's way more compelling for some non-trivial subset of searchers.
  • I tend to view taking advantage of this effect as much the same game as writing good AdWords copy so that you pay less per-click than your competitors. When I'm trying to get a term from page X to page 1, I study what people are actually probably searching for, study what they're being offered on page 1, and look for what I can provide in my SERPs listing that isn't being offered already. Then it's down to basic SEO tweaks to convince Googlebot to show the text I want in the SERPs listing.
  • Keep in mind that as the number of websites competing for a term grows, Google still has the goal of making sure you find what you want on page 1. That means, it's in their interest to vary the semantic content of the page 1 SERPs. For example, are you trying to get on page 1 for "widgets"? If page 1 is dominated by "red widgets", "blue widgets", and "yellow widgets", then I look for the semantically related topic that people are likely to look for that isn't already on page 1 (or is on page 1, but coming from a page with easily beatable other SEO factors). Maybe it's "widget repair" or "widget restoration" or "widget swap meet" or "widget magazine" or "widget tutorial". It's in Google's best interests to get semantically varied listings on page 1 in many situations, to increase the odds that people searching for "widgets" for two different reasons will still have a good shot of both finding what they want on page 1.

Of course, I could be totally fooling myself and wasting my time :-).
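For what ronburk calls "JavaScript intercepts" in the third bullet above, the general pattern is visible to anyone who views source on a results page: attach a handler so the click is reported before the browser navigates away. A stripped-down sketch of that pattern in browser TypeScript; the selector and the /log endpoint are made up for this example:

```typescript
// Attach a mousedown handler to each result link so the click can be
// reported before navigation happens. "a.result-link" and "/log" are
// hypothetical names used only for this sketch.
function instrumentResultLinks(): void {
  const links = document.querySelectorAll<HTMLAnchorElement>("a.result-link");
  links.forEach((link, position) => {
    link.addEventListener("mousedown", () => {
      const payload = JSON.stringify({
        href: link.href,
        position,          // rank of the result that was clicked
        ts: Date.now(),
      });
      // sendBeacon survives the page unload that follows the click.
      navigator.sendBeacon("/log", payload);
    });
  });
}

instrumentResultLinks();
```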

ronburk

8:04 am on Mar 17, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What if I use a perl script to grab a list of available proxies and pass them to a bot to do the dirty work?

Probably doable. Just make sure you sample actual user searching behavior, emulate it perfectly (with some random variation), and that your TCP/IP packet signatures are identical in every detail (bitwise construction and timing) to those of one of the major browsers.

And you might need to know what the existing search patterns for your search term already are, so you only introduce gradual changes and don't trip any "hey, look at how all of a sudden way more searches for X are going on than last week, despite no concomitant increase in closely related terms" alarms.

All Google has on their side is a team of PhDs and hacker/programmers and all the data -- you just have to keep up with them.

tedster

8:05 am on Mar 17, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's in Google's best interests to get semantically varied listings on page 1

Thanks for that insight - I really like it a lot.
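One hypothetical way to act on that insight is simply to tally which angles already dominate the page-1 titles for a head term and which candidate angles are missing. A toy sketch; the titles and candidate modifiers are invented, following ronburk's widget example:

```typescript
// Titles currently on page 1 for the head term (invented examples).
const pageOneTitles = [
  "Red Widgets - Buy Online",
  "Blue Widgets for Sale",
  "Yellow Widgets Superstore",
];

// Semantically related angles a searcher might really be after.
const candidateAngles = ["repair", "restoration", "swap meet", "magazine", "tutorial"];

// An angle no page-1 title mentions may be an underserved reason people
// search the head term, and a gap a differently worded listing could fill.
const uncovered = candidateAngles.filter(
  (angle) => !pageOneTitles.some((title) => title.toLowerCase().includes(angle))
);

console.log(uncovered); // ["repair", "restoration", "swap meet", "magazine", "tutorial"]
```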

PhraSEOlogy

8:16 am on Mar 17, 2006 (gmt 0)

10+ Year Member



ronburk,

You make some good points there. Ah well, back to the drawing board.

alphacooler

5:09 pm on Mar 17, 2006 (gmt 0)

10+ Year Member



The question is: why wouldn't they use this information? It is incredibly valuable for discerning the quality and relevance of sites.

ZoltanTheBold

5:19 pm on Mar 17, 2006 (gmt 0)

10+ Year Member



It seems sensible for them to use any and all data they can get their hands on, including click-through rates, click-through behaviors, use of Google stats, etc.

As ronburk pointed out, the real deal is probably searcher behavior, of which click-through rate is only one factor. I have no doubt there are algorithms that try to analyse what a user is doing (click result number 1, then come back, click your listing, then not come back, etc.).

Therefore, perhaps click-through rate is important in the same way keywords are: it's not enough just to have them; you have to use them wisely, and context affects their weighting and importance.
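The behavior ZoltanTheBold describes (click result 1, come back, click another listing, not come back) is easy to state as a toy classifier, which is roughly what people mean by measuring "long clicks" versus pogo-sticking. A minimal sketch over a hypothetical session log; the event shape and labels are invented for illustration:

```typescript
// One search session: the ordered positions clicked, plus whether the
// searcher came back to the results page after the final click.
interface SearchSession {
  query: string;
  clickedPositions: number[]; // e.g. [1, 7] = clicked #1, returned, clicked #7
  returnedAfterLastClick: boolean;
}

type Verdict = "satisfied" | "pogo-sticking" | "abandoned";

// A final click with no return suggests the searcher found what they wanted;
// no clicks at all suggests the whole page missed the mark.
function classify(session: SearchSession): Verdict {
  if (session.clickedPositions.length === 0) return "abandoned";
  if (!session.returnedAfterLastClick) return "satisfied";
  return "pogo-sticking";
}

const example: SearchSession = {
  query: "widgets",
  clickedPositions: [1, 2, 7],
  returnedAfterLastClick: false,
};
console.log(classify(example)); // "satisfied": the result at #7 kept them
```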