
Google News Archive Forum

    
Approximate click thrus from Google SERPs?
Trying to project visitors to site based on ranking results
erica2588
msg:76323 · 1:05 am on Dec 12, 2003 (gmt 0)

Is there any VERY ROUGH guideline as to the approximate click-through rates from page 1, page 2, and page 3 of Google results? I know it would vary hugely based on many factors (keyword, industry, overall relevance of the results to the user's query, etc.). I've been digging around here for a while and have run a few queries elsewhere, but haven't found anything.

Just looking for rules of thumb on click rates from results pages (regardless of industry, relevance, etc.), if they even exist. Thanks for any info or insight!

 

AjiNIMC
msg:76324 · 2:08 am on Dec 12, 2003 (gmt 0)

Hi erica2588,

No one is 100% sure, but everyone can guess at what might be going on. As far as I can tell, clicks through the SERPs obviously carry some weight.

Say you are on page 1 of the SERPs: a click there carries less weight than a click on a page-2 result. How long a user stays on the page also matters; it helps gauge how useful the result was.

But unfortunately no one can measure this. The sure levers are title, keyword density, anchor text, etc.

Aji

rfgdxm1
msg:76325 · 2:14 am on Dec 12, 2003 (gmt 0)

My guess is that most people never go to page 2 unless they find the page 1 SERPs quite unsatisfactory. Another exception may be price shoppers: they'll check some of the sites listed beyond page 1 to see if they have lower prices.

hobbnet
msg:76326 · 2:20 am on Dec 12, 2003 (gmt 0)

Many PPC engines publish some sort of statement about CTRs versus listing position on a page.

I remember reading that the first position gets three times as many clicks as the second (I don't know how accurate that is). But if it IS the case, imagine the difference in click volume when you go from page one to page two.
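Purely as an illustration (my own extrapolation, not a claim from the thread): if the three-to-one ratio mentioned above held between *every* adjacent position, not just the first two, the page-one-to-page-two drop would be enormous. A quick sketch:

```python
# Illustrative sketch (my own assumption, not from the thread): suppose each
# result position received a constant fraction r of the clicks of the
# position above it. The post only cites r ~ 1/3 for position 1 vs 2;
# extending that ratio down the whole page is an assumption for this sketch.
def page_click_volume(r, page, per_page=10):
    """Total relative click volume for one 10-result page (page 1 is first)."""
    start = (page - 1) * per_page
    return sum(r ** pos for pos in range(start, start + per_page))

r = 1 / 3
ratio = page_click_volume(r, 2) / page_click_volume(r, 1)
print(f"page 2 vs page 1 click volume: {ratio:.2e}")  # 1.69e-05
```

A constant 3x decay is clearly too aggressive for real SERPs, but the point stands: even a much gentler per-position drop compounds into a large page-to-page difference.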

apfinlaw
msg:76327 · 2:48 am on Dec 12, 2003 (gmt 0)

There is a rule of 50%, wherein each page gets 50% of the visitors of the previous one, e.g.:

Page 1: 100% viewing
Page 2: 50% viewing
Page 3: 25% viewing
Page 4: 12.5%
Page 5: 6.25%
Page 6: 3.125% click-through

Again, this is just a rough estimate.
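The 50% rule above is simple enough to sketch (assuming exactly 50% attrition per page, which the poster already flags as a rough estimate):

```python
# Minimal sketch of the 50% rule: each results page is viewed by half as
# many searchers as the page before it (a rough rule of thumb, not data).
def viewers_by_page(pages):
    """Percentage of searchers viewing each page, starting at 100%."""
    return [100 / 2 ** (p - 1) for p in range(1, pages + 1)]

print(viewers_by_page(6))  # [100.0, 50.0, 25.0, 12.5, 6.25, 3.125]
```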

erica2588
msg:76328 · 7:20 pm on Dec 12, 2003 (gmt 0)

Thanks all for the insights!

AjiNIMC
msg:76329 · 7:44 pm on Dec 12, 2003 (gmt 0)

Hi erica2588,

Welcome to the world where people are trying to find certainty in uncertainty

Aji

hobbnet
msg:76330 · 7:44 pm on Dec 12, 2003 (gmt 0)

e.g. Page 1 100% Viewing
Page 2 50% Viewing

I would be really surprised if this were true, but would be very happy if it were proved/confirmed.

Bompa
msg:76331 · 8:19 pm on Dec 12, 2003 (gmt 0)

By running an AdWords campaign with a very high daily budget and click thru bid, I kept my ad on the first page for my keywords "mail order brides" for one week. I then dropped my bid drastically, which caused my ad to appear on the second page of results for a week.

Because my daily budget was high, my ad was displayed with every search.

My ad averaged 2100 impressions/day (searches) while on the first page and 1300 impressions/day while on the second page.

So, during this particular test period and for *my* keywords, two thirds of the searchers (for whatever reason) did not go beyond the first page.

I offer this not as proof, but only as evidence for the common marketing statement that "60-70% of searchers find what they are looking for on the first page of results".

Bompa

eraldemukian
msg:76332 · 8:41 pm on Dec 12, 2003 (gmt 0)

This is an interesting question.
I would guess it depends greatly on how interested somebody is in the topic. If I need to fix something, and I am confident the answer is out there somewhere, I look at more SERPs than when I am casually looking something up.

I asked my computer, and it came back to me after 19 minutes and said the answer would be forty-two.

Seriously: it's an interesting question. I ran a quick and dirty analysis on a log file of mine. I matched the referrer against www.google.com and looked for the start= and num= parameters. When there is no num parameter, I assume a SERP size of 10. I assume (falsely) that the referring link sits in the middle of its page, so a link from the first page (positions 1-10) ends up in the 10 bucket, which is probably not strictly accurate. My site is non-competitive, with lots of 'left-over traffic'; that may also skew the data.

Here is the data, based on the last 505559 referrers that matched www.google.com:

Position bucket (max result number), percentage:

10 76.7
20 11.64
30 4.58
40 2.15
50 1.16
60 0.63
70 0.39
80 0.56
90 0.24
100 0.3
110 0.13
120 0.09
130 0.08
140 0.06
150 0.65
160 0.04
170 0.04
180 0.04
190 0.05
200 0.03
210 0.02
220 0.02
230 0.02
240 0.01
250 0.02
260 0.01
270 0.01
280 0.01
290 0.01
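To make the table easier to read, here is the cumulative reach implied by the posted percentages for the first six buckets (simple arithmetic on the numbers above; no new data):

```python
# Cumulative share of Google referrals reaching the first N position buckets,
# computed from the percentages in the table above (first six buckets only).
per_page = [76.7, 11.64, 4.58, 2.15, 1.16, 0.63]  # buckets 10..60
cumulative, total = [], 0.0
for share in per_page:
    total += share
    cumulative.append(round(total, 2))
print(cumulative)  # [76.7, 88.34, 92.92, 95.07, 96.23, 96.86]
```

In other words, on this site roughly 77% of Google referrals came from page one and about 88% from the first two pages, consistent with the other estimates in this thread.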

As an aside:
I also noted how many people changed their SERP result size. Not surprisingly, the bulk run with the default of 10.
The deeper a search goes, the more time, and therefore CPU time, and therefore operating cost it takes Google. I think Google would save thousands of dollars if they changed the default number of results returned from 10 to 8. Roughly interpolating the curve we see here, it looks like only very, very few people click on results 9 and 10. Nobody would miss them, and one nuclear power plant could go offline. GoogleGuy: please send me the check, or maybe just +2 my PR. Hey, maybe just give me the AdSense money Google kept after I got kicked out of the program.

Size of result page:
Page-num Percentage:
10 98.82
20 0.18
30 0.13
50 0.23
100 0.6

Here is the Perl code (UPPERCASE parts you might need to edit):

#!/PATH_TO_YOUR_PERL

# Tally Google-referred hits by SERP position (start=) and result-page
# size (num=) from a combined-format Apache access log.

open (IN, "PATH_TO_YOUR_APACHE_LOG") or die "unable to open logfile";

while (my $l = <IN>) {
    # Combined log format: ip - - [date tz] "request" status size "referer" "agent"
    next unless $l =~ /([0-9]+\.[0-9]+\.[0-9]+\.[0-9]+) - - \[(.*) -0[0-9]00\] \"([^"]*)\" ([0-9]+) ([\-0-9]+) \"([^"]*)\" \"([^"]*)\"/;
    my $request = $3;
    my $referer = $6;
    next if $request =~ /^HEAD /;    # skip HEAD requests
    $request =~ s/^GET //;
    $request =~ s/HTTP\/1\.[01]//;
    if ($referer =~ /www\.google\.com/) {
        # num= is absent for the default SERP size of 10; test the match
        # directly so a failed match can't reuse a stale $1
        my $num = ($referer =~ /num=([0-9]+)/) ? $1 : 10;
        # start= is absent on page 1; assume the link sat mid-page
        my $start = ($referer =~ /start=([0-9]+)/) ? $1 : 0;
        $start += $num / 2;
        $scoll{$start}++;
        $ncoll{$num}++;
        $tnum++;
    }
}
close IN;

print "start: $tnum\n";
$step = 10;
$curstep = 0;
$cura = 0;
foreach $s (sort numerically keys %scoll) {
    while ($s > $curstep) {          # flush buckets up to this position
        $perz = int($cura / $tnum * 10000) / 100;
        print "$curstep $perz\n";
        $cura = 0;
        $curstep += $step;
    }
    $cura += $scoll{$s};
}
# flush the final bucket
$perz = int($cura / $tnum * 10000) / 100;
print "$curstep $perz\n";

print "size of returned pages\n";
foreach $s (sort numerically keys %ncoll) {
    $perz = int($ncoll{$s} / $tnum * 10000) / 100;
    print "$s $perz\n";
}

sub numerically { $a <=> $b; }

# edited, and then not tested, hope it still works ...

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved