
Google's Reasonable Surfer Model

   
5:22 pm on Oct 25, 2010 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I was just reading through some information on this model - thanks to Tedster for sharing this neat article on it: [seobythesea.com...]

Though I have my doubts about whether it is a good model for combating paid links, it seems to be a fairly good model for assigning weights to links. Under this model, no two links on a page will get the same share of link juice. The share of link juice a link can attract is influenced by various factors, some of which Bill Slawski has explained well.

The Reasonable Surfer model is Google's new way of looking at links. The patent was filed by Google in 2004, and they have probably implemented it already.
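To make the idea concrete, here is a minimal Python sketch of how feature-based link weighting could work. The feature names and weight values are made up for illustration - they are not from the patent - but they show how two links on the same page end up with different shares of link juice:

```python
# Hypothetical feature weights -- illustration only, not Google's values.
FEATURE_WEIGHTS = {
    "in_main_content": 3.0,
    "in_footer": 0.3,
    "anchor_is_commercial": 0.5,
    "font_size_large": 1.5,
}

def link_score(features):
    """Multiply together the weights of the features a link exhibits."""
    score = 1.0
    for name in features:
        score *= FEATURE_WEIGHTS.get(name, 1.0)
    return score

def distribute_juice(page_juice, links):
    """Split a page's outgoing juice proportionally to each link's score."""
    scores = [link_score(f) for f in links]
    total = sum(scores)
    return [page_juice * s / total for s in scores]

# Two links on the same page get very different shares:
shares = distribute_juice(1.0, [
    ["in_main_content", "font_size_large"],   # prominent editorial link
    ["in_footer", "anchor_is_commercial"],    # buried, ad-like link
])
print(shares)  # e.g. [0.967..., 0.032...]
```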

What are your thoughts on this model? Do you think it is already in force, or did Google implement it in full in May 2010? That was when we saw a lot of noise about lost traffic here.
1:52 am on Oct 26, 2010 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I agree with you that the core purpose of the Reasonable Surfer model is not to combat paid links. However, I think it details many finer distinctions between types of links than the simple mental model many people use (nav - content - sidebar - footer).

For example, note that the patent talks about weighting all internal links differently than external links. It also differentiates links to other hostnames (subdomains) from both internal and external links - see the sketch after the quote:

Examples of features associated with a target document might include...

3. Whether the URL of the target document is on the same host as the URL of the source document;
4. Whether the URL of the target document is associated with the same domain as the URL of the source document;
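
To illustrate features 3 and 4, here is a rough Python sketch of how one might classify a link as same-host, same-domain, or external. The registrable-domain logic is deliberately naive - a real implementation would use a public-suffix list - so treat this as an illustration of the distinction, not production code:

```python
from urllib.parse import urlparse

def registrable_domain(host):
    # Naive: keep the last two labels ("www.example.com" -> "example.com").
    # Breaks on suffixes like ".co.uk" -- illustration only.
    return ".".join(host.split(".")[-2:])

def classify_link(source_url, target_url):
    src = urlparse(source_url).hostname
    tgt = urlparse(target_url).hostname
    if src == tgt:
        return "internal (same host)"
    if registrable_domain(src) == registrable_domain(tgt):
        return "same domain, different host (e.g. a subdomain)"
    return "external"

print(classify_link("http://www.example.com/a", "http://www.example.com/b"))
print(classify_link("http://www.example.com/a", "http://blog.example.com/"))
print(classify_link("http://www.example.com/a", "http://other.org/"))
```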
3:03 am on Oct 26, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've been working on my sites for more than a year ... on-site only ... and I've been reading this patent. I made a lot of changes based on this patent, but it did not help at all.

Get backlinks, backlinks, and again backlinks, and you will see positive results faster than with on-site optimization.
3:03 am on Oct 26, 2010 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



3. Whether the URL of the target document is on the same host as the URL of the source document;


I thought it meant link differentiation based on the IP address on which the target document is hosted, irrespective of whether it is on the same domain (or subdomain) or a different domain.

Large sites on the web have multiple IP addresses, and I thought Google wants to differentiate them. But I am not sure whether it gives them extra points or Google treats them on par with others...
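
For what it's worth, here is a small Python sketch of the kind of IP-level check being speculated about. Nothing in the patent confirms Google does this - it just shows how one could compare two hosts at the IP or C-class (/24) level:

```python
import socket

def c_class(ip):
    """First three octets of an IPv4 address: '93.184.216.34' -> '93.184.216'."""
    return ".".join(ip.split(".")[:3])

def ip_relation(host_a, host_b):
    ip_a = socket.gethostbyname(host_a)
    ip_b = socket.gethostbyname(host_b)
    if ip_a == ip_b:
        return "same IP"
    if c_class(ip_a) == c_class(ip_b):
        return "same C-class"
    return "different networks"

print(ip_relation("example.com", "example.org"))
```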
3:08 am on Oct 26, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There are C-class hosting services for SEOs. I think people did look into this patent :)
3:31 am on Oct 26, 2010 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Oh yes... Round Robin DNS load balancing, C-class hosting and all the other gimmicks :)
6:15 am on Oct 26, 2010 (gmt 0)



Well, in my opinion, although it has applications way beyond paid links, I'm sure that was a driving factor behind it. It seems like a much more flexible way of dealing with them than a straight penalty or discounting the link altogether.

I'm sure that, as much as Google would love every major site with paid advertising to use the nofollow tag they pioneered, that will never 100% be the case. Am I completely misunderstanding this, or could "Paid Linkyness" be one of those deciding factors that reduces the value of just those links on a page?

I'm sure Google could muster an argument about users not wanting to click on irrelevant paid links.
 
