
Google SEO News and Discussion Forum

New Google Patent Details Many Google Techniques
msgraph
WebmasterWorld Senior Member 10+ Year Member
Msg#: 28814 posted 3:47 pm on Mar 31, 2005 (gmt 0)

Probably one of the best bits of information released by them in a patent.

Large number of inventors listed on here, even Matt Cutts, that guy who attends those SE conferences. It explains a bit of what is already known through experience, as well as through comments made by search engine representatives.

Example:


[0039] Consider the example of a document with an inception date of yesterday that is referenced by 10 back links. This document may be scored higher by search engine 125 than a document with an inception date of 10 years ago that is referenced by 100 back links because the rate of link growth for the former is relatively higher than the latter. While a spiky rate of growth in the number of back links may be a factor used by search engine 125 to score documents, it may also signal an attempt to spam search engine 125. Accordingly, in this situation, search engine 125 may actually lower the score of a document(s) to reduce the effect of spamming.
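
To make the comparison in [0039] concrete, here's a toy sketch. The numbers come from the quoted example, but the "rate of link growth" formula and everything else is invented for illustration; the patent gives no formula.

// Links gained per day since inception: one naive reading of
// "rate of link growth" in [0039].
interface Doc {
  ageDays: number;    // days since inception date
  backLinks: number;  // back links known to the engine
}

const linkGrowthRate = (d: Doc): number => d.backLinks / d.ageDays;

const fresh: Doc = { ageDays: 1, backLinks: 10 };     // yesterday, 10 links
const older: Doc = { ageDays: 3650, backLinks: 100 }; // 10 years, 100 links

console.log(linkGrowthRate(fresh)); // 10 links/day
console.log(linkGrowthRate(older)); // ~0.03 links/day
// By this measure the day-old document wins, exactly as [0039] says,
// unless the spike looks like link spam, in which case the score may
// be lowered instead.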

USPTO version [appft1.uspto.gov]

< Note: the USPTO has at times either moved or removed this
patent. If that happens again, here's an online back-up copy:
Information retrieval based on historical data [webmasterwoman.com]>

[edited by: tedster at 3:04 am (utc) on April 10, 2008]

 

airpal
10+ Year Member
Msg#: 28814 posted 5:09 pm on Mar 31, 2005 (gmt 0)

No wonder Google seemed stale for about a year: some of the stupidest assumptions in this document were actually acted upon and implemented!
This is a perfect example of making nothing out of something...

"Certain signals may be used to distinguish between illegitimate and legitimate domains. For example, domains can be renewed up to a period of 10 years. Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain and, thus, the documents associated therewith."

Oh yeah, I've also never seen the words "spammy, illegitimate, and legitimate" used more times in a single document.

[edited by: airpal at 5:23 pm (utc) on Mar. 31, 2005]

madmatt69
WebmasterWorld Senior Member 10+ Year Member
Msg#: 28814 posted 5:11 pm on Mar 31, 2005 (gmt 0)

Yeah - those are some crazy assumptions. I always register one year at a time, just because I find it saves money. Now I wonder if there's any way we can actually do some tests to see if they are weighting sites based on some of these factors?

mrMister
WebmasterWorld Senior Member 5+ Year Member
Msg#: 28814 posted 5:13 pm on Mar 31, 2005 (gmt 0)

For example, domains can be renewed up to a period of 10 years. Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain and, thus, the documents associated therewith.

LOL, this is going to spark mass hysteria amongst search engine optimisers.

Please excuse me while I run off to buy a load of shares in domain registration companies.

Green2K
10+ Year Member
Msg#: 28814 posted 5:16 pm on Mar 31, 2005 (gmt 0)

This all ties into one of the most important comments GoogleGuy ever made:

"signals of quality"

[webmasterworld.com...]

"We've definitely been working to incorporate new signals of quality and improve the way that we rank pages"....

michael heraghty
10+ Year Member
Msg#: 28814 posted 5:38 pm on Mar 31, 2005 (gmt 0)

Confirms what had been rumours ever since Florida...

However, it's not just the age of links; the age of just about everything you can think of is now a factor that carries weight in the algorithm:

freshness of a link associated with the document is based on at least one of a date of appearance of the link, a date of a change to the link, a date of appearance of anchor text associated with the link, a date of a change to anchor text associated with the link, a date of appearance of a linking document containing the link, and a date of a change to a linking document containing the link
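
One plausible reading of that claim, sketched in code (the field names and the "most recent event wins" rule are assumptions; the claim only lists the dates):

// "Freshness" of a link as the most recent of the event dates the
// claim enumerates.
interface LinkHistory {
  linkAppeared: Date;
  linkChanged?: Date;
  anchorTextAppeared: Date;
  anchorTextChanged?: Date;
  linkingDocAppeared: Date;
  linkingDocChanged?: Date;
}

function linkFreshness(h: LinkHistory): Date {
  const dates = [
    h.linkAppeared, h.linkChanged,
    h.anchorTextAppeared, h.anchorTextChanged,
    h.linkingDocAppeared, h.linkingDocChanged,
  ].filter((d): d is Date => d !== undefined);
  return new Date(Math.max(...dates.map(d => d.getTime())));
}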

Itagnc
10+ Year Member
Msg#: 28814 posted 5:54 pm on Mar 31, 2005 (gmt 0)

It appears to me that this partially validates the sandbox theory...

rocco
10+ Year Member
Msg#: 28814 posted 6:07 pm on Mar 31, 2005 (gmt 0)

great find!

however, it seems to be mainly a list of everything that *COULD* be implemented one day, filed so the ideas are protected.

the huge list might also bury the few really important elements, so SEOs get distracted from what actually matters.

cabbie
WebmasterWorld Senior Member 10+ Year Member
Msg#: 28814 posted 6:08 pm on Mar 31, 2005 (gmt 0)

thanks for that Msgraph.
I haven't grasped it all by any means, but I think I'd better go update all my sites.

fischermx
WebmasterWorld Senior Member 5+ Year Member
Msg#: 28814 posted 6:16 pm on Mar 31, 2005 (gmt 0)


Yeah - those are some crazy assumptions. I always register one year at a time, just because I find it saves money. Now I wonder if there's any way we can actually do some tests to see if they are weighting sites based on some of these factors?

Jeeez! I could say that may be true.
I've owned a domain for seven years, paying 2-3 years in advance. It has no backlinks, and its only page contains a gif and a paragraph.
It even has PR1!
How can one explain that?

fischermx
WebmasterWorld Senior Member 5+ Year Member
Msg#: 28814 posted 6:28 pm on Mar 31, 2005 (gmt 0)


16. The method of claim 15, wherein the scoring the document includes assigning a higher score to the document when the document is selected more often than other documents in the set of search results over a time period.

How can this be done?
Is google going to implement 302 exit link counters?
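
Whatever the logging mechanism turns out to be, the scoring side of claim 16 could be as simple as the following sketch. The comparison against a mean CTR and the multiplier form are both invented; the claim only says "selected more often than other documents":

// Boost a document that gets clicked more than its peers in the same
// result set over some time window.
function ctrBoost(clicks: number, impressions: number, peerMeanCtr: number): number {
  const ctr = clicks / Math.max(impressions, 1);
  // Multiplier >= 1: only documents above the peer average get a lift.
  return ctr > peerMeanCtr ? 1 + (ctr - peerMeanCtr) : 1;
}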

Green2K
10+ Year Member
Msg#: 28814 posted 6:33 pm on Mar 31, 2005 (gmt 0)

This is superb for fuelling everyone's paranoia. Ranking based on a site's outbound affiliate links? Potentially!

Here:

[0090] Additionally, or alternatively, search engine 125 may monitor time-varying characteristics relating to "advertising traffic" for a particular document. For example, search engine 125 may monitor one or a combination of the following factors: (1) the extent to and rate at which advertisements are presented or updated by a given document over time; (2) the quality of the advertisers (e.g., a document whose advertisements refer/link to documents known to search engine 125 over time to have relatively high traffic and trust, such as amazon.com, may be given relatively more weight than those documents whose advertisements refer to low traffic/untrustworthy documents, such as a pornographic site); and (3) the extent to which the advertisements generate user traffic to the documents to which they relate (e.g., their click-through rate). Search engine 125 may use these time-varying characteristics relating to advertising traffic to score the document.
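
[0090] lists three factors but says nothing about how they would be combined. Purely as a sketch, a linear blend with invented weights:

// Toy combination of the advertising-traffic factors in [0090].
interface AdTraffic {
  adUpdateRate: number;       // (1) how often ads change, 0..1
  advertiserQuality: number;  // (2) trust of the advertised sites, 0..1
  adClickThroughRate: number; // (3) CTR of the ads, 0..1
}

const adSignal = (a: AdTraffic): number =>
  0.2 * a.adUpdateRate + 0.5 * a.advertiserQuality + 0.3 * a.adClickThroughRate;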

graywolf
WebmasterWorld Senior Member 10+ Year Member
Msg#: 28814 posted 6:34 pm on Mar 31, 2005 (gmt 0)

[0077] The dates that links appear can also be used to detect "spam," where owners of documents or their colleagues create links to their own document for the purpose of boosting the score assigned by a search engine. A typical, "legitimate" document attracts back links slowly. A large spike in the quantity of back links may signal a topical phenomenon (e.g., the CDC web site may develop many links quickly after an outbreak, such as SARS), or signal attempts to spam a search engine (to obtain a higher ranking and, thus, better placement in search results) by exchanging links, purchasing links, or gaining links from documents without editorial discretion on making links. Examples of documents that give links without editorial discretion include guest books, referrer logs, and "free for all" pages that let anyone add a link to a document.
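
The spike-versus-steady distinction in [0077] could be as simple as comparing recent gains against a document's own history. A sketch, with an invented threshold:

// Flag a document whose latest weekly back-link gain jumps far above
// its own recent average.
function isLinkSpike(weeklyNewLinks: number[], factor = 5): boolean {
  if (weeklyNewLinks.length < 2) return false;
  const latest = weeklyNewLinks[weeklyNewLinks.length - 1];
  const history = weeklyNewLinks.slice(0, -1);
  const avg = history.reduce((sum, n) => sum + n, 0) / history.length;
  return latest > factor * Math.max(avg, 1);
}
// Per [0077], the flag alone doesn't decide direction: a spike can be
// a genuine topical event (the CDC/SARS example) or a spam attempt.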

ddogg
10+ Year Member
Msg#: 28814 posted 6:38 pm on Mar 31, 2005 (gmt 0)

This document confirms they are using the toolbar to rank pages. They are also using AdWords data and the CTR of organic listings to score documents. Interesting.

I hope they haven't implemented most of this stuff yet, because if they have, they did a poor job.

WebGuerrilla
WebmasterWorld Senior Member 10+ Year Member
Msg#: 28814 posted 6:51 pm on Mar 31, 2005 (gmt 0)

63. The method of claim 62, wherein adjusting the ranking includes penalizing the ranking if the link churn is above a threshold.

Game on.
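
Read literally, claims 62 and 63 amount to something like the following sketch (the churn definition and the threshold are both guesses; the claims specify neither):

// Churn as the fraction of a page's outbound links that changed
// between two crawls, penalized above some threshold.
function linkChurn(previous: Set<string>, current: Set<string>): number {
  const union = new Set([...previous, ...current]);
  let unchanged = 0;
  for (const url of previous) if (current.has(url)) unchanged++;
  return union.size === 0 ? 0 : 1 - unchanged / union.size;
}

const CHURN_THRESHOLD = 0.5; // invented
const penalized = (churn: number): boolean => churn > CHURN_THRESHOLD;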

zjacob
Msg#: 28814 posted 6:53 pm on Mar 31, 2005 (gmt 0)

I read some time ago that G tries on the order of 12 new scoring criteria each month for its search results, and I guess this paper gives an indication of the type of things they are testing.

The next time the SERPs change dramatically or there's an update, it will be time to come back to this paper and look for possible reasons.

One of the interesting things in the paper is that the methodology does not exclude "evergreen" content from scoring high in SERPs:

"20. The method of claim 19, wherein the scoring the document includes: determining whether stale documents are considered favorable for a search query when the document is determined to be stale, and scoring the document based, at least in part, on whether stale documents are considered favorable for the search query when the document is determined to be stale. "

MikeNoLastName
WebmasterWorld Senior Member 10+ Year Member
Msg#: 28814 posted 7:08 pm on Mar 31, 2005 (gmt 0)

>"and (3) the extent to which the advertisements generate user traffic to the documents to which they relate (e.g., their click-through rate). "

so therefore:
high Adsense CTR = high SERPs ;)

mayor
WebmasterWorld Senior Member 10+ Year Member
Msg#: 28814 posted 7:13 pm on Mar 31, 2005 (gmt 0)

Based on my own process control experience, you quickly reach a point of diminishing returns as the number of control variables increases. I have seen attempts at incorporating too many control variables yield unstable or unresponsive process controls that defy both control and analysis.

fischermx
WebmasterWorld Senior Member 5+ Year Member
Msg#: 28814 posted 7:16 pm on Mar 31, 2005 (gmt 0)

Don't worry, the guys have PhDs, they <must> know what they're doing; otherwise, they'd be asking their schools for their money back. :)

carguy84
WebmasterWorld Senior Member 5+ Year Member
Msg#: 28814 posted 7:21 pm on Mar 31, 2005 (gmt 0)

How can this be done?
Is google going to implement 302 exit link counters?

You can do it with JavaScript; you don't need an intermediary redirect page to log it.
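
Roughly what that looks like: fire a tracking request from a click handler instead of routing the result through a redirect URL. A sketch; the /log endpoint is hypothetical:

// The classic image "ping": logs the click without delaying or
// redirecting the navigation.
document.querySelectorAll<HTMLAnchorElement>("a").forEach(link => {
  link.addEventListener("mousedown", () => {
    new Image().src = "/log?u=" + encodeURIComponent(link.href);
  });
});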

designhaus
10+ Year Member
Msg#: 28814 posted 5:32 pm on Mar 31, 2005 (gmt 0)

great find and post msgraph. this patent confirms some suspicions i have had for months.

chicagohh
10+ Year Member
Msg#: 28814 posted 7:31 pm on Mar 31, 2005 (gmt 0)

62. The method of claim 61, wherein the indication of link churn is computed as a function of an extent to which one or more links provided by the linking document change over time.

63. The method of claim 62, wherein adjusting the ranking includes penalizing the ranking if the link churn is above a threshold.

You've got to be kidding...

Atticus
Msg#: 28814 posted 7:34 pm on Mar 31, 2005 (gmt 0)

I have always thought of SEO as simply signaling the subject of pages through accurate titles, text, anchor text, etc. (And scored top 10 more often than not).

G has said both implicitly and explicitly that trying to game the system will get you in trouble.

Now we see that they have considered using everything from length of domain registration to your star sign in determining placement in the SERPs. It seems that one would have to game the system to survive in this environment.

Did anyone ever tell G that if it looks like a duck and quacks like a duck, there's a real good chance that it's a duck, even if it doesn't sport a trusty "Shop at Amazon" banner on its backside?

On a more serious note, I manage some domains for local clients and usually renew the domain names on an annual basis, bill the client and repeat as necessary. That's just the most logical and reasonable way for me to do it. These are real brick and mortar businesses and I see no reason why annually renewing their domains should make these domains look 'shady.'

mrMister
WebmasterWorld Senior Member 5+ Year Member
Msg#: 28814 posted 7:42 pm on Mar 31, 2005 (gmt 0)

How can this be done?
Is google going to implement 302 exit link counters?

They've been doing this for a long time. On some data centres (that's all they need, a representative sample), Google results links are passed through a redirect script.

Spine
10+ Year Member
Msg#: 28814 posted 7:48 pm on Mar 31, 2005 (gmt 0)

Wow! Those PhDs sure are clever!

That must be why 'britney spears nude' pages are all over the first page for many terms that have nothing to do with her or nudity.

What part of the patent reads "crappy computer generated networks of spam are as good as authority sites, and their owners deserve to be rich"?

Such genius, such talent.

Atticus
Msg#: 28814 posted 7:53 pm on Mar 31, 2005 (gmt 0)

Too many cooks.

Corporate Google is killing the goose that laid the golden eggs.

Maybe G should ditch the PhDs in favor of kindergarten teachers.

fischermx
WebmasterWorld Senior Member 5+ Year Member
Msg#: 28814 posted 7:53 pm on Mar 31, 2005 (gmt 0)


You can do it with javascript, you don't need an intermediary redirect page to log it.

Oops, how stupid of me; yes, of course, I forgot, sorry.

But are they already tracking clicks on the SERPs?

walkman
Msg#: 28814 posted 7:55 pm on Mar 31, 2005 (gmt 0)

Buy sitewide links for your competitors. Ready, set, go!

Atticus
Msg#: 28814 posted 8:01 pm on Mar 31, 2005 (gmt 0)

walkman,

Excellent point. Another hole in the "can't be hurt by competitors" claim.

Brett_Tabke
WebmasterWorld Administrator; brett_tabke is a WebmasterWorld Top Contributor of All Time; 10+ Year Member
Msg#: 28814 posted 8:03 pm on Mar 31, 2005 (gmt 0)

I'm not surprised by what's in the doc, but am surprised they feel it is patent worthy.
