
Google SEO News and Discussion Forum

Four new Google patents - April 2007
Traffic, Query Analysis, Link-Based Criteria, and Document Inception Date
tedster
msg:3322880
7:30 pm on Apr 26, 2007 (gmt 0)

I just learned about four new Google Search technology patents. I will be studying and posting about them -- feel free to get there ahead of me, as I'm not sure how soon I'll have the time to dig in. That first one looks particularly interesting, doesn't it?

1. Document Scoring Based on Traffic Associated with a Document [appft1.uspto.gov] [April 19, 2007 - Steve Lawrence]

2. Document Scoring Based on Query Analysis [appft1.uspto.gov] [April 19, 2007 - Jeffrey Dean]

3. Document Scoring Based on Link-Based Criteria [appft1.uspto.gov] [April 26, 2007 - Anurag Acharya]

4. Document Scoring Based on Document Inception Date [appft1.uspto.gov] [April 26, 2007 - Matt Cutts]

 

crates
msg:3328629
2:45 pm on May 2, 2007 (gmt 0)

A system, comprising: means for identifying a document that appears as a search result document for a plurality of discordant search queries; means for determining a score for the document; means for negatively adjusting the score for the document; and means for ranking the document with regard to at least one other document based, at least in part, on the negatively-adjusted score.

Okay, so that's what Google's patenting... let me ask you guys this:

Does this prevent other people from developing a system that does the exact same thing? Or only from developing a system that does the same thing in the same fashion they do it?

In other words, can other companies use (and patent) their own proprietary algorithms to score documents in a similar manner?
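
For what it's worth, the claim reads to me like the sketch below. This is only my reading of the claim language -- the "discordance" test and the thresholds are invented for illustration:

from itertools import combinations

def are_discordant(query_a, query_b):
    """Treat two queries as discordant if they share no terms at all."""
    return not (set(query_a.lower().split()) & set(query_b.lower().split()))

def adjust_score(base_score, queries, min_discordant_pairs=3, penalty=0.5):
    """Negatively adjust a document's score if it surfaces as a search
    result for a plurality of discordant queries (the claim's wording)."""
    discordant = sum(1 for a, b in combinations(queries, 2)
                     if are_discordant(a, b))
    if discordant >= min_discordant_pairs:
        return base_score * penalty  # "negatively adjusting the score"
    return base_score

# A document ranking for four unrelated queries gets demoted:
print(adjust_score(1.0, ["cheap flights", "dog food",
                         "php tutorial", "mortgage rates"]))  # -> 0.5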

mattg3
msg:3328632
2:51 pm on May 2, 2007 (gmt 0)

I am waiting for the highly complicated patent that can fgrep -v the PHPSESSID out of URLs... so no one has duplicate incoming links. That's a hard one. Gonna need a few PhDs minimum..
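
(Joking aside, a minimal sketch of that canonicalisation, using only the Python standard library and assuming PHPSESSID only ever appears as a query parameter:)

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_session_id(url, param="PHPSESSID"):
    """Drop the session ID from a URL's query string so that
    otherwise-identical URLs compare (and count) as one link target."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))

print(strip_session_id("http://example.com/page.php?id=7&PHPSESSID=abc123"))
# -> http://example.com/page.php?id=7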

mattg3
msg:3328641
2:58 pm on May 2, 2007 (gmt 0)

In other words, can other companies use (and patent) their own proprietary algorithms to score documents in a similar manner?

I love the one where they patented human input into search results... the next one will be using your eyes on the search results...

callivert
msg:3328936
6:24 pm on May 2, 2007 (gmt 0)

Does this prevent other people from developing a system that does the exact same thing

I can't see why others can't use similar techniques. After all, Latent Semantic Analysis is patented (and not by Google, Yahoo, or MSN), and everyone seems to feel it's okay to use "variants". Plus, who would know? Secret sauce and all that.
Anyway, how can you patent "ranking a site according to how much traffic it gets"? Alexa has been doing that for years.

mattg3
msg:3328978
7:14 pm on May 2, 2007 (gmt 0)

"ranking a site according to how much traffic it gets."

Did they mention the whole self-fulfilling prophecy thing in that patent? If it's on page 1, it tends to get more traffic, more links, more popularity. The whole art industry lives off someone telling others that green splatters in that variation are great..

Well, I guess they are clever enough to somehow normalise the traffic effect out.

tedster
msg:3328995
7:27 pm on May 2, 2007 (gmt 0)

If a URL ranks on page 1 and gets less traffic than the benchmark for such a position, it will fall. And that works in the opposite direction, too. I think a factor like this is in play when brand new pages and even new domains get a honeymoon period of excellent rankings, only to fall back quickly.
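
A sketch of that mechanism, with hypothetical benchmark numbers -- nobody outside Google knows the real ones:

# Rough expected click-through rate per SERP position on page 1
# (invented figures, for illustration only).
BENCHMARK_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                 6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

def traffic_signal(position, clicks, impressions):
    """Return a multiplier: below-benchmark traffic demotes, above promotes."""
    observed = clicks / impressions if impressions else 0.0
    expected = BENCHMARK_CTR.get(position, 0.01)
    return observed / expected  # < 1.0 means underperforming its slot

# A URL at position 1 getting only a 10% CTR vs the ~30% benchmark:
print(traffic_signal(1, clicks=100, impressions=1000))  # ~0.33 -> likely to fall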

mattg3
msg:3329001
7:32 pm on May 2, 2007 (gmt 0)

That's what I meant :)

Yet what an effort to have a norm (hopefully a mean with variance) for each search term...

It would probably be better to use return times adjusted for server speed.
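
Keeping a per-term norm is less storage than it sounds: a standard online update (Welford's algorithm, nothing Google-specific) maintains a running mean and variance in constant space per term. A sketch:

from collections import defaultdict

class RunningStats:
    """Welford's online algorithm: one pass, constant memory per term."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

per_term = defaultdict(RunningStats)
for term, ctr in [("widgets", 0.12), ("widgets", 0.18), ("widgets", 0.15)]:
    per_term[term].update(ctr)

s = per_term["widgets"]
print(round(s.mean, 3), round(s.variance, 5))  # 0.15 0.0009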

tedster
msg:3329016
7:58 pm on May 2, 2007 (gmt 0)

what an effort to have a norm (hopefully a mean with variance) for each search term...

I'm pretty sure Google's already been doing that work for quite a few years, and that they keep it fresh, too. Probably not on "every" search term as a separate data point, but on many high-volume searches -- and most likely classified by some system they evolved from significant parallels they noticed in their initial data set.

This kind of traffic data would also be important for them to measure user satisfaction with the SERPs at any time. If the current data after an algo tweak fell off from what the earlier benchmarks generated, that would be essential feedback.
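
As a toy illustration of that feedback loop (the names and the 10% tolerance are mine, not from the patents): compare each query's post-tweak metric against its stored benchmark and flag whatever fell off.

def regressions(benchmark, current, tolerance=0.10):
    """Queries whose current metric fell more than `tolerance` below benchmark."""
    return [q for q, base in benchmark.items()
            if current.get(q, 0.0) < base * (1 - tolerance)]

benchmark = {"buy widgets": 0.42, "widget repair": 0.31, "free widgets": 0.55}
current   = {"buy widgets": 0.41, "widget repair": 0.22, "free widgets": 0.54}

print(regressions(benchmark, current))  # ['widget repair'] -> essential feedback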

mattg3
msg:3329227
10:23 pm on May 2, 2007 (gmt 0)

On that topic, what is an average return-visitor percentage as measured in analytics? I have 30% in a relatively wide field, which I thought was OK. I had worse in the past, so the general trend was up. It was 25% at 85,000 people a day and 300,000 page impressions a day. I am not entirely sure how high that return rate has to be to be deemed a non-spam experience.

We are now at roughly 6 pages per visitor, and that's for trusted users who come in directly. It was 5.6 at high traffic. Not sure what Google wants... CTR decreased; that was the only value that decreased. More return visitors get ad blindness to your spots. Of these user-related values, only CTR and eCPM (which are obviously related) went down. So if the 950 on my site was user-related, it would more likely be linked to AdSense than to normal user satisfaction.

Interestingly, CTR started falling, down 60%, two weeks before the latest Google "freak event". I always wondered if Google, having so many AdWords to sell and having continuing profit increases of over 50%, doesn't rank the SERPs according to demand. Summer and spring are the times when students get less interested, so I assume the ad buyers know that too.

Ads are still 100% targeted..

I mean, how can you not work according to supply and demand...?

callivert
msg:3330132
8:18 pm on May 3, 2007 (gmt 0)

When were these patents submitted? A lot of these ideas have been thrown around here at WebmasterWorld, for example in the "signs of crap" thread. There were lots of ideas proposed for evaluating site quality. I distinctly remember discussions of tracking queries and click rates against a standard, for example.
They can't patent methods that have previously been described in a public forum.

tedster
msg:3330223
10:45 pm on May 3, 2007 (gmt 0)

callivert, I think these new patents are not just about the factors being measured (many are old news, as you mentioned) but more about the specific techniques involved and the methods of combining factors.

(Yes, IMO the US needs patent law reform - copyright law reform as well.)

potentialgeek
msg:3331079
6:55 pm on May 4, 2007 (gmt 0)

2. Query Analysis: which search result gets the click

Page titles and meta descriptions just got more important. Tip: write your title and description as if you were writing an AdWords ad.

p/g
