Google SEO News and Discussion Forum

Google Patent - human editorial opinion
supporting editorial opinion in ranking search results
tedster

Msg#: 3060951 posted 1:36 am on Aug 26, 2006 (gmt 0)

Google was granted a new patent last Tuesday, August 22. The title is "System and method for supporting editorial opinion in the ranking of search results".

What, people would be involved? Yes indeed -- check this quote out, for example:

For each web page/site identified as favored and non-favored, the editors may determine an editorial opinion parameter for that site... For each web page in the result set that is associated with one of the web sites in the set of affected web sites, the server may determine an updated score using an editorial opinion parameter for that web site.

US Patent 7096214 [patft.uspto.gov]


 

KenB

Msg#: 3060951 posted 1:43 am on Aug 26, 2006 (gmt 0)

Maybe we could get a little more explanation as to what this patent is about. I can't tell if it is about adding editorial comments to SERP results or modifying something like PageRank by looking at whether a reference to a site was positive or negative.

Sometimes I wonder if Google doesn't file patents just to muddy the SEO waters.

digitalghost

Msg#: 3060951 posted 1:49 am on Aug 26, 2006 (gmt 0)

Definitely going to have to read this one when I don't have so much going on.

>> A server improves the ranking of search results. The server includes a processor and a memory that stores instructions and a group of query themes. The processor receives a search query containing at least one search term, retrieves one or more objects based on the at least one search term and determines whether the search query corresponds to at least one of the group of query themes. The processor then ranks the one or more objects based on whether the search query corresponds to at least one of the group of query themes and provides the ranked one or more objects to a user.

Huh? >> based on the at least one search

Rush to the patent office?
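
For reference, here's the abstract's central test -- "determines whether the search query corresponds to at least one of the group of query themes" -- as a toy Python sketch. The theme table and the matching rule are pure guesswork; the patent specifies neither.

    # Hypothetical stored query themes, each with a small vocabulary.
    QUERY_THEMES = {
        "cheap flights": {"flights", "airfare", "cheap", "tickets"},
        "mortgage rates": {"mortgage", "loan", "refinance", "rates"},
    }

    def matching_themes(query):
        """Return every stored theme whose vocabulary overlaps the query terms."""
        terms = set(query.lower().split())
        return [name for name, vocab in QUERY_THEMES.items() if terms & vocab]

    print(matching_themes("cheap tickets to boston"))  # ['cheap flights']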

UserFriendly

Msg#: 3060951 posted 1:50 am on Aug 26, 2006 (gmt 0)

I'm guessing it means that, should they want to, Google can hire teams of people to cleanse the results pages for valuable and popular keywords. Plus they could weight results in favour of Google-friendly companies.

How that would warrant a patent, I don't know. It would be a tiny change to the existing results code. Seems the US Patent Office will give out patents to anyone with a bit of cash.

KenB

Msg#: 3060951 posted 1:52 am on Aug 26, 2006 (gmt 0)

>> How that would warrant a patent, I don't know. It would be a tiny change to the existing results code. Seems the US Patent Office will give out patents to anyone with a bit of cash.

I think our patent system is so underfunded that it would give out a patent to anyone who put together a technical-sounding paper. Just look at some of the patents that have been invalidated in the past few years.

tedster

Msg#: 3060951 posted 2:21 am on Aug 26, 2006 (gmt 0)

Here's my current, and very simplistic, reading (toy code below) --

1. Get a bunch of people to find and rate really good websites and really spammy websites for certain searches.
2. Make their rating into a parameter.
3. Look at what the algo says the top results "should" be for a particular search.
4. See if that search is in one of the topic areas that has an editorial rating.
5. If so, look to see if there is some relationship to either the good guy list or the bad guy list.
6. Shift the search rankings according to whatever parameter the editors generated.
7. Serve the shifted results to the user.
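
Those seven steps fit in a few lines of toy Python. Everything the patent leaves open -- the site names, the parameter values, the multiplicative score shift -- is invented here for illustration.

    # Steps 1-2: human ratings condensed into an editorial opinion parameter.
    EDITORIAL = {
        "goodguy.example": 1.4,   # favored
        "badguy.example": 0.3,    # non-favored
    }
    # The topic areas that have editorial ratings (step 4).
    THEMED_QUERIES = {"widget reviews"}

    def serve(query, algo_results):
        """algo_results: (site, score) pairs as the algo alone ranks them (step 3)."""
        if query not in THEMED_QUERIES:                       # step 4
            return algo_results
        shifted = [(site, score * EDITORIAL.get(site, 1.0))   # steps 5-6
                   for site, score in algo_results]
        return sorted(shifted, key=lambda r: r[1], reverse=True)  # step 7

    # The algo alone puts badguy.example first; the editors' parameter
    # demotes it to last and promotes goodguy.example to the top.
    print(serve("widget reviews", [("badguy.example", 7.0),
                                   ("neutral.example", 6.0),
                                   ("goodguy.example", 5.0)]))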

digitalghost

Msg#: 3060951 posted 2:26 am on Aug 26, 2006 (gmt 0)

What I find immediately notable, if I'm interpreting this correctly, is that it breaks with Google's policy of processing everything via algorithm, and introduces a defined human element.

shri

Msg#: 3060951 posted 2:31 am on Aug 26, 2006 (gmt 0)

>> introduces a defined human element.

Matt Cutts blogged that, or mentioned it in a post, about a year or so ago (and maybe mentioned it at a PubCon). I think the gist of it was "if it can be made scalable, human elements will be used".

Yeah, I'm probably misquoting him, but I do recall his response being vague enough that a misquote does not matter. :)

And then there was the webspam leak from a few months ago involving that process paper / students in Germany (?).

digitalghost

Msg#: 3060951 posted 2:33 am on Aug 26, 2006 (gmt 0)

With almost unlimited resources, adding some carbon based units is scalable. ;)

europeforvisitors



 
Msg#: 3060951 posted 3:03 am on Aug 26, 2006 (gmt 0)

>> What I find immediately notable, if I'm interpreting this correctly, is that it breaks with Google's policy of processing everything via algorithm, and introduces a defined human element.

They've always used a human element. PageRank, Google's original Unique Selling Proposition, is a formula based on humans' decisions (in theory, editorial decisions) when linking.

I wonder if this isn't related to the Google manual for spam raters that was leaked by a Dutch SEO blog last year, and which has been mentioned a few times in this forum?

KenB

Msg#: 3060951 posted 3:05 am on Aug 26, 2006 (gmt 0)

>> With almost unlimited resources, adding some carbon based units is scalable.

Pigeons are really cheap carbon-based units, and if I recall correctly, PigeonRank was the real meaning of the acronym "PR". ;)

digitalghost

Msg#: 3060951 posted 3:09 am on Aug 26, 2006 (gmt 0)

EFV, Google has repeatedly stated that they prefer 'automated' solutions. This is the FIRST time they've broken with that, IF I'm reading that patent correctly.

It's quite obvious that any system created by humans can't entirely eliminate the 'human' element. But adding active human editors is a switch from their previous PR.


born2drv

Msg#: 3060951 posted 3:22 am on Aug 26, 2006 (gmt 0)

Sounds to me like they may be considering offering a VOTE feature: YES if you like it, NO if you don't...

I think this would be fantastic, especially for websites like mine that shine above most of my competition and have unrivaled product diversity and helpful content.

I would love a "ban" feature... i.e. "Don't ever show me this stupid webpage again"... because let's face it, when we are trying to find something difficult on the web we often do dozens of searches, and the last thing we want to see is the same results from before.

They could even take the voting concept one step further by trying to establish the authority of the voters, kind of like PageRank now... for example, they could have paid web spam reporters at the Googleplex... and then, based on how the votes of ordinary Google users compared to those of their own trusted spam reporters, they could give higher authority to the users who find the best spam at the highest rate.
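
In toy form, that authority scheme might look like the sketch below. The verdicts, sites, and the agreement formula are all invented to illustrate the concept.

    # Verdicts from the hypothetical paid spam reporters at the Googleplex.
    trusted_verdicts = {"siteA": "spam", "siteB": "ok", "siteC": "spam"}

    def authority(user_votes):
        """Fraction of a user's votes that match the trusted raters' verdicts."""
        overlap = [s for s in user_votes if s in trusted_verdicts]
        if not overlap:
            return 0.0
        agree = sum(1 for s in overlap if user_votes[s] == trusted_verdicts[s])
        return agree / len(overlap)

    alice = {"siteA": "spam", "siteB": "ok", "siteD": "spam"}
    print(authority(alice))  # 1.0 -- full agreement on the sites both have judged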

tedster

Msg#: 3060951 posted 3:55 am on Aug 26, 2006 (gmt 0)

I just made an interesting find -- eWeek covered the new Google patent [googlewatch.eweek.com] on Wednesday, but their take on it was much more in a web 2.0, "social search" direction. My reading is more like what shri and europeforvisitors mentioned above -- the eval.google.com story from June 2005 [webmasterworld.com]

jk3210

Msg#: 3060951 posted 3:57 am on Aug 26, 2006 (gmt 0)

My take: It means that the scoring of a *page* can be modified by a prior human opinion of the entire *site*, even though a human has never rendered an opinion on the page in question.
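
In code form, that site-to-page inheritance is just this (a minimal sketch; the names and the multiplicative adjustment are hypothetical):

    # A human rated the *site*; no page was reviewed individually.
    SITE_OPINION = {"example.com": 0.5}

    def page_score(url, base_score):
        host = url.split("/")[2]          # crude host extraction, fine for a sketch
        return base_score * SITE_OPINION.get(host, 1.0)

    # A deep page no human ever saw still has its score halved:
    print(page_score("http://example.com/deep/page.html", 10.0))  # 5.0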

tedster

Msg#: 3060951 posted 4:01 am on Aug 26, 2006 (gmt 0)

Good observation, jk3210, and a potential loophole if that's the way it really is.

Robert Charlton

Msg#: 3060951 posted 4:04 am on Aug 26, 2006 (gmt 0)

>> ...introduces a defined human element.

Seems to me Google has done this with Google Co-Op, where they're using humans with "specific expertise" to suggest and tag authoritative sites. Other humans subscribe, and perhaps their usage patterns become a measure of quality. When results become good enough, they appear on the main page.

Strikes me that this is a hybrid people-seeded/algo-driven variant of TrustRank. The refinements to Google's health and medical results produced by this system are impressive.

Using a human-selected set of results as a seed, btw, goes back to Google's early days. I feel that they owe a lot of their initial quality to the directories, Yahoo and then ODP, that they were able to spider.

They're probably always looking for dependable seeding mechanisms to provide cores for TrustRank-type overlays to their algo.
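
A rough sketch of such a TrustRank-style overlay, assuming the textbook formulation (biased PageRank): trust is injected only at a small human-selected seed set and flows outward along links, decaying at each hop. The toy graph and damping value are invented.

    links = {            # page -> pages it links to
        "seed": ["a", "b"],
        "a": ["c"],
        "b": ["c"],
        "c": [],
    }
    DECAY = 0.85
    trust = {page: 0.0 for page in links}
    trust["seed"] = 1.0

    for _ in range(20):  # iterate toward the fixed point
        nxt = {page: (1 - DECAY if page == "seed" else 0.0) for page in links}
        for page, t in trust.items():
            for target in links[page]:
                nxt[target] += DECAY * t / len(links[page])
        trust = nxt

    print(sorted(trust.items(), key=lambda kv: -kv[1]))  # seed highest, then c, a, b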


smells so good

Msg#: 3060951 posted 4:07 am on Aug 26, 2006 (gmt 0)

>> and the last thing we want to see is the same results from before

Not to stray too far OT, but I got a sale from someone doing just that -- according to him, for the last three years. He was waiting for the right time. There are many valid reasons to search for a page.

Carbon-based input would be welcomed. I haven't read the patent yet; does it include an automated system to deal with site owners?

colin_h



 
Msg#: 3060951 posted 4:55 am on Aug 26, 2006 (gmt 0)

Sounds a bit like Garry Kasparov vs IBM's Deep Blue to me. "It doesn't play like a computer, it feels as though it has been guided by a human hand".

I can hear it now ... "why don't they release the logs?"

;-)

Bewenched

Msg#: 3060951 posted 5:21 am on Aug 26, 2006 (gmt 0)

I wonder if it's going to be something like ranking by searchers. If so... it's already being done by other companies, so how can they patent that?

walkman



 
Msg#: 3060951 posted 6:17 am on Aug 26, 2006 (gmt 0)

Google long ago gave up on the algo alone; GoogleGuy even commented on that when the story broke about Google employing students to rate sites.

They could simply put a link next to each result and let USERS evaluate whether the site matched the search term: if you searched for "Lexus reviews," was the link you just clicked on useful? That may be confusing and maybe intrusive, but who knows... maybe do it only for the people who have a Google account, or who agree in advance to "help" Google.

If this is a new idea, can I get #1 rank for 12 months ;)?

netchicken1

Msg#: 3060951 posted 7:32 am on Aug 26, 2006 (gmt 0)

In the Google Sitemaps section, they have been showing three choices for rating how useful the tools are -- Good / Okay / Poor (I think).

I imagine that those simple smiley faces could easily be added to the search results so that people could rate the pages they accessed.

It would be a wonderful addition to the results, cutting out the AdSense scammers by making it possible to rank them down.

If you couple that with the SERPs, then people would have a tool to help rank sites on a scale that would be too big for individuals to cheat.

norton j radstock

Msg#: 3060951 posted 8:14 am on Aug 26, 2006 (gmt 0)

This is a really significant move on the part of Google; in essence it is similar to teaching a computer to recognise your handwriting: you don't need to instruct it on every piece of information, but rather just give it a big enough sample.

So how might this work? Let's say a team of human editors is scoring sites according to the Google manual, and so-called 'thin affiliate' or scraper sites are being given low scores. The computer then works out what the common factors are for those low-scoring sites and makes that part of the algorithm.

Thus in simple terms it might calculate, for example, that scraper sites are commonly composed in the form of a directory of links followed by 20-30 words of text and a line break, all repeated several times, and with a high proportion of those links pointing at similar affiliate URLs. Other sites sharing similar characteristics would then be marked down in the ranking of search results.

Inevitably there will be good sites that fall victim to such systems, but the ultimate aim is continual improvement aided by the ongoing human input. The important thing is that the humans are unbiased, so it is far more likely that Google will continue to employ trained expert assessors, rather than leave it up to the web at large.
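
In toy form, that learn-from-the-raters loop might look like the sketch below: nearest-centroid classification over two made-up features (links per 100 words, share of affiliate links). A real system would learn from far richer signals; every number here is invented.

    rated = [  # (features, label from a human rater)
        ((45, 0.80), "scraper"), ((50, 0.90), "scraper"), ((38, 0.70), "scraper"),
        ((4, 0.00), "good"),     ((9, 0.10), "good"),     ((6, 0.05), "good"),
    ]

    def centroid(label):
        """Average feature vector of all sites the raters gave this label."""
        pts = [f for f, l in rated if l == label]
        return tuple(sum(dim) / len(pts) for dim in zip(*pts))

    def classify(features):
        """Label an unrated site by whichever centroid it sits closest to."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(("scraper", "good"), key=lambda l: dist(features, centroid(l)))

    print(classify((41, 0.75)))  # 'scraper' -- it shares the low-rated sites' shape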

fjpapaleo

Msg#: 3060951 posted 8:34 am on Aug 26, 2006 (gmt 0)

Does this mean we can all stop being link wh*res now?

idolw

Msg#: 3060951 posted 9:00 am on Aug 26, 2006 (gmt 0)

Isn't it similar to TrustRank? That one also talked about sites being assessed by humans.

I am just wondering how they are going to combine human marks with off-site stuff.

I wish we could get access to one of their internal papers for assessors ;)

soapystar

Msg#: 3060951 posted 9:31 am on Aug 26, 2006 (gmt 0)

Excuse me, but isn't ranking a site highly because another human placed a link to you on their site basically just human ranking anyway?... So now it's like them saying: we still want humans to pick out the best sites, but now we are using humans that have their own trust ranking... i.e. paid or selected!

idolw

Msg#: 3060951 posted 9:39 am on Aug 26, 2006 (gmt 0)

IMO it is cool, soapystar.
We can concentrate more on building cool sites.

Anyway, the patent was filed in 2000, so it has already been implemented by G.

norton j radstock

Msg#: 3060951 posted 9:44 am on Aug 26, 2006 (gmt 0)

You can find the leaked paper by searching for "Spam Recognition Guide for Raters"

lammert

Msg#: 3060951 posted 10:56 am on Aug 26, 2006 (gmt 0)

>> I imagine that those simple smiley faces could easily be added to the search results so that people could rate the pages they accessed.

Actually, Google did this already a long time ago. If you use the Google toolbar, you'll see two faces. One is to "vote for this page" and the other to "vote against this page".

The patent in question is not new; it was filed in December 2000. I don't know when the first toolbar was available from Google, but I think we should see this patent in the context of the technology used in 2000, rather than the current anti-spam measures, eval.google.com, or the spam rater guide, which are all from a later date. All the talk about web 2.0 is not relevant; this patent is six years old.

<added>
Some searching at www.archive.org showed the following:

toolbar.google.com was introduced in February 2001, just two months after the filing of this patent. The vote buttons were first present in October 2002.
</added>

mfishy

Msg#: 3060951 posted 11:33 am on Aug 26, 2006 (gmt 0)

These patents have become amusing.

"scalable" = "cheap" so I would expect this to be done thousands of miles away from the phd filled offices in MV, most likely in a place where English is not the first language.

It wouldn't be very surprising if google started using human review more than they already do. Somewhere along the line, google seems to have completely forgotten about the core concept of relevancy though, so, as a user, I would far prefer to see the odd crap result than 5/10 pages being barely relevant to the search.
