| 12:07 pm on Aug 26, 2006 (gmt 0)|
|19. The method of claim 17 further comprising: determining, for each query theme, one or more topics for determining whether a search query satisfies the respective query theme. |
20. The method of claim 19 wherein the one or more topics are selected from at least one hierarchical directory.
This sounds like using DMOZ, Google directory or Yahoo directory data to me. That is what Google did openly some years ago. Remember those category lines under the results in the SERPs when Google found the page in DMOZ?
| 1:38 pm on Aug 26, 2006 (gmt 0)|
one more thing: just because they filed and got a patent does NOT mean that they will use it. They probably filed it years ago when they thought it was a good idea, and now it may not be such a great idea. Of course, on the other hand it could still be :)
| 2:07 pm on Aug 26, 2006 (gmt 0)|
|Sounds to me they may be considering offering a VOTE feature: YES if you like it, NO if you don't... |
That already exists on the toolbar doesn't it?
| 3:46 pm on Aug 26, 2006 (gmt 0)|
I see it as a legal CYA. They now have been issued a patent that says they can remove some pages or sites from the serps by hand for any reason and can't be sued.
It probably adds the ability to use the human touch in many ways and amends their previous statements to reflect a growing and changing Google.
| 4:44 pm on Aug 26, 2006 (gmt 0)|
Strikes me that there are two keys to this patent concept.
1. The ability to systematically extrapolate from a very small human-judgment-based data set, applying the knowledge gained from that data set to a vastly larger set of documents and searches.
2. The ability to efficiently integrate the results of that part-human process into the rest of their algorithmic process, thereby enabling them to adjust billions of SERP results in real time.
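The first point -- extrapolating from a small set of human judgments to a vastly larger corpus -- is essentially a seeding problem. A minimal sketch of the idea, assuming nothing about Google's actual implementation (the scoring function, the seed lists, and the bag-of-words cosine similarity here are purely illustrative):

```python
from collections import Counter
import math

def vectorize(text):
    # Crude bag-of-words representation of a document.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two word-count vectors.
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def seed_score(doc, good_seeds, bad_seeds):
    # Score an unrated document by its similarity to small,
    # human-rated seed sets: positive means "closer to the good examples".
    v = vectorize(doc)
    good = max(cosine(v, vectorize(s)) for s in good_seeds)
    bad = max(cosine(v, vectorize(s)) for s in bad_seeds)
    return good - bad

# A handful of human judgments (hypothetical examples)...
good = ["in-depth lexus review with road test data and photos"]
bad = ["cheap lexus buy now click here buy cheap cheap"]

# ...applied to documents no human ever rated.
print(seed_score("detailed road test and review of the new lexus", good, bad))
print(seed_score("buy cheap lexus now click here", good, bad))
```

In practice the scaling step would use a real classifier trained on the seed labels rather than raw similarity, but the principle is the same: a handful of human ratings can be leveraged to score millions of unrated documents.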
This suggests it might be prudent to take Google at their word when they recommend that site owners and developers invest in improving the quality of their site content, rather than in SEO tricks and spam techniques designed to fool Google.
Whether you choose to take this advice or not, you've been forewarned. At a bare minimum, this patent award confirms that the advice isn't 100% disinformation -- it strongly suggests that Google is in fact investing in research and trying hard to improve its ability to differentiate between good sites/documents and fake or poor-quality sites/documents -- even if that requires relying on humans to some limited extent.
| 5:21 pm on Aug 26, 2006 (gmt 0)|
It's just the patent they started before they rolled Google Co-op out. No, it's nothing to do with the manual spam finders -- those people help retune the manual SERPs used to "train" the data mining that produces the actual algorithm. No, it's not abandoning "the algo" (whatever the heck people envision when they use that weird phrase).
It's just the completed patent that covers one more variable (Google Co-op) to throw in the data mining stew that mechanically cranks out the revised ranking formula on a regular basis.
| 5:45 pm on Aug 26, 2006 (gmt 0)|
Whether or not they intended it, this is a brilliant legal move!
With all these cases where people are suing because of ranking changes, Google can just pull out this patent and say "someone's editorial opinion ... first amendment ..."
| 6:39 pm on Aug 26, 2006 (gmt 0)|
They could recruit some trusted DMOZ editors for that job. I'm sure they'd do it for free!
| 7:11 pm on Aug 26, 2006 (gmt 0)|
|They could simply put a link on each link and let USERS evaluate if the site matched the search term: if you searched for "Lexus reviews," was the link you just clicked on useful? That may be confusing and maybe intrusive, but who knows... maybe do it only for the people who have a google account, or who agree in advance to "help" google. |
What would be better is this: say they do a search, click on a link, and click back to Google. Maybe they didn't find what they were looking for, so something pops up asking for feedback on why they bounced back to their search results. They should definitely have a Google account to do so.
| 7:45 pm on Aug 26, 2006 (gmt 0)|
Interesting, but people have been arguing for a long time over whether such an editorial team exists at Google; now it seems it will become reality.
The question is whether it will affect PageRank only or actual search results as well, and how Google can trust those editors' opinions 100% -- there will always be a subjective human factor.
| 7:55 pm on Aug 26, 2006 (gmt 0)|
|The question is whether it will affect PageRank only or actual search results as well, and how Google can trust those editors' opinions 100% -- there will always be a subjective human factor. |
PageRank itself is based on a subjective human factor: how people link.
If Google were to use paid editors to grade sites for seeding or benchmarking purposes, it could simply hire enough editors to cancel out the effects of individual editors' biases.
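The bias-cancellation argument is just the law of large numbers: if editors' individual biases are independent, averaging across a large panel washes them out. A toy sketch (the quality scale, bias range, and noise model are all made up for illustration):

```python
import random

random.seed(0)

def editor_rating(true_quality, bias, noise=0.5):
    # One editor's rating: the true quality, shifted by that editor's
    # personal bias plus some per-rating noise.
    return true_quality + bias + random.uniform(-noise, noise)

def panel_rating(true_quality, n_editors):
    # Average over a panel of editors whose biases are drawn
    # independently; individual biases tend to cancel out.
    biases = [random.uniform(-1, 1) for _ in range(n_editors)]
    return sum(editor_rating(true_quality, b) for b in biases) / n_editors

true_quality = 7.0
print(abs(panel_rating(true_quality, 3) - true_quality))     # small panel: large error likely
print(abs(panel_rating(true_quality, 1000) - true_quality))  # large panel: error shrinks
```

With independent biases, the error of the panel average shrinks roughly as 1/sqrt(n), so a large enough pool of editors drowns out any individual editor's taste.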
| 9:35 pm on Aug 26, 2006 (gmt 0)|
|PageRank itself is based on a subjective human factor: how people link. |
That's making a big and fallacious assumption: that it's PEOPLE making the links.
| 9:41 pm on Aug 26, 2006 (gmt 0)|
|That's making a big and fallacious assumption: that it's PEOPLE making the links. |
I don't think P. Steiner was being literal when he wrote: "On the Internet, nobody knows you're a dog."
| 12:07 am on Aug 27, 2006 (gmt 0)|
>> That's making a big and fallacious assumption: that it's PEOPLE making the links.
A link is not an indicator about what you think of the quality of the site. People link a fair bit to businesses that have ripped them off too...
| 12:53 am on Aug 27, 2006 (gmt 0)|
|A link is not an indicator about what you think of the quality of the site. People link a fair bit to businesses that have ripped them off too... |
Sure, and people link to sites that swap links with them, that have bought links from them, that they own, etc. But linking is still a "human factor" that influences PageRank and search results, which is another way of saying that Google has been influenced by human judgment (more specifically, human linking judgment) since the first Google server was housed in Lego blocks.
The use of human editors to rate seed sites, supply positive or negative examples, etc. is an evolutionary step, not something altogether new. And even if it were completely new, so what? The important thing is to have a scalable mechanism for assembling the best possible SERPs from crawled and indexed data.
| 2:56 pm on Aug 27, 2006 (gmt 0)|
Sometimes companies get patents not because they want to use them, but because they don't want others to.
| 2:56 pm on Aug 27, 2006 (gmt 0)|
"Editorial opinion" needn't relate to editors hired directly by google to clean up the serps. It could relate third-party data like inclusion in DMOZ, user reviews on Alexa or some realted sites, known spam host lists etc.
edit: sorry lammert, missed your earlier post stating much the same thing
| 3:06 pm on Aug 27, 2006 (gmt 0)|
|"Editorial opinion" needn't relate to editors hired directly by google to clean up the serps. It could relate third-party data like inclusion in DMOZ, user reviews on Alexa or some realted sites, known spam host lists etc. |
True, but outsourcing "editorial opinion" to third parties would simply perpetuate the problems they've had with PageRank, which can be manipulated by SEOs. And remember, they don't have to rate every site: They just need to rate enough sites to allow "seeding" and profiling.
| 8:49 pm on Aug 27, 2006 (gmt 0)|
This is about video. YouTube's search results are horrible, and there is no algo that can look inside a video and help to rank it.
It would be great if Google would use this to manually manipulate the SERPs and perhaps they will. But this is more about ranking videos.
| 3:06 pm on Aug 28, 2006 (gmt 0)|
Google applying for a patent for sorting search results based on user rating -- I guess they've never seen the "Sort By User Rating" option on most shopping sites, or sites like Digg, whose model is built around user input.
This is simply a glimpse at the absurdity of the patent system.
I'd love to chat, but I'm off to get the paperwork together for my patent for "A Text-Based Topical Information Exchange System for Web Design and Web Programming Enthusiasts and Professionals" -- this sounds like it could get quite a following.
I've also been toying with a "Chronological Web-Based Information Logging Framework", but I'm not sure if this has mass-market appeal.
| 4:01 am on Aug 29, 2006 (gmt 0)|
Looks to me like a blending of some future Google theme-query editorial team plus automated review of whether or not a site has a DMOZ or Yahoo listing.
Read through the patent if you haven't; it's a short read, and I see some interesting things in it.
If this is new, it probably signals things to come. A friend I've known for 19 years, now a search exec at Yahoo, told me two years ago that something like this would eventually happen. He spoke of using something like Epinions as an additional quality ranking factor on websites and pages.
The things outlined here make sense to combat all the MFA spam going on these days.
On the other hand, if the patent is truly six years old, then I agree that maybe it's just legal CYA.
| 6:22 am on Sep 4, 2006 (gmt 0)|
They should limit software or internet patents to 2 years instead of the normal 20 years. This patent by Google (plus hundreds of others) is just another attempt to put a stranglehold on development and to stifle creativity. Most of their patents are not novel ideas. With their powerful attorneys and deep pockets they can do it.