What, people would be involved? Yes indeed -- check this quote out, for example:
For each web page/site identified as favored and non-favored, the editors may determine an editorial opinion parameter for that site... For each web page in the result set that is associated with one of the web sites in the set of affected web sites, the server may determine an updated score using an editorial opinion parameter for that web site.
US Patent 7096214 [patft.uspto.gov]
19. The method of claim 17 further comprising: determining, for each query theme, one or more topics for determining whether a search query satisfies the respective query theme.
20. The method of claim 19 wherein the one or more topics are selected from at least one hierarchical directory.
This sounds like using DMOZ, Google Directory, or Yahoo Directory data to me. That is what Google did openly some years ago. Remember those category lines under the results in the SERPs when Google found the page in DMOZ?
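To make that claim language concrete, here's a minimal sketch of a score update driven by an editorial opinion parameter. The multiplicative adjustment and every name below are my own assumptions for illustration; the patent doesn't publish an actual formula.

```python
# Hypothetical sketch of the "editorial opinion parameter" idea quoted above.
# The multiplicative form and all names are assumptions, not the patent's math.

# Editor-assigned parameters: >1.0 for favored sites, <1.0 for non-favored.
editorial_opinion = {
    "favored-example.com": 1.25,
    "nonfavored-example.com": 0.60,
}

def updated_score(base_score: float, site: str) -> float:
    """Scale a result's score by its site's editorial opinion parameter;
    sites without an editorial rating are left untouched."""
    return base_score * editorial_opinion.get(site, 1.0)

# Toy result set: (site, raw relevance score) pairs, re-ranked.
results = [("favored-example.com", 0.70),
           ("neutral-example.com", 0.72),
           ("nonfavored-example.com", 0.80)]
reranked = sorted(((s, updated_score(sc, s)) for s, sc in results),
                  key=lambda pair: pair[1], reverse=True)
print(reranked)
```

Note how a site the editors never rated keeps its raw score, which is why a small human-judged set can coexist with billions of unrated pages.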
1. The ability to systematically extrapolate from a very small human-judgment-based data set, applying the knowledge gained from that data set to a vastly larger set of documents and searches (see the sketch after this list).
2. The ability to efficiently integrate the results of that part-human process into the rest of their algorithmic process, thereby enabling them to adjust billions of SERP results in real time.
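Here's a rough sketch of the first capability, assuming a simple text classifier trained on a handful of human quality judgments and then applied to unjudged documents. The data, features, and model choice are all invented for illustration; nothing says Google does it exactly this way.

```python
# A minimal sketch of extrapolating from a tiny set of human quality
# judgments to a much larger corpus. Data and model choice are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# A very small human-judged seed set: 1 = good site, 0 = poor/spam site.
seed_docs = [
    "in-depth review with original photos and measured test results",
    "comprehensive buying guide written by domain experts",
    "cheap pills buy now best price click here click here",
    "keyword keyword keyword doorway page redirecting to affiliate offers",
]
seed_labels = [1, 1, 0, 0]

vectorizer = TfidfVectorizer()
X_seed = vectorizer.fit_transform(seed_docs)

model = LogisticRegression()
model.fit(X_seed, seed_labels)

# The trained model can now score arbitrarily many unjudged documents.
unjudged = ["honest long-form comparison of ten sedans with photos",
            "click here best price pills pills pills"]
scores = model.predict_proba(vectorizer.transform(unjudged))[:, 1]
for doc, score in zip(unjudged, scores):
    print(f"{score:.2f}  {doc}")
```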
This suggests it might be prudent to take Google at their word when they recommend that site owners and developers invest in improving the quality of their site content rather than in SEO tricks and spam techniques designed to fool Google.
Whether you choose to take this advice or not, you've been forewarned. At a bare minimum, this patent award confirms that the advice isn't 100% disinformation -- it strongly suggests that Google is in fact investing in research and trying hard to improve its ability to differentiate between good sites/documents and fake or poor-quality sites/documents -- even if that requires relying on humans to some limited extent.
It's just a completed patent that covers one more variable (Google Co-op) to throw into the data-mining stew that mechanically cranks out a revised ranking formula on a regular basis.
They could simply put a feedback link next to each result and let USERS evaluate whether the site matched the search term: if you searched for "Lexus reviews," was the link you just clicked on useful? That may be confusing and perhaps intrusive, but who knows... maybe do it only for people who have a Google account, or who agree in advance to "help" Google.
What would be better is this: say a user does a search, clicks on a link, and then clicks back to Google. Maybe they didn't find what they were looking for, so something pops up asking for feedback on why they bounced back to the search results. They should definitely need a Google account to do so.
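As a rough sketch, that bounce-back idea could be as simple as timing the gap between a click and the return to the results page. The 30-second threshold and all of the names below are invented for illustration:

```python
# Sketch of the bounce-back prompt idea: if a signed-in user returns to
# the results page quickly after clicking a result, ask them why.
# The threshold and all names here are assumptions for illustration.
import time

BOUNCE_THRESHOLD_SECONDS = 30  # assumed cutoff for a "quick return"

click_times: dict[tuple[str, str], float] = {}  # (user, url) -> click time

def record_click(user: str, url: str) -> None:
    click_times[(user, url)] = time.time()

def should_prompt_on_return(user: str, url: str, signed_in: bool) -> bool:
    """Return True if we should show the 'why did you come back?' prompt."""
    clicked_at = click_times.get((user, url))
    if clicked_at is None or not signed_in:
        return False
    return time.time() - clicked_at < BOUNCE_THRESHOLD_SECONDS
```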
The question is whether it will affect PageRank only or actual search results as well, and how Google can trust those editors' opinions 100% -- there will always be a subjective human factor.
PageRank itself is based on a subjective human factor: how people link.
If Google were to use paid editors to grade sites for seeding or benchmarking purposes, it could simply hire enough editors to cancel out the effects of individual editors' biases.
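Here's a toy illustration of that averaging argument, under the assumption that individual editors' biases don't all lean the same way; every number below is invented:

```python
# Toy illustration: averaging many independent editor ratings shrinks the
# effect of any one editor's bias. Assumes biases are spread around zero
# rather than all pointing the same direction. All numbers are invented.
import random

random.seed(0)
TRUE_QUALITY = 7.0  # the site's "real" quality on a 0-10 scale (assumed)

def editor_rating() -> float:
    # Each editor is noisy and individually biased by up to +/- 2 points.
    bias = random.uniform(-2.0, 2.0)
    noise = random.gauss(0.0, 0.5)
    return TRUE_QUALITY + bias + noise

for n_editors in (1, 10, 100, 1000):
    avg = sum(editor_rating() for _ in range(n_editors)) / n_editors
    print(f"{n_editors:>4} editors -> average rating {avg:.2f}")
```

With one editor the rating can be off by a couple of points; with hundreds, the average settles close to the underlying quality.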
That's making a big and fallacious assumption: that it's PEOPLE making the links.
I don't think P. Steiner was being literal when he wrote: "On the Internet, nobody knows you're a dog."
A link is not an indicator of what you think of a site's quality. People link a fair bit to businesses that have ripped them off, too...
Sure, and people link to sites that swap links with them, that have bought links from them, that they own, etc. But linking is still a "human factor" that influences PageRank and search results, which is another way of saying that Google has been influenced by human judgment (more specifically, human linking judgment) since the first Google server was housed in Lego blocks.
The use of human editors to rate seed sites, supply positive or negative examples, etc. is an evolutionary step, not something altogether new. And even if it were completely new, so what? The important thing is to have a scalable mechanism for assembling the best possible SERPs from crawled and indexed data.
"Editorial opinion" needn't relate to editors hired directly by google to clean up the serps. It could relate third-party data like inclusion in DMOZ, user reviews on Alexa or some realted sites, known spam host lists etc.
True, but outsourcing "editorial opinion" to third parties would simply perpetuate the problems they've had with PageRank, which can be manipulated by SEOs. And remember, they don't have to rate every site: They just need to rate enough sites to allow "seeding" and profiling.
This is simply a glimpse at the absurdity of the patent system.
I'd love to chat, but I'm off to get the paperwork together for my patent for "A Text-Based Topical Information Exchange System for Web Design and Web Programming Enthusiasts and Professionals" -- this sounds like it could get quite a following.
I've also been toying with a "Chronological Web-Based Information Logging Framework", but I'm not sure if this has mass-market appeal.
Read through the patent if you haven't -- it's a short read, and I see some interesting things in it.
If this is new, it probably points to things to come. A friend I've known for 19 years, who is now a search exec at Yahoo, told me two years ago that something like this would eventually happen. He spoke of using something like Epinions as an additional quality ranking factor on websites and pages.
The things outlined here make sense as a way to combat all the MFA spam going on these days.
On the other hand, if the patent is truly six years old, then I agree that maybe it's just legal CYA.