...we used our standard evaluation system that we've developed, where we basically sent out documents to outside testers. Then we asked the raters questions like: "Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?"
There was an engineer who came up with a rigorous set of questions, everything from: "Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?"
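Just to make the mechanics concrete (a toy Python sketch of my own, not anything Google has published): picture each rater filling in yes/no answers to questions like those, with the answers averaged into a per-site score that a classifier could later be trained against. The question weighting and scoring here are invented purely for illustration.

# Hypothetical sketch: turn yes/no rater answers into a per-site quality score.
QUESTIONS = [
    "Would you be comfortable giving this site your credit card?",
    "Would you trust medicine prescribed by this site for your kids?",
    "Do you consider this site to be authoritative?",
    "Would it be okay if this was in a magazine?",
    "Does this site have excessive ads?",  # a "yes" here counts against the site
]
NEGATIVE = {4}  # indices where a "yes" lowers quality

def quality_score(answer_sheets):
    # answer_sheets: one list of booleans per rater, one boolean per question.
    # Returns a 0..1 score averaged across raters.
    total = 0.0
    for answers in answer_sheets:
        good = sum((not a) if i in NEGATIVE else a
                   for i, a in enumerate(answers))
        total += good / len(QUESTIONS)
    return total / len(answer_sheets)

# Three raters assessing one site:
print(quality_score([
    [True, True, True, True, False],    # strong site, no ad clutter
    [True, False, True, True, False],
    [False, False, False, False, True], # one skeptical rater
]))  # -> 0.6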
Did you see how Matt Cutts responded to Wired about Suite101? Pure arrogance.
I don't see why all these people think they were entitled to high rankings in Google. To me, many of them seem to be ranking about WHERE THEY SHOULD HAVE BEEN RANKING ALL ALONG.
I think you look for signals that recreate that same intuition, that same experience that you have as an engineer and that users have. Whenever we looked at the most blocked sites, it did match our intuition and experience, but the key is, you also have your experience of the sorts of sites that are going to be adding value for users versus not adding value for users. And we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons …
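For what it's worth, the "this side / that side" talk maps straight onto a bog-standard linear classifier. Here's a toy Python sketch (needs scikit-learn); the features, numbers, and labels are completely made up by me, since the real signals are anyone's guess:

# Toy sketch of separating high-quality from low-quality sites.
# Feature names and values are invented; Google's actual signals are unknown.
from sklearn.linear_model import LogisticRegression

# Per site: [avg words per page, ad-to-content ratio, rater score]
X = [
    [1200, 0.05, 0.90],  # irs.gov-like: long pages, few ads, trusted
    [900,  0.02, 0.95],  # wikipedia-like
    [1100, 0.15, 0.85],  # nytimes-like
    [250,  0.60, 0.20],  # thin article farm
    [180,  0.70, 0.10],  # scraper site
    [300,  0.55, 0.30],  # shallow how-to mill
]
y = [1, 1, 1, 0, 0, 0]   # 1 = high quality, 0 = low quality

clf = LogisticRegression(max_iter=1000).fit(X, y)

# The "mathematical reasons" boil down to a learned separating boundary:
print(clf.coef_, clf.intercept_)
print(clf.predict([[400, 0.5, 0.4]]))  # which side does a borderline site land on?

The point isn't the specific model; it's that once you have rater-labeled examples at the two extremes, drawing a boundary between them is routine.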
My current take is that we have a new algo that IS misfiring in many cases - more than I've seen in any update I can remember. Google apparently knows that, too. They are asking for examples of false positives [webmasterworld.com] - a rather public admission that the new algo has problems.
Most of the "articles" on Britannica are only 2 or 3 sentences surrounded by ads (this is because you have to subscribe to get the full article). If this update was about thin content or too much ad real estate, then I would have expected Britannica to fall in the rankings.
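And if it really were just thin content plus ad load, even a crude filter like this Python sketch would flag those stubs (the thresholds are mine, purely illustrative), so something else must be protecting them:

# Crude "thin content" check; thresholds invented for illustration only.
def looks_thin(word_count, ad_blocks, content_blocks):
    ad_ratio = ad_blocks / max(ad_blocks + content_blocks, 1)
    return word_count < 150 or ad_ratio > 0.5

# A 3-sentence stub surrounded by ads trips it immediately:
print(looks_thin(word_count=60, ad_blocks=6, content_blocks=2))  # True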
But if I ran Google I would have an exceptions file which would contain Britannica & NYT and others. There are some sites that you simply have to have in your SERPs.
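Mechanically that's trivial: an exceptions file is just a whitelist consulted after the classifier runs. A minimal Python sketch (the domain list, file format, and score floor are my own assumptions, not anything Google has confirmed):

# Hypothetical exceptions list checked before any demotion is applied.
PROTECTED = {"britannica.com", "nytimes.com"}  # would live in a config file

def final_quality(domain, classifier_score):
    if domain in PROTECTED:
        return max(classifier_score, 0.9)  # never let these fall below a floor
    return classifier_score

print(final_quality("britannica.com", 0.3))          # -> 0.9
print(final_quality("somearticlefarm.example", 0.3)) # -> 0.3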
So rather than a failure or a mere PR stunt …
Yep, it's coming. Google called it "a new layer".
As for the New York Times, why is that news outlet necessary? There are a great many people who, for political reasons, refuse to read it, and there are news outlets of better quality.
I wish the rules here allowed me to walk you through some of the results I'm tracking across different niches. I don't think you'd be so flippant with your remarks.
Is it okay for the same company to buy and use domain names with multiple extensions seemingly to monopolize the first page of Google search results?
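If Google wanted to curb that tactic, one obvious mechanism is capping results that share the same name across extensions. A deliberately naive Python sketch; real suffix handling needs the public-suffix list (this breaks on .co.uk and friends), and whether Google actually groups across TLDs is unknown:

# Speculative sketch: stop one name from occupying many result slots
# across .com/.net/.org etc. Suffix parsing is naive on purpose.
from collections import defaultdict

def cap_same_name(result_hosts, per_name=1):
    seen = defaultdict(int)
    kept = []
    for host in result_hosts:
        name = host.split(".")[-2]  # "widgets" in "widgets.org"
        if seen[name] < per_name:
            seen[name] += 1
            kept.append(host)
    return kept

print(cap_same_name(
    ["widgets.com", "widgets.net", "widgets.org", "othersite.com"]
))  # -> ['widgets.com', 'othersite.com']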
Gooooooglebot is going nuts on my site right now. Absolutely nuts; it might get 100% of the pages today if the trend continues (60% so far). <Fiction> Since it was suggested here that Google now classifies page quality (in addition to page relevance), the bot was sent out again for this important mission :)