|Google Research Director Peter Norvig - on Being Wrong|
Slate has published an interview with Google's Peter Norvig that goes a bit beyond the usual fluff piece that often gets churned out. Understanding what goes on at Google right now can be important for today's web marketer. Norvig shared a personal anecdote about something that happens to me all the time, both about the web as a whole and specifically about Google:
|I was in a meeting a while ago and somebody was discussing a new project—this was in an area I hadn't touched for a while—and I said "Oh, isn't it the case that such and such?" And they kind of snorted derisively and said, "Yeah, well, that's the way the Web was four years ago, but that approach doesn't work anymore." I think that's happening constantly. |
There's a lot here - for example, how Google plans for both hardware and software failures in the process of returning a set of search results. A hardware failure somewhere in the chain might change position #10, but they try very hard not to let it affect position #1.
The Wrong Stuff [slate.com]
Here is an interesting quote from the interview:
|It sounds like page rank uses consensus as a stand-in for credibility. That slippage is hardly unique to Google-all of us use consensus as a stand-in for credibility sometimes-but it can be pretty misleading. |
Yeah, that's always a problem. One way we try to counter that is diversity. We haven't figured out any way to get around majority rules, so we want to show the most popular result first, but then after that, for the second one, you don't want something that's almost the same as the first. You prefer some diversity, so there's where minority views start coming in.
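Norvig's description maps loosely onto what the information-retrieval literature calls maximal marginal relevance (MMR): the first slot goes to the top-scoring result, and later slots trade raw score off against similarity to what's already shown. Here's a minimal sketch - the popularity scores, the Jaccard similarity on titles, and the `lam` weight are all illustrative assumptions, not anything Google has disclosed:

```python
# Greedy diversity re-ranker: pick the most popular result first, then
# penalize candidates that look too much like what's already shown.
# An MMR-style sketch, NOT Google's actual algorithm.

def jaccard(a, b):
    """Crude similarity between two results: word overlap of their titles."""
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / len(wa | wb)

def diversify(results, lam=0.4):
    """results: list of (title, popularity) tuples. Slot 1 is pure
    popularity; later slots balance popularity against redundancy."""
    remaining = sorted(results, key=lambda r: r[1], reverse=True)
    ranked = [remaining.pop(0)]          # slot 1: majority rules
    while remaining:
        def mmr(r):
            redundancy = max(jaccard(r[0], s[0]) for s in ranked)
            return lam * r[1] - (1 - lam) * redundancy
        best = max(remaining, key=mmr)
        remaining.remove(best)
        ranked.append(best)
    return ranked

results = [
    ("blue widget store buy widgets", 0.9),
    ("blue widget store cheap widgets", 0.8),  # near-duplicate of #1
    ("history of the blue widget", 0.3),       # the lone informational site
]
# The near-duplicate drops below the less popular but diverse result.
print([title for title, _ in diversify(results)])
```

This is exactly the mechanism that would let a "came in from another planet" minority page beat a near-clone of the #1 result for slot 2.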
So if your content is similar to that of the leading site in your niche, it won't necessarily earn you high rankings - the diversity preference may actually work against you.
The point I was going to make from that same quote is this wording:
|...so we want to show the most popular result first... |
Interesting. Not the "correct" result. Not the "most relevant" result. Not the "best" result.
The "most popular" result.
I often analyze search results and see one in there that seems like it came in from another planet - 100x fewer backlinks, the single informational site in the middle of 9 product pitches, etc. IMO, that's a "QDD" (Query Deserves Diversity) flag at work for the query term.
|I often analyze search results and see one in there that seems like it came in from another planet - |
I agree with that completely!
Have you ever searched for a keyword or two and found a site in the top ten that had the keyword(s) in the domain name and NO content (except for the "This Site Under Construction" graphic)?
I've come across that a couple of times.
I have also come across a couple of sites in the top ten for keywords that had NO SEO whatsoever. They used frames for navigation, and the page title still contained the default name of the content management system. This was for a keyword1 keyword2 city type of search, and admittedly the site did fit that criterion, but you would never know it from the on-page content.
|Interesting. Not the "correct" result. Not the "most relevant" result. Not the "best" result. |
The "most popular" result.
Seems like a bit of a chicken-and-egg situation. Surely the "most popular" result is going to be the one at the top of the SERPs - that's the one that's going to get all the traffic.
How do you become more popular? By getting to the top of the SERPs. And how do you get to the top of the SERPs? By becoming more popular.
Yes, that is the question: how do they determine the most popular result?
Is CTR plus a low bounce rate from:
Previous search impressions
G Webmaster Central, Analytics, etc.
also coming into play right now, or are other parameters considered?
|how do they determine what is the most popular result? |
To keep the thread simple: it's Google, and they have an armada of web beacons reporting on what's being visited and by whom. Chrome, toolbars, AdSense adverts, and other Google products all record your behavior. An increase in beacons reporting on page A means page A must be trending towards popularity. SEOmoz did an excellent writeup on Google's data gathering here - [seomoz.org...]
Thanks for the link. I knew that G collects a lot, but this much is unprecedented and very disturbing.
Going back to the discussion:
Can we say that G is adding more weight to "the popularity parameter," and that this is the reason why less popular (less visited) pages from high-ranking, content-rich authority websites are being dropped from search results?
Is "the most popular" page from the G point of view the most visited page, where visitors stay long time?
In context, it seems to me that the "most popular" page means the one with the strongest citations or links.
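If "most popular" really does mean strongest citations or links, the classic mechanism is PageRank: a page's score is roughly the chance a random surfer lands on it by following links. A toy power-iteration version - the link graph is invented, and this ignores everything Google has layered on top since the original paper:

```python
# Minimal PageRank by power iteration on a toy link graph -- a sketch of
# "popularity via links/citations", not Google's production system.

def pagerank(links, d=0.85, iters=50):
    """links: {page: [pages it links to]}; returns {page: score}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            if not outs:                       # dangling page: spread evenly
                for q in pages:
                    new[q] += d * rank[p] / n
            else:                              # split p's vote among outlinks
                for q in outs:
                    new[q] += d * rank[p] / len(outs)
        rank = new
    return rank

graph = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],   # D links out but nothing links to D
}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # C gathers the most link "votes"
```

Note that D, which nobody links to, ends up near the teleport floor no matter how good its content is - which is the "consensus as a stand-in for credibility" problem in miniature.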
One thing that struck me was that this interview confirms our feeling of being Google's laboratory subjects. Sometimes it feels like they've made a mistake in our case - and sometimes they have! Errors like that are all part of their vision.