There seems to be just a little that we can glean from the piece - clearly meant for a mass audience. This bit caught my eye:
|He then unveiled his team’s solution: a mathematical model that tries to determine when users want new information and when they don’t. (And yes, like all Google initiatives, it had a name: QDF, for “query deserves freshness.”) |
...The QDF solution revolves around determining whether a topic is “hot.” If news sites or blog posts are actively writing about a topic, the model figures that it is one for which users are more likely to want current information. The model also examines Google’s own stream of billions of search queries, which Mr. Singhal believes is an even better monitor of global enthusiasm about a particular subject.
|The model also examines Google’s own stream of billions of search queries, which Mr. Singhal believes is an even better monitor of global enthusiasm about a particular subject. |
That is what John Battelle calls "the database of intentions."
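Just for fun, here's a back-of-envelope sketch of how a "query deserves freshness" signal like the one quoted above might work. To be clear, this is pure guesswork on my part - the feature names, weights, and threshold are all made up, and Google's real QDF model is not public. The idea is just: compare today's publishing activity and query volume for a topic against their long-run baselines, and call the topic "hot" when the spike is big enough.

```python
from dataclasses import dataclass

@dataclass
class TopicSignals:
    # Hypothetical inputs - the real QDF features are not public.
    news_mentions_today: int       # articles/blog posts about the topic today
    news_mentions_baseline: float  # long-run daily average of mentions
    queries_today: int             # searches for the topic today
    queries_baseline: float        # long-run daily average of searches

def freshness_score(s: TopicSignals) -> float:
    """Toy freshness signal: how far current activity exceeds its
    historical baseline, on both the publishing and query sides."""
    news_spike = s.news_mentions_today / max(s.news_mentions_baseline, 1.0)
    query_spike = s.queries_today / max(s.queries_baseline, 1.0)
    # Weight the query stream more heavily, echoing Singhal's remark
    # that it is the better monitor of global enthusiasm.
    return 0.4 * news_spike + 0.6 * query_spike

def deserves_freshness(s: TopicSignals, threshold: float = 3.0) -> bool:
    """A topic 'deserves freshness' when activity is well above baseline."""
    return freshness_score(s) >= threshold

# A breaking topic vs. a quiet evergreen topic:
hot = TopicSignals(news_mentions_today=120, news_mentions_baseline=10.0,
                   queries_today=50000, queries_baseline=4000.0)
quiet = TopicSignals(news_mentions_today=8, news_mentions_baseline=10.0,
                     queries_today=3900, queries_baseline=4000.0)
print(deserves_freshness(hot))    # True
print(deserves_freshness(quiet))  # False
```

Obviously the real thing would be vastly more sophisticated, but the quoted description really does boil down to "is this topic spiking right now?"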
Geez, that's just great, eh? Even more about Br*tney and P*ris on the internet. How could we get by without G giving us up-to-date results on the latest direction the lemmings are running in?
The sooner G gets shifted out of its number one spot, the better.
Matt Cutts has blogged about the article now. He says he likes it, and he highlights 5 points that he feels are useful communications, possibly new for some people -- including this one below (as if anyone here had any doubts about how often something changes in the Google SERPs):
|Google makes a go/no-go decision on several different quality changes each week. |
This is my favourite bit:
|Mr. Singhal often doesn’t rush to fix everything he hears about, because each change can affect the rankings of many sites. “You can’t just react on the first complaint,” he says. “You let things simmer.” |
With all of the acquisitions [webmasterworld.com ] and all the news around G, it certainly shows they are paying a great deal of attention to their core competency of search.
Matt's blog, referencing the article, says G
|makes about a half-dozen major and minor changes a week |
That is a first.
For the virtual fly-on-the-wall reporter Saul Hansell, what a great opportunity. The story may be for the masses, but parts of it are certainly worth sending to a client or two who just don't get it!
Great article. I always wondered about the "hot" filter showing results from blogs and news stories so fast.
Doesn't anyone else find this really "interesting"?
|But last year, Mr. Singhal started to worry that Google's balance was off. When the company introduced its new stock quotation service, a search for "Google Finance" couldn't find it. After monitoring similar problems, he assembled a team of three engineers to figure out what to do about them. |
Basically says flat out that they give their own sites preferential treatment. Not like I didn't suspect that, but I don't know if they've said it publicly before.
Er, no, it doesn't.
It even says they looked at 'similar problems'; the problem wasn't that their site didn't beat the competition - the problem was, it didn't figure at all.
Whoever owned the site, that's a search problem.
But I guess it's all down to how you choose to interpret, huh? ;)
|But I guess it's all down to how you choose to interpret, huh? |
I guess you're right. "find it" can be taken a few different ways. I read that to mean placing well in the results...it's really not clear.
But still, the tail is wagging the dog. When it's one of our sites that is not showing up, Google doesn't change their algo to make sure we can be found. If the end result of the algo change is better for everyone...great...but it's a slippery slope...there is a clear conflict of interest when they use one of their own sites as a search quality indicator.
|But still, the tail is wagging the dog. When it's one of our sites that is not showing up, Google doesn't change their algo to make sure we can be found. |
How do you know that? Chances are, Google has any number of benchmark sites or pages to use as QC checks. (We know that Google hires "quality evaluators," for example.)
|If the end result of the algo change is better for everyone...great...but it's a slippery slope...there is a clear conflict of interest when they use one of their own sites as a search quality indicator. |
Not really. And even if it were, so what? Google Search is Google's search engine, not a public utility. One would expect Google properties to rank at the top for relevant searches--although they don't, to judge from the fact that Google isn't even in the top 10 results for "search engine" and is outranked by Yahoo for "search." :-)
Plus the same article makes clear that they do look at all 'fail search' reports ... not just their own.
I thought it was an interesting example, and quite funny, that not only does one of their sites fail to show and play a part (along with others, please note) in triggering an algo tweak ... but they tell the NYT about it.
Would you really be happier if they left the site out in the cold?
Google keeps tweaking again and again... so the question is: how do you stay stable during this storm?
I think the saying is 'don't push your luck' or maybe 'don't walk too close to the cliff edge'
We are in an age of constant change - and that's one thing that will not change!
But the vast majority of Google's tweaks are very minor; if you follow their guidelines, and don't spend half your life trying to see 'what you can get away with', then you won't need to spend the other half looking over your shoulder.
Don't worry about Google; build the best you can for your target audience, without looking to cheat anyone, and you'll do OK. Read the guidelines carefully, and you can do a little better ;)