Forum Moderators: Robert Charlton & goodroi
Legally, I cannot imagine that Google is violating any law, but it is absolutely ridiculous. If the goal of Google is to give the user a great search experience, shouldn't the user expect to get the "Coca-Cola" website when they search for it?
I sell products and services on this particular site, and the name is a legally registered trademark with the USPTO. I have a loyal following and get a lot of searches for my brand name. If Google wants users to have a good experience, they should ensure that the domain people are looking for appears at the top of the results.
Vent -> I am really sick of the power Google holds over webmasters — everything from design decisions to countless hours wasted on issues like this. I hope MSN kicks their butt in the next couple of years. And I believe they will. They always win!
[edited by: Kangol at 9:45 pm (utc) on Feb. 7, 2005]
- Why does Google always make these updates "live" instead of running them in the background and only showing them to surfers once they are finished? It is not really nice for the normal surfer when they cannot find what they are looking for! The results are crazy. In my business we have different categories, for example: blue widgets, hot widgets, no widgets. Normally, if you enter hot widgets in Google, the no and the blue widgets are never listed. But now if you enter hot widgets, you see hot, blue, and no widgets on the first page of the SERPs.
I don't know why it is so difficult for Google to run their updates in the background instead of showing such bad SERPs to the surfers....
Probably because there is no way to do a small run that would be representative of the full population, where the "population" is the billions of web pages that Google has. So anything they try in a microcosm (in the background), would have very little chance of scaling up to the real search universe because the keyword statistics of the microcosm population would have very little resemblance to the real search universe.
So instead, they probably run multiple tests in parallel, with different versions of the algorithm on different data centers, analyze what works best, cross-populate that to the other servers, and keep iterating until they get something they "like".
The behavior being seen is very much like that of a genetic algorithm run. If you want more info, just punch "genetic algorithms" into Google and start reading.
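For anyone curious what that loop would look like, here is a minimal Python sketch of a genetic algorithm — purely illustrative, with a made-up fitness function and feature weights standing in for whatever Google actually measures. None of the names here come from Google; it just shows the mutate/crossover/select cycle described above:

```python
import random

random.seed(42)  # reproducible toy run

def fitness(weights):
    # Toy stand-in for "user satisfaction" with a ranking built from
    # these feature weights; the real objective is unknown to us.
    target = [0.5, 0.3, 0.2]  # hypothetical "ideal" weighting
    return -sum((w - t) ** 2 for w, t in zip(weights, target))

def mutate(weights, rate=0.1):
    # Perturb each weight slightly, like trying a tweaked algorithm variant.
    return [w + random.uniform(-rate, rate) for w in weights]

def crossover(a, b):
    # Mix two parent variants -- the "cross-populate between data centers" step.
    return [random.choice(pair) for pair in zip(a, b)]

def evolve(pop_size=20, generations=50):
    # Each individual is one candidate version of the ranking algorithm,
    # as if each data center ran a different variant in parallel.
    population = [[random.random() for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]   # keep what "works best"
        children = [
            mutate(crossover(random.choice(survivors), random.choice(survivors)))
            for _ in range(pop_size - len(survivors))
        ]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
```

In this toy version the population converges toward the target weights after a few dozen generations; the point is only that "test variants in parallel, keep the winners, recombine, repeat" is exactly the GA recipe.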
Wild speculation on my part of course.