vandread - 7:22 am on Mar 8, 2011 (gmt 0)
I personally think that Google is once again leaving webmasters behind and puzzled.
What I do not understand is the following: They say they cannot reveal information about the update because it would be gamed. But what would be the consequence if everyone knew Google's definition of quality? Right, everyone would modify or create sites based on that standard, at least if they want to rank in Google.
The effect? Improved quality across the board, or at least Google's definition of it.
The problem? People would say that Google's definition of quality is not the holy grail. Worse, we would be exactly where we began: content farms rank for many terms, scrapers outrank the sites where the content was originally published, and autoblogs flourish.
The update does not appear to be about "content" quality, where content is the text or other information that the visitor is searching for. The quality score that they have added to the algorithm seems to be about unrelated elements: thin content pages written 10 years ago can have an influence on the rankings of a page that's the best there is on a topic. Many say that too many ads, ad placement, and god knows what else can have an influence on the rankings. All of those elements have nothing to do with the key question: is the page helping the visitor, or is it not?
When I search or research, I want to find answers. I do not care if a page has ten ads on it, if other parts of the site are sub-par, or if the design looks like it was last updated in the '90s. If the information is there, I can live with that.
Now that the algorithm has taken its toll, I sometimes see sites at the top that look superb, with few ads, but that unfortunately do not give me the answers I'm looking for. I'm not saying it is always the case, but it happens more often than before.
And the consequence is that I have to spend more time searching for the answer.