tedster - 3:16 am on Feb 18, 2013 (gmt 0)
tedster - why should a machine learning algo demote all the formerly really good sites to page 50+, even if they are white hat and have very good user experience metrics ...?
Because it is a "learn-ing" algo and not a "learn-ed" algo, I'd say. It will be very much different in another two years or so.
Google would have already pointed out that they have now changed their system to a machine learning algo!
If you read their comments in a certain way, Amit Singhal and Matt Cutts have been dropping big clues about this for well over a year. They certainly don't owe us a clear look at their algorithm. I'm just happy they say anything at all - there could be no communication whatsoever!
And even a machine learning algo needs some human direction and input?!
Certainly. For one thing, humans decide what it needs to learn. And then humans create a seed set. And finally, humans measure the level of success or failure. A lot of that goes on even before a new module of the algorithm goes live.
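To make that concrete, here is a minimal sketch of that human-in-the-loop workflow in Python: humans hand-label a seed set, the machine fits a model, and humans measure success on held-out examples before anything goes live. The feature names and labels here are purely my own illustration, not anything Google has confirmed.

    # Minimal sketch of a human-labeled seed set driving a learning algo.
    # The signals (content_depth, ad_density, layout_score) are hypothetical.
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # 1. Humans create the seed set: each page gets a few signal values
    #    and a hand-assigned quality label (1 = good, 0 = poor).
    seed_features = [
        [0.9, 0.1, 0.8],
        [0.8, 0.2, 0.9],
        [0.2, 0.7, 0.3],
        [0.1, 0.9, 0.2],
        [0.7, 0.3, 0.6],
        [0.3, 0.8, 0.4],
        [0.85, 0.15, 0.75],
        [0.25, 0.6, 0.35],
    ]
    human_labels = [1, 1, 0, 0, 1, 0, 1, 0]

    # 2. The machine learns a model from the human-labeled seed set.
    train_X, test_X, train_y, test_y = train_test_split(
        seed_features, human_labels,
        test_size=0.25, stratify=human_labels, random_state=42
    )
    model = LogisticRegression()
    model.fit(train_X, train_y)

    # 3. Humans measure the level of success or failure on held-out pages.
    predictions = model.predict(test_X)
    print("held-out accuracy:", accuracy_score(test_y, predictions))

The point of the sketch is simply that the quality of the result depends heavily on what humans chose to label and which signals they fed in, which is why the seed set matters so much.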
All of this means we are no longer able to isolate specific factors for the algorithm changes very well. However, through this thread we can still tell WHEN an update has happened. It's just that we're better off looking for the target ideas behind the seed sets - what kind of things Google is trying to reward in the sites that rank better after an update. Of course, as long as the machine learning is in a primitive state, that can be very difficult.