TheMadScientist - 2:39 am on Dec 15, 2012 (gmt 0)
they've also assembled a fairly massive list of webmasters, linked those webmasters to their attributable output via GWMT, crawling tools, and manual sleuthing, assigned ranking scores (+ve and -ve), then applied this to the output of the other algos
Why would they do that?
^ That is a serious question, btw, because I don't understand the reasoning behind it. 'Who did what' isn't what keeps them in business; satisfying visitors' queries is.
Their job is to present their visitors with results that satisfy the visitor's query, so why would they care who built a site any more than they care who the original author of the content is? After all, they don't simply go by discovery date to determine authorship, and they don't give authors a way to submit content or 'new original content URLs' via WMT and then send the bot out to 'discover' the newly posted content...
Even more to the point:
Why would they need to do that?
I think it's much more likely that the group of sites has something in common that doesn't line up with what the algo is looking for than that anyone at Google cares who actually built the sites, let alone wrote an algo to 'do anything' because of it.
If the same person built or worked on all the sites, and that person is missing some of the optimization points we're talking about in this thread, then it's entirely possible the group of sites is not ranking well because of the person who worked on them. But Google doesn't need a special algo, or to figure out 'who done it', for that; it takes care of itself. It could definitely look like they're 'out to get you', though, if you're missing some of the new keys to optimization and don't know what those keys are, or don't understand them well enough to implement them for some reason...