Forum Moderators: Robert Charlton & goodroi
An analogy of a Googly solution is how Google does language translations. The traditional approach called for hiring linguists, compiling translation rules, and designing a translation engine fueled by those rules. A small group of Google engineers took a very different approach and found a better way: Given that Google has access to millions of documents which are each translated into multiple languages, Google goes through those documents in an automated fashion, looks at translation patterns, records them, and applies the right combinations of those patterns to translating new phrases and documents. This approach has led to faster, lower-cost, and higher-quality translations and a much more scalable model than the conventional approach. We need to think differently in the fraud-risk prevention area as well.
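The "mine patterns from parallel documents" idea described above can be sketched in miniature. This is purely illustrative, not Google's system: a toy phrase table built by counting word co-occurrences across a few hypothetical sentence pairs, then reused to translate a new phrase word by word.

```python
from collections import Counter, defaultdict

# Hypothetical parallel corpus: (source, translation) pairs.
parallel = [
    ("the cat", "le chat"),
    ("the dog", "le chien"),
    ("a cat",   "un chat"),
    ("a dog",   "un chien"),
]

def build_table(pairs):
    """Count how often each source word co-occurs with each target word,
    then keep the most frequent target as its 'translation pattern'."""
    counts = defaultdict(Counter)
    for src, tgt in pairs:
        for s in src.split():
            for t in tgt.split():
                counts[s][t] += 1
    return {s: c.most_common(1)[0][0] for s, c in counts.items()}

def translate(phrase, table):
    """Apply the recorded patterns to a new phrase, word by word."""
    return " ".join(table.get(w, w) for w in phrase.split())

table = build_table(parallel)
print(translate("the dog", table))  # -> "le chien"
```

No linguists and no hand-written rules: the "engine" is just counting over aligned data, and it improves simply by feeding it more document pairs, which is what makes the approach scale.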
"The Googly solution" described here is very similiar to black-box and bayesian methods. I remember discussions on WW about possibility of this kind of elements in the algo. For example. someone even suggested this is a reason that using meta keywords might increase your probability to be a spammer.
In my opinion, this job offering description confirms that Google is using these kinds of elements in the algo, and that has serious consequences for SEO.
Maybe this was already known, or maybe Google doesn't care, but I remember how little GoogleGuy would say about the algo, so maybe this job description accidentally says too much?