IanKelley - 4:27 pm on Oct 1, 2010 (gmt 0)
Of course that's the crux of the matter: how much Google is responsible for its search suggestions. It's somewhere between not at all and completely. Google is normally pretty Teflon about taking any responsibility for anything it does or any impact it has.
That may be the crux of the matter to you personally, because it's something you want to see.
But it sure looks like the actual crux, in this particular conviction, is that a web entity was penalized for something an automated process did, even though it happened to be 100% correct.
What was really on trial here, even if the court was too ignorant to realize it, was the ability to use an automated algo to suggest things a user may want to try.
As I mentioned earlier, there's nothing in Google's suggestion engine that isn't a relatively straightforward solution to the problem. Similar code is going to exist in the suggestion feature at any large search engine.
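To show what I mean by straightforward, here's a toy sketch in Python (my own illustration, not Google's actual code; the query log and every name in it are invented). It just tallies what people have actually typed and offers the most popular completions for a prefix:

from collections import Counter

# Hypothetical query log. A real engine would draw on billions of user
# queries; this data is invented purely for illustration.
QUERY_LOG = [
    "acme corp",
    "acme corp reviews",
    "acme corp scam",
    "acme corp scam",
    "acme corp careers",
]

def suggest(prefix, counts, limit=5):
    """Return the most popular past queries starting with the prefix."""
    matches = [(q, n) for q, n in counts.items() if q.startswith(prefix)]
    # Rank by popularity, break ties alphabetically.
    matches.sort(key=lambda item: (-item[1], item[0]))
    return [q for q, _ in matches[:limit]]

counts = Counter(QUERY_LOG)
print(suggest("acme corp", counts))
# ['acme corp scam', 'acme corp', 'acme corp careers', 'acme corp reviews']

Notice there's no opinion anywhere in there. If "scam" comes up first, it's only because that's what users typed most often. That's the neutrality the court ignored.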
And what's wrong with it? It's a logical feature that has value to users. People understand it's not stating a fact. The few who are so completely clueless about technology that they take an alternate search suggestion as a statement of fact are certainly not a good reason to take a valuable feature away from everyone else.
I highly doubt anyone would be screaming defamation if the same algo were running on any other site.
If you can be sued for defamation over something done by a neutral piece of code, written with no bias whatsoever, then where does that end? Exactly what kind of website can still afford to exist?
The end result of these kinds of cases, were they to become law, is always that a website needs an army of human editors to police its algos, defeating the purpose of automation and making entry into pretty much any web market impossible for a startup.
Fortunately this is only France, where the courts would put the whole internet in timeout if they could :-)