I think it has more to do with Google being a search engine rather than a human-edited index of over three billion URLs.
Even a fairly rudimentary attempt to interpret the meaning of sentences with software would be a rather costly exercise on such a massive scale.
I can recognise such stuff from the URL alone, without even looking at the description; but the description is the real giveaway.
Why do people bother creating such useless cr*p? Clicking the link tries to set multiple cookies and triggers a large number of popups.
Would never work. I don't believe artificial intelligence is advanced enough these days to recognize, say, poetry as legitimate English while rejecting spam paragraphs as gibberish. A filter that blocked Shakespearean sonnets wouldn't be doing students a great service. And that's not even getting into modern poetry.
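To make that point concrete, here's a minimal sketch (purely hypothetical, not anything Google actually does) of a crude "does this read like ordinary English?" heuristic: score a page by the fraction of words that are common function words. A keyword-stuffed spam page scores near zero, but so does a Shakespearean sonnet, which is exactly why a naive filter of this kind can't separate the two.

```python
# Naive "ordinary English prose" score: fraction of words that are common
# function words. This is an illustration of the problem, not a real spam
# classifier; the word list and threshold idea are invented for the example.
import re

FUNCTION_WORDS = {
    "the", "a", "an", "and", "or", "but", "of", "to", "in", "on", "is",
    "are", "was", "were", "that", "this", "with", "for", "as", "it", "by",
}

def prose_score(text: str) -> float:
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in FUNCTION_WORDS)
    return hits / len(words)

spam   = "cheap casino poker free download ringtones mortgage loans"
sonnet = "Shall I compare thee to a summer's day? Thou art more lovely and more temperate"
plain  = "The report is on the desk and it was written by the new intern."

for label, sample in [("spam", spam), ("sonnet", sonnet), ("plain prose", plain)]:
    print(f"{label:12s} {prose_score(sample):.2f}")
```

Run it and the sonnet lands much closer to the spam sample than to plain prose, so any cutoff strict enough to catch the spam also throws out the poetry.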
The differences between auto-generated spam and dadaist literature... there's a master's thesis out there for some enterprising soul... :-D