kaled - 3:27 pm on Dec 19, 2010 (gmt 0)
Having briefly checked the site:.edu cialis search, it seems to me that some of the results are based purely on inbound links, but others are definitely the result of some cloaking hack or other - i.e. a Cialis-promoting page is delivered to Googlebot whilst ordinary users get the standard page.
For such pages, the suggestion that Google should simply delist affected pages is dumb because they are effectively delisted already. They may show up in searches as a result of inbound links, but they will not show up as a result of their content - and I seem to remember hearing something about content being king, and something else about on-page SEO.
In life, you have to look at the alternatives. For Google, they are:
1) Do nothing, let the spammers/hackers play their little game at everyone else's expense.
2) Contact the webmaster if possible.
3) Inform users in the hope that the webmaster will get wind of it eventually.
I slightly favour 2), but with respect to 3) I think it comes down to the wording of the message and maybe the detection method used.
For instance, if Googlebot roams the net incognito in order to check for cloaking (pretty reasonable, I would say), detection should be fairly reliable - false positives should be rare provided Google filters for common spam words as well as major cloaking differences.
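To make the idea concrete, here is a minimal sketch in Python of that kind of check - fetch the same page once announcing itself as Googlebot and once incognito as an ordinary browser, and flag it if spam words appear only in the copy served to the bot. The URL and spam-word list are placeholders, and real cloaking scripts often key off Googlebot's IP addresses rather than the user-agent string, so treat this purely as an illustration.

import urllib.request

SPAM_WORDS = ["cialis", "viagra"]  # hypothetical filter list

def fetch(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace").lower()

def looks_cloaked(url):
    bot_copy = fetch(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")
    user_copy = fetch(url, "Mozilla/5.0")  # the incognito fetch
    # Spam words served to the bot but missing from the user copy are a
    # strong cloaking signal; words present in both are merely spammy.
    return any(w in bot_copy and w not in user_copy for w in SPAM_WORDS)

if looks_cloaked("http://www.example.edu/somepage"):
    print("Probable cloaking hack - worth a manual review")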
Perhaps, if people feel strongly about this, they might offer suggestions as to the wording, etc., of the warning message - Google may well be receptive to constructive criticism.
That said, I think contacting the webmaster via whois data is a better solution - and if the problem has not been rectified within 30 days (or no valid whois information can be found), then inform users of the problem.
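The first step of that is easily scripted, by the way. Here is a rough sketch, assuming a standard Unix whois client is installed and using a placeholder domain, that scrapes likely contact addresses out of whois output and falls back to the user warning when nothing valid turns up. Registrars format their records inconsistently, so real code would need per-registry handling.

import re
import subprocess

def whois_emails(domain):
    out = subprocess.run(["whois", domain], capture_output=True,
                         text=True, timeout=30).stdout
    # Crude pattern - good enough to surface registrant/abuse addresses.
    return sorted(set(re.findall(r"[\w.+-]+@[\w.-]+\.\w+", out)))

emails = whois_emails("example.edu")
if emails:
    print("Possible webmaster contacts:", ", ".join(emails))
else:
    print("No valid whois contact found - fall back to warning users")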