tedster - 4:16 am on Dec 19, 2010 (gmt 0)
The footprint of a parasite hosting hack isn't all that inscrutable for Google since they process such a large number of web pages. For one, the same content and links often get injected on many sites, not just one or two.
It's also common for the content served to a googlebot user-agent to differ from the content served to a regular browser. And Google has been known to check up on web pages through methods that don't use googlebot.
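That kind of cloaking check is something you can sketch yourself. The snippet below is a minimal, hypothetical illustration (not Google's actual method): in practice you'd fetch the same URL twice with different User-Agent headers, then flag the page when the two responses differ sharply. Here the two responses are passed in as strings, and the 0.7 similarity threshold is an arbitrary assumption.

```python
from difflib import SequenceMatcher

# Illustrative user-agent strings; a real check would send these as
# request headers when fetching the page twice.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"

def similarity(a, b):
    # Ratio of matching characters between the two HTML responses (0.0-1.0).
    return SequenceMatcher(None, a, b).ratio()

def looks_cloaked(html_for_bot, html_for_browser, threshold=0.7):
    # Flag the page as possibly cloaked when the version served to the
    # bot and the version served to the browser are too dissimilar.
    return similarity(html_for_bot, html_for_browser) < threshold

# Toy responses: a clean page vs. one stuffed with injected spam links.
clean = "<html><body>Campus news and events</body></html>"
injected = ("<html><body>"
            "<a href='http://example-pills.invalid'>buy cialis online</a> "
            "<a href='http://example-pills.invalid'>cheap viagra</a>"
            "</body></html>")

print(looks_cloaked(clean, clean))       # identical responses
print(looks_cloaked(injected, clean))    # heavily injected response
```

A real crawler-side check would be fuzzier than this (templates, ads, and timestamps legitimately vary between fetches), which is why a similarity threshold rather than exact comparison is the sensible starting point.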
mareng gave a good clue in the above post: "A site:.edu search for cialis gives a good view of the scope of the problem."
Indeed it does - over 4 million results, just for that one example! For some terms you can easily uncover entire pages of content that are orphaned from the regular website but backlinked from various link pyramids and rings. Sometimes you can even find an entire directory of parasite content.
This is not some small concern for Google to wave away. The web has an epidemic.