# Sites with a large number of outbound links in a list format (e.g. directories and scrapers) - I would have thought this includes links going through a redirect/CGI bin - G must be smart enough to work that out.
# Sites with content virtually identical to another site - e.g. datafeed sites with virtually no unique content, or newsgroup archives with no unique content (very, very thin pages)
# ODP clones
OK - some side effects of the above: normal directories will get hit (even ones with unique user-submitted listings - the submitting user probably does not vary the text too much between directories), as well as sites whose page content is largely a long list of seemingly outbound links.
Dayo_UK - this is the most logical theory I've seen put forth in this discussion.
Let me put a second vote in for the following:
# Large number of scraped/duplicate outbound links - including links being redirected or opened up in frames
# Sites with content virtually identical to another site
# Sites with little modified content on a consistent template, only switching out keywords
Let me say that the following are NOT the problem:
# Purely directory sites (do a Google search and find millions of directories still online)
# Reciprocal-link directories (again, google it)
If reciprocal directories and link pages were being kicked off... practically the entire internet would have disappeared.
Does anyone have additions to, or problems with this list?
What are the pain thresholds for banning one site with these parameters, but not another?