MikeNoLastName - 11:44 pm on Jun 6, 2005 (gmt 0)
What UK_Web_Guy says aligns with my opinion. As any AI language programmer knows, if they already had a firm enough grip on what the algorithm needed to do to weed out these sites, and enough of one to write this general document, then they would have implemented it already and used humans only to catch the SPECIFIC urls it missed or accidentally hit, i.e., to refine it. Perhaps this human-created whitelist/blacklist is the extra .5 that GG was talking about. If so, I would have expected them to put it in much sooner.
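Just to illustrate what I mean by humans refining the algorithm rather than replacing it, here is a rough Python sketch. Everything in it (the scoring heuristic, the list names, the 0.5 threshold) is made up by me for illustration, not anything Google has described: the algorithm does the bulk of the filtering, and the human whitelist/blacklist only overrides specific hosts it got wrong.

    # Hypothetical sketch: automated scoring plus human-curated overrides.
    def algorithmic_spam_score(url: str) -> float:
        # Placeholder heuristic: hyphen-stuffed hostnames score worse.
        host = url.split("//")[-1].split("/")[0]
        return min(1.0, host.count("-") * 0.2)

    # Human-curated overrides for specific hosts the algorithm missed or wrongly hit.
    WHITELIST = {"example-good-site.com"}   # hypothetical entry
    BLACKLIST = {"spammy-example.com"}      # hypothetical entry

    def is_spam(url: str) -> bool:
        host = url.split("//")[-1].split("/")[0]
        if host in WHITELIST:   # reviewer says: never filter this one
            return False
        if host in BLACKLIST:   # reviewer says: always filter this one
            return True
        return algorithmic_spam_score(url) >= 0.5   # otherwise trust the algorithm

In a setup like that, the humans only touch the two small lists; the general document would still describe what the automated scoring is supposed to catch.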