incrediBILL - 12:45 am on Nov 3, 2012 (gmt 0)
Google tried to explain to them there's no way to do that within the algorithm, but Congress' responses indicated they either didn't believe 'em, didn't get it or didn't care.
I don't believe them; even if you can't figure it out automatically, you can do it manually.
Worst case, all new domains need a manual review to comply with the law, and the next thing you know it becomes paid inclusion. That would actually stop a lot of the nonsense sites from being indexed in the first place, because their operators won't pay hard-earned money for a domain that will most likely be abandoned in very little time.
The problem they face is that even sites approved in a paid inclusion environment can simply change their content after being included, and VOILA! you're indexing bad sites all over again.
What's going to happen is the algo will have to store a profile of each included site, and if the site no longer matches the profile, it will kick it back to a human to review.
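That profile-and-review loop could be sketched like this. This is purely my own illustrative take, not anything Google has described: the word-shingle hashing, the Jaccard drift measure and the 0.6 threshold are all assumptions made up for the example.

```python
import hashlib

def content_profile(text, shingle_size=4):
    """Build a crude profile of a page as a set of hashed word shingles."""
    words = text.lower().split()
    shingles = set()
    for i in range(len(words) - shingle_size + 1):
        shingle = " ".join(words[i:i + shingle_size])
        shingles.add(hashlib.md5(shingle.encode()).hexdigest())
    return shingles

def drift(old_profile, new_profile):
    """Jaccard distance between stored and current profiles (0 = identical, 1 = no overlap)."""
    union = old_profile | new_profile
    if not union:
        return 0.0
    return 1.0 - len(old_profile & new_profile) / len(union)

# Hypothetical cutoff; a real system would tune this per site or per category.
REVIEW_THRESHOLD = 0.6

def needs_human_review(old_profile, new_profile):
    """Flag the site for manual re-review if its content has drifted too far."""
    return drift(old_profile, new_profile) > REVIEW_THRESHOLD
```

A site that keeps its original content scores a drift of 0 and sails through; one that swaps in entirely different content scores near 1 and gets kicked back to a human.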
In the end, thanks to frivolous litigation and bad judgements, we're back to the same situation we had with Yahoo paid inclusion, DMOZ and all the rest. Perhaps that wasn't such a bad thing, because it employed a lot of people manually checking sites, although getting new sites listed could take weeks or months and was quite maddening.
I knew someone at ground zero in the Yahoo Directory, so I just dropped an email, got moved to the top of the heap and got instant approval. Those were the days! Knew someone at DMOZ too! :)
Maybe the simple solution is: don't index anything that doesn't have a valid basic SSL certificate, and put the onus on the SSL certificate sellers to sort it all out with validation and verification.
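A crawler-side gate like that could look something like the following rough Python sketch, which just asks whether a host presents a certificate the standard library's default trust store accepts. The function name and the timeout are my own choices for illustration; this says nothing about how an indexer would actually wire it in.

```python
import socket
import ssl

def has_valid_cert(hostname, port=443, timeout=5):
    """Return True if the host presents a certificate that chains to a
    trusted root and matches the hostname; False on any TLS or network failure."""
    # create_default_context() enables certificate chain and hostname
    # verification against the platform's trust store by default.
    context = ssl.create_default_context()
    try:
        with socket.create_connection((hostname, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=hostname):
                return True
    except (ssl.SSLError, ssl.CertificateError, OSError):
        return False
```

Of course, this only pushes the trust decision onto the certificate authorities, which is exactly the onus the post proposes; a self-signed or expired cert would fail the handshake and the site simply wouldn't get indexed.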