The excuse we are most often given when pointing out ads or sites that seem to violate the rules is that the ad probably hasn't been reviewed yet.
But is a manual review necessary to catch most of the problems?
Here's a good example I noticed today: affiliate keyword-stuffing in URLs. That is, stuffing keywords in the URL of a direct-affiliate link, creating a display URL that does not actually exist on the site.
How hard would it be to check for a 404 error?
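Not very, presumably. Here's a minimal sketch of the idea: request the display URL and flag the ad if the page doesn't exist. (The URL below is a made-up example; a real reviewer bot would also want retries, a GET fallback for servers that reject HEAD, and so on.)

```python
import urllib.request
import urllib.error

def display_url_exists(display_url: str, timeout: int = 10) -> bool:
    """Return False if the display URL 404s (i.e., the page doesn't exist)."""
    request = urllib.request.Request(display_url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status < 400
    except urllib.error.HTTPError as err:
        return err.code != 404  # a 404 means the keyword-stuffed URL is fake
    except urllib.error.URLError:
        return False  # an unreachable host is at least as suspicious

# Example: a keyword-stuffed display URL that doesn't actually exist
print(display_url_exists("https://example.com/best-cheap-widgets-sale"))
```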
It's unfair to those of us who follow the rules, and it significantly decreases ad quality. (Why? Because no matter how creative your ad is, a scofflaw can go one better by stuffing keywords into the URL.)
Same goes for redirects that are used by some advertisers to unfairly get more than one ad displayed per page, or by affiliates to get their ad displayed when there is already an affiliate ad on the page.
Certainly, this check can be automated.
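For instance, a script could resolve each ad's destination through its redirect chain and flag any page where two ads land on the same final domain. A rough sketch, assuming we already have the competing ads' destination URLs:

```python
import urllib.request
from urllib.parse import urlparse

def final_domain(url: str, timeout: int = 10) -> str:
    """Follow redirects and return the domain the ad actually lands on."""
    with urllib.request.urlopen(url, timeout=timeout) as response:
        return urlparse(response.geturl()).netloc  # geturl() reflects redirects

def duplicate_destinations(ad_urls: list[str]) -> set[str]:
    """Return final domains that more than one ad on the page resolves to."""
    counts: dict[str, int] = {}
    for url in ad_urls:
        domain = final_domain(url)
        counts[domain] = counts.get(domain, 0) + 1
    return {domain for domain, count in counts.items() if count > 1}
```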
What other easily-automated checking could be done, but isn't being done now?
1.) Trademarks. I've successfully registered trademarks with Google, yet, quite often, I find myself reporting others' abuses of my trademark while my own use of it is disapproved. Once trademark protection is approved, it seems rather easy to monitor via a whitelist/blacklist (first sketch below).
2.) Editorial policies. Expanding on the aforementioned issues: craftier "advertisers" seem to find ways around the rules on repetition, use of symbols, superlatives, a working back button, and more, all of which seem quite easy to enforce automatically (second sketch below).
3.) Cloned advertiser sites. Policing "fake affiliate" sites managed by the advertiser itself to dominate select terms. Some clones are amazingly obvious... (third sketch below).
4.) Syndication partner quality. If Google is taking a stab at measuring the relevancy of advertiser landing pages in an automated fashion, surely it can find a way to monitor and remove MFA search syndication partners that egregiously violate the guideline to "provide relevant and substantial content" (fourth sketch below).
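On item 1, a whitelist/blacklist monitor could be as simple as this sketch. The registry format and names are hypothetical, but the check itself is trivial:

```python
def trademark_violations(ad_text: str, advertiser_id: str,
                         trademarks: dict[str, set[str]]) -> list[str]:
    """Flag trademarks used in ad text by advertisers not on the whitelist.

    `trademarks` maps each registered trademark to the set of advertiser
    IDs authorized to use it; everyone else is effectively blacklisted.
    """
    text = ad_text.lower()
    return [mark for mark, allowed in trademarks.items()
            if mark.lower() in text and advertiser_id not in allowed]

# The trademark owner's own ads pass; anyone else's get flagged.
registry = {"AcmeWidgets": {"advertiser-001"}}
print(trademark_violations("Buy AcmeWidgets here!", "advertiser-999", registry))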
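Item 2's rules largely reduce to pattern matching. The patterns below are illustrative guesses at a few policies, not Google's actual rule set:

```python
import re

# Illustrative patterns for a few editorial rules; the real policy set
# would be larger, but each rule here is purely mechanical.
EDITORIAL_RULES = {
    "repetition": re.compile(r"\b(\w+)(\W+\1\b){2,}", re.IGNORECASE),  # same word 3+ times in a row
    "symbols": re.compile(r"[!?$*]{2,}"),                              # runs of gimmicky symbols
    "superlatives": re.compile(r"\b(best|#1|greatest)\b", re.IGNORECASE),
}

def editorial_violations(ad_text: str) -> list[str]:
    """Return the names of the rules the ad text appears to violate."""
    return [name for name, pattern in EDITORIAL_RULES.items()
            if pattern.search(ad_text)]

print(editorial_violations("BEST best best deals!!! Save $$$"))
```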
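For item 3, the "amazingly obvious" clones would fall to plain text similarity between landing pages. A crude sketch using word-shingle overlap; the 0.8 threshold is an arbitrary starting point:

```python
def shingles(text: str, k: int = 5) -> set[tuple[str, ...]]:
    """Break page text into overlapping k-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(page_a: str, page_b: str) -> float:
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(page_a), shingles(page_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def likely_clones(page_a: str, page_b: str, threshold: float = 0.8) -> bool:
    """Two 'different' sites with near-identical text are probably one operation."""
    return similarity(page_a, page_b) >= threshold
```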
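And for item 4, "substantial content" is fuzzier, but even a crude ratio of visible text to ad blocks would catch the egregious MFA pages. This is only a back-of-envelope heuristic, with arbitrary markers and an arbitrary cutoff:

```python
import re

def content_to_ad_ratio(html: str) -> float:
    """Crude substantiality score: words of visible text per ad block.

    Counts occurrences of an ad-serving marker in the page source and
    divides the page's word count by it. Real detection would parse the
    DOM properly; this is only a rough heuristic.
    """
    ad_blocks = len(re.findall(r"adsbygoogle|google_ad_client", html))
    text = re.sub(r"<[^>]+>", " ", html)  # strip tags, roughly
    words = len(text.split())
    return words / ad_blocks if ad_blocks else float("inf")

def looks_like_mfa(html: str, min_ratio: float = 50.0) -> bool:
    """Flag pages averaging under, say, 50 words per ad block."""
    return content_to_ad_ratio(html) < min_ratio
```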