If they're "good sites," it's possible that there are enough mitigating factors to outweigh the keyword stuffing. A site that passes the sniff test in other respects may be viewed more charitably than, say, a made-for-AdSense scraper site or a thin-affiliate site with 100,000 computer-generated pages and 10,000 reciprocal links from sites of no intrinsic value. Why? Because in "grey area" situations (such as the number of times a keyword is used on a page), it makes sense to look at the overall picture. That's what a human reviewer would do, and if a search engine's algorithm can replicate that kind of judgment through the use of different measurement factors and statistical probability, then good for the search engine.
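
To make the "overall picture" idea concrete, here's a toy sketch of how several signals might be weighed together instead of judging keyword density in isolation. This is purely illustrative; the signal names, weights, and threshold are invented for the example and have nothing to do with how any real search engine actually scores pages.

```python
# Hypothetical illustration only -- not any real engine's algorithm.
# The point: one "grey area" signal (keyword density) is judged in the
# context of other quality signals rather than on its own.

def spam_likelihood(signals: dict[str, float], weights: dict[str, float]) -> float:
    """Combine several normalized signals (0.0 = clean, 1.0 = spammy)
    into a single weighted score."""
    total_weight = sum(weights.values())
    return sum(signals[name] * weights[name] for name in weights) / total_weight

# Made-up signal names and weights, for the sake of the example.
weights = {
    "keyword_density": 1.0,
    "generated_page_ratio": 2.0,
    "reciprocal_link_ratio": 1.5,
    "content_unoriginality": 2.5,
}

# A "good site": high keyword density, but strong signals everywhere else.
good_site = {
    "keyword_density": 0.8,
    "generated_page_ratio": 0.1,
    "reciprocal_link_ratio": 0.1,
    "content_unoriginality": 0.1,
}

# A scraper / thin-affiliate site: the same keyword density, nothing mitigating.
scraper_site = {
    "keyword_density": 0.8,
    "generated_page_ratio": 0.9,
    "reciprocal_link_ratio": 0.9,
    "content_unoriginality": 0.95,
}

THRESHOLD = 0.5  # arbitrary cutoff for this illustration

for name, site in [("good site", good_site), ("scraper site", scraper_site)]:
    score = spam_likelihood(site, weights)
    verdict = "penalize" if score > THRESHOLD else "give benefit of the doubt"
    print(f"{name}: score {score:.2f} -> {verdict}")
```

Run it and the two sites get very different scores even though their keyword density is identical, which is the whole point: the same borderline behavior reads differently depending on everything else about the site.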