I'm not sure if I'm reading that correctly, but...
I'm willing to bet that an algo can in fact easily identify shopping cart directions, and most certainly <img> tags. Something like that can certainly be accounted for. Unless you're talking about the ability to identify 'who is this picture of'.
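Just to illustrate how trivial that part is: here's a rough Python sketch (the "cart" keywords are my own guesses, not anything Google has confirmed) that counts <img> tags and cart-looking links on a page. Spotting the tags is easy; knowing *who* is in the picture is the hard part.

```python
# Sketch: counting <img> tags and cart/checkout links is easy for an
# algorithm; judging what a picture actually shows is not.
# The "cart"/"checkout" tokens are hypothetical examples.
from html.parser import HTMLParser

class PageScan(HTMLParser):
    def __init__(self):
        super().__init__()
        self.img_count = 0    # number of <img> tags seen
        self.cart_links = 0   # links whose href looks cart-related

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.img_count += 1
        elif tag == "a":
            href = dict(attrs).get("href", "")
            if "cart" in href or "checkout" in href:
                self.cart_links += 1

scanner = PageScan()
scanner.feed('<p><img src="x.jpg"><a href="/cart/add">Buy</a></p>')
# scanner.img_count is 1, scanner.cart_links is 1
```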
Also, if there is in fact a -30 penalty that is manually applied, it could be something as simple as:

Writing a script to list all the top 15 sites for a previously specified selection of search terms (the list could be generated by another program, or by hand). Then remove every site that falls into #*$!x parameters (could be shopping cart based, or whatever; pick your poison). The ones that are left are used to fix the natural search... just in time for the shopping season, I might add.
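The filtering step above could be as simple as this sketch. It assumes you've already collected the top-15 URLs per term somehow (by hand or with another program); the exclusion rule here is just a stand-in for whatever "#*$!x parameters" might actually be.

```python
# Hypothetical sketch of the filtering step: given top-15 results per
# search term, drop sites matching some exclusion rule and keep the
# rest in rank order. The cart-keyword check is an assumed example.

def filter_serps(serps, is_excluded):
    """Remove excluded URLs from each term's result list, preserving rank order."""
    return {term: [url for url in urls if not is_excluded(url)]
            for term, urls in serps.items()}

def looks_like_cart(url):
    # Example exclusion rule: URL looks like a cart/checkout page.
    return any(token in url for token in ("cart", "checkout", "basket"))

serps = {
    "chicken information": [
        "example.org/chicken-facts",
        "shop.example.com/cart?item=42",
        "example.net/poultry-guide",
    ],
}
cleaned = filter_serps(serps, looks_like_cart)
# cleaned keeps only the two non-cart results for that term
```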
OR - it could be someone filing a complaint, like I did with Google, pointing out that some people are ranking for terms where the user is clearly NOT looking for anything related to their site.
However, someone did say that they have a -30 applied to an entire directory... which would totally suck. Go back and see if your site ranks for any natural terms, especially ones like "Chicken information" or whatever. Just see if you can get into the top ten for at least something.
If not, then it's worse than speculated. More like a ban from the top ten, rather than an adjustment.
Other things might come into play. Scraped content (real big no-no recently), meta issues, linking profile, etc. Hundreds of things really. I'd take a real long hard look at your SEO practices before blaming google for wronging you.
[edited by: tedster at 9:16 pm (utc) on Oct. 30, 2006]