Anyone else find it crazy that, as a result of Panda, we're having to evaluate our sites on a page-by-page basis to work out whether to block "thin" or lower-quality pages so that the whole domain doesn't get a penalty?
Absolutely mental - how are non-webmasters supposed to deal with this? How do e-commerce sites with manufacturers' descriptions and better prices deal with it?
On the commercial web, it is not about content by itself.
Why can't Google just ignore "thin" or listing-type pages like category links, blank profiles, etc.?
Just give weight to the good stuff and ignore pages that you can't score for quality (and a page can be lower "quality" for reasons other than not having 500 words of unique content on it).
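For what it's worth, the page-by-page triage we're all doing can be scripted. Here's a minimal sketch - the 300-word threshold and the duplicate-text check are my own illustrative heuristics, not anything Google has published about how Panda actually scores pages:

```python
# Hypothetical thin-page triage: decide which pages should get a
# noindex robots directive. The 300-word minimum and the verbatim
# duplicate check are illustrative guesses, not Panda's real rules.

def is_thin(page_text: str, known_texts: set[str], min_words: int = 300) -> bool:
    """Return True if the page looks 'thin': too few words, or a
    verbatim copy of text we've already seen (e.g. a manufacturer's
    description reused across many product pages)."""
    words = page_text.split()
    if len(words) < min_words:
        return True
    if page_text.strip() in known_texts:
        return True
    return False


def triage(pages: dict[str, str]) -> dict[str, str]:
    """Map each URL to 'index' or 'noindex' using the heuristic above.

    `pages` maps URL -> extracted page text, in crawl order, so the
    first copy of any duplicated text is kept and later copies flagged.
    """
    seen: set[str] = set()
    decisions: dict[str, str] = {}
    for url, text in pages.items():
        decisions[url] = "noindex" if is_thin(text, seen) else "index"
        seen.add(text.strip())
    return decisions
```

So a one-word category page gets flagged "noindex" while a 400-word unique article stays "index". Obviously you'd tune the thresholds per site - the point is just that the blocking decision can be automated instead of eyeballed.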
At the end of the day, this "quality" recognition algorithm does seem to be based on a lot of what is being discussed on this forum - but fixing it is near impossible for so many people.
I have recovered my penalised sites - I did this by buying aged PR 5/6 domains and putting unique content on them with price comparison tech. I am now doing this aggressively rather than pursuing my "one site" objective of creating branded sites.
Surely this is the opposite of what Google really wants in its index.