I'm not sure, but I get the impression that Google is becoming increasingly picky about duplicate content, and that having that many pages in the index that are, from Google's point of view, similar might harm my site instead of doing it any good.
Yet I feel that it is legitimate to have those pages spidered, because they do represent unique content. So I am wondering if there is a good method to feed Google just the "unique" parts of a site without boring it with the rest?
External CSS is a good way to reduce on-page code bloat. Beyond doing that, I wouldn't worry too much. As I understand it, Google actually stores the content it retrieves search results from as plain text (no code/tags), so as long as the actual forum posts are at all unique, they'd look very different from each other in that index.
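To illustrate the idea (a generic sketch, not anyone's specific template): repeated presentational markup can be swapped for a class name, with the styling moved into a single stylesheet that the browser caches. The class name and file name here are made up for the example.

```html
<!-- Before: presentational markup repeated on every post, every page -->
<td bgcolor="#eeeeee"><font face="Verdana" size="2">post text here</font></td>

<!-- After: page markup carries only a class; styling lives in an external file -->
<link rel="stylesheet" href="styles.css">
<td class="post">post text here</td>
```

```css
/* styles.css -- downloaded once and cached, instead of bloating every page */
td.post {
  background-color: #eeeeee;
  font-family: Verdana, sans-serif;
  font-size: small;
}
```

Multiply that saving by every cell in every post table and the per-page markup shrinks considerably, so the actual post text makes up a larger share of each page.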
I'm not even as sold on CSS for ranking purposes anymore, after seeing plenty of nested-table, FrontPage-style code bloat rank perfectly well.