Msg#: 3528422 posted 8:54 am on Dec 17, 2007 (gmt 0)
Forums, at least in my experience with phpBB, are a minefield of duplicate and unneeded content. To give you an example, I had a forum with something like 12,000 pages indexed, where the actual count of real content, excluding profiles and the like, was probably more like 1,000. The bots simply get overwhelmed and have no idea where to go.
If it's viewing PMs and profiles, then you must not have denied it access to those pages with robots.txt. That's the first place to start.
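As a rough sketch, a robots.txt along these lines blocks the usual non-content phpBB2 scripts. This assumes the board lives under /forum/ and uses the stock script names; adjust the path prefix to match your install.

```
User-agent: *
# Assumes phpBB2 is installed at /forum/ with default script names
Disallow: /forum/privmsg.php
Disallow: /forum/profile.php
Disallow: /forum/memberlist.php
Disallow: /forum/search.php
Disallow: /forum/login.php
Disallow: /forum/posting.php
Disallow: /forum/groupcp.php
```

Note that robots.txt matches by URL prefix, so each Disallow line also covers those scripts with query strings appended (e.g. profile.php?mode=viewprofile&u=123).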
You may also note that if you're using phpBB2 or 3, that is only the tip of the iceberg. phpBB2 has upwards of 10 URLs per actual page just through pagination and other features, and if you let bots into the search page you're looking at some ridiculous number on top of that.
phpBB3 is a little better and hides some content from bots, but it still has about 5 duplicate URLs per page.