| 12:48 pm on Dec 15, 2007 (gmt 0)|
Add a forum sitemap with all thread URLs and submit it to Google.
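A bare-bones sitemap is just an XML file in the sitemaps.org format listing the thread URLs. Something like this, where example.com and the thread IDs are placeholders for your own:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/forum/viewtopic.php?t=123</loc>
  </url>
  <url>
    <loc>http://www.example.com/forum/viewtopic.php?t=124</loc>
  </url>
</urlset>

Upload it to your site root and submit its URL through Google Webmaster Tools.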
| 8:54 pm on Dec 15, 2007 (gmt 0)|
Thanks Ning, any online guides showing me how to do that?
| 8:54 am on Dec 17, 2007 (gmt 0)|
Forums, at least in my experience with phpBB, are a minefield of duplicate and unneeded content. To give you an example, I had a forum with something like 12,000 pages indexed, where the actual count of real content, excluding profiles and the like, was probably more like 1,000. The bots simply get overwhelmed and have no idea where to go.
If the bot is viewing PMs and profiles, then you must not have denied it access to those pages with robots.txt. That's the first place to start.
Note also that if you're using phpBB2 or 3, that is only the tip of the iceberg: phpBB2 has upwards of 10 URLs per actual page just through pagination and other features, and if you let bots into the search pages you're looking at some ridiculous number.
phpBB3 is a little better and hides some content from bots, but still has about 5 duplicate URLs per page.
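As a starting point, something like this in robots.txt keeps the bots out of the worst offenders on a stock phpBB2 install. The /forum/ path is an assumption, so adjust it to wherever your board actually lives; on phpBB3 you'd block ucp.php and mcp.php instead of profile.php and privmsg.php:

User-agent: *
Disallow: /forum/profile.php
Disallow: /forum/privmsg.php
Disallow: /forum/memberlist.php
Disallow: /forum/search.php
Disallow: /forum/viewonline.php
Disallow: /forum/posting.php
Disallow: /forum/login.php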
| 9:16 am on Dec 17, 2007 (gmt 0)|
Thanks coalman, do I need to put this into the robots.txt file?
| 10:05 am on Dec 17, 2007 (gmt 0)|
Yep, you can use the robots.txt file to restrict access to those URLs.
|Thanks Ning, any online guides showing me how to do that?|
Just try the help in Google Webmaster Tools; there is a detailed explanation of how to create a sitemap in XML.
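If you'd rather script it than hand-edit the XML, here's a rough sketch in Python that writes a sitemaps.org-format file from a plain list of thread URLs. The URLs are placeholders; a real version would pull the thread IDs from your forum's database:

# Sketch: write a sitemaps.org-format sitemap from a list of thread URLs.
# The URLs below are placeholders for your own forum's threads.
from xml.sax.saxutils import escape

thread_urls = [
    "http://www.example.com/forum/viewtopic.php?t=123",
    "http://www.example.com/forum/viewtopic.php?t=124",
]

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in thread_urls:
        # escape() keeps any & in query strings from breaking the XML
        f.write("  <url><loc>%s</loc></url>\n" % escape(url))
    f.write('</urlset>\n')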
| 12:42 am on Dec 18, 2007 (gmt 0)|
If you want to stimulate deep crawling by Google, try to improve external linkage. Good links to your forum home page will help, as will some links to individual forums and threads.
| 12:51 am on Dec 18, 2007 (gmt 0)|
Strong PageRank for the main domain, simplified navigation, and sitemaps work well for Google. I have yet to figure out how to get Yahoo and MSN (even with sitemaps) to deep crawl.
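One thing that may help with Yahoo and MSN: the major engines announced sitemap autodiscovery through robots.txt earlier this year, so a single line there points all of them at your sitemap (the URL is a placeholder for your own):

Sitemap: http://www.example.com/sitemap.xml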