Forum Moderators: Robert Charlton & goodroi
Take care not to index the same content more than once. I fell into the supplementals with my site because of doing that; it took months to get out.
Best is to use your robots.txt file to filter out all possible duplicates. For me most dupes came from action= in the URL (reply, quote, print, etc.), so I put a disallow for action in robots.txt.
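Something like this is what I mean (a sketch only - the exact paths depend on your forum software, these are just examples of the action= pattern):

```
# robots.txt - block duplicate views of the same thread
# (example paths, adjust for your own forum's URL structure)
User-agent: *
Disallow: /*action=reply
Disallow: /*action=quote
Disallow: /*action=print
```

Note that wildcard (*) matching in Disallow rules is supported by Google but isn't part of the original robots.txt standard, so check how other crawlers handle it.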
I bought a good sitemap program; it was worth the small amount. Free online ones *may* leave your site open to scrapers.
Google now indexes my site thoroughly, with old threads getting hit in Google as well as new ones. It's well worth the time and effort of creating the sitemap.
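For anyone who hasn't seen one, a minimal sitemap entry looks like this (the URL and date are made up for illustration):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per canonical thread URL, no action= variants -->
  <url>
    <loc>http://www.example.com/forum/thread-123.html</loc>
    <lastmod>2006-01-01</lastmod>
  </url>
</urlset>
```

The key point is to list only the canonical version of each thread, so the sitemap reinforces the robots.txt filtering rather than reintroducing duplicates.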