Forum Moderators: Robert Charlton & goodroi
I am working on a vBulletin forum, for which I have already done quite a bit of SEO work. I have pretty much blocked all duplicate urls from being indexed via robots.txt. When I do an inurl:forums site:www.example.com, I get 611 urls indexed, most of which are the topic threads. But when I start clicking through them, after 22 urls I get the dreaded - "In order to show you the most relevant results, we have omitted some entries very similar to the 22 already displayed."
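For context, a robots.txt aimed at trimming vBulletin duplicates typically disallows the utility scripts that generate alternate views of the same thread content. The paths below are illustrative only (based on default vBulletin script names under a /forums/ directory), not the poster's actual file:

```
User-agent: *
# Printer-friendly copies of threads duplicate the thread pages
Disallow: /forums/printthread.php
# Email-a-friend, search, member list, and calendar pages add thin/duplicate urls
Disallow: /forums/sendmessage.php
Disallow: /forums/search.php
Disallow: /forums/memberlist.php
Disallow: /forums/calendar.php
```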
All of the pages have unique titles and meta descriptions; however, Google doesn't seem to be using the meta descriptions on a lot of the pages, though it does on some.
Does anyone have any other ideas for how to get these unique pages indexed properly?
[edited by: engine at 1:18 pm (utc) on April 17, 2008]
[edit reason] examplified [/edit]
If I had to guess, I'd say that most of these urls are either 1) too recent for the full treatment, or 2) judged as being of lower importance.
We've got a number of threads about forums and duplicate content - you may want to check them in case you've missed some detail or other. Here's a solid one:
[webmasterworld.com...]