Currently, I have the individual forum post links (not the full topic pages) blocked from the bots via robots.txt, because these links are not true individual pages.
They are, for lack of a better word, locator links, such as: “sutra109818.html#109818”, which just takes you to the location of that post on a page of 10 posts.
So these links truly look like duplicate content to Google, as I am sure Google is reading the full page content, which is the same as the full topic link.
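For reference, the blocking lines in my robots.txt look something like this (the directory path here is illustrative, not my actual structure):

    # Keep the bots off the per-post locator links
    User-agent: *
    Disallow: /forum/sutra

(The “#109818” fragment is client-side only and never reaches the server, so a prefix match on “sutra” covers every one of these links.)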
But I wish to make “sutra109818.html” an actual page, not just a “locator link”. In doing so, I will SEO that page by putting the title of the post in the URL, which I am currently not doing for the full topic link.
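To give a rough idea of the plan, the rewrite could look something like this (assuming Apache mod_rewrite; “viewpost.php” is just a stand-in for whatever script ends up rendering the single post):

    RewriteEngine On
    # Map a keyword-rich post URL, e.g. /forum/post-title-here-sutra109818.html,
    # onto the script that renders that one post as a real page
    RewriteRule ^forum/([a-z0-9-]+)-sutra([0-9]+)\.html$ /forum/viewpost.php?post=$2 [L]

That way the post ID still drives the lookup, and the title slug is there purely for the SERPs.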
Why do I want to do this?
Because the scrapers that are scraping my RSS feeds (which list the individual posts, not the full topics) are doing very well in the SERPs, mainly on Yahoo, but also in Google.
So I wish to beat them at their own game, and I would love to add 100,000 pages of content to my site.
Also, I believe that adding the title to the link URL will give a little (very little) added weight to the pages in the SERPs and, just as important, more keyword eye candy in the SERP listings for the end user.
But by doing this, I will be adding 100,000+ snippet pages to my site.
My concern is: will adding this much content at once hinder my site?
Currently, Google shows about 50,000 pages indexed, which is about right, because I have done a very good job of blocking, via robots.txt, all of the useless links on a site such as this.
So by making the sutra*.html links true pages, I will be adding 100,000-plus pages to my site at once.
I wish to do this, but I am concerned.
A perfect example is found right here (on your left):
I wish to make this post (#3360852) its own page, and every post on this page would become a small snippet page, rather than living only at the full thread URL, 3360808.htm.
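Roughly, each snippet page would carry just the one post plus a link back up to the thread, something like this (markup stripped down; the title and copy are placeholders):

    <!-- hypothetical snippet page for post #3360852 -->
    <html>
    <head>
      <title>Post Title Here - My Forum</title>
    </head>
    <body>
      <p>...the text of post #3360852 only...</p>
      <!-- point back to the post's spot in the full thread -->
      <p><a href="3360808.htm#3360852">Read the full thread</a></p>
    </body>
    </html>

Since the snippet page shows only the one post, not all 10, it should not read as a straight duplicate of the full thread page.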
Are you following me?