Forum Moderators: goodroi
I have a question I would like to put to you peeps.
I have a site with hundreds of folders, each containing 400+ duplicates of a single page.
eg:
domain.com/folder1/index.html, 1.html, 2.html, 3.html, etc
Firstly:
If all 400 pages are being listed across many websites, will this cause a problem/penalty?
Secondly:
Is there a way to mod robots.txt to exclude all but, say, domain.com/folder1/index.html?
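Something along these lines is what I had in mind (assuming the bot supports the Allow directive, which isn't in the original robots.txt standard but I believe Googlebot honours it, with the more specific rule winning):

```
User-agent: *
# block everything in the folder...
Disallow: /folder1/
# ...but let the index page through (non-standard, Google-style Allow)
Allow: /folder1/index.html
```

Would I need to repeat that pair for every folder, or is there a wildcard way?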
Also, is it possible to direct the search bots to index.html from all the other pages?
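For example, I've seen it suggested that you can put a tag like this in the head of each duplicate page to point the engines at the main one (assuming the engines actually respect it; domain.com is just my placeholder here):

```
<link rel="canonical" href="http://domain.com/folder1/index.html">
```

Would that work, or is a 301 redirect the better route?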
Thanks in advance