Forum Moderators: buckworks
I suggest you either don't do it at all, or else block the spiders from crawling the new pages that carry the duplicate content.
Do you still think we shouldn't do it?
That all depends on the method used and the spider you're talking about. If you use your .htaccess file correctly, you can block any visitor, spiders included, from accessing one or more pages of your site. Relying on nofollow or robots.txt is an iffy proposition at best, since well-behaved bots may honor them but nothing forces a crawler to comply.
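As a rough sketch of the .htaccess approach described above: the directives below are standard Apache 2.2 syntax (`SetEnvIfNoCase`, `Order`/`Deny`), but the bot name, the filename pattern, and the environment variable name are all placeholders you would adapt to your own site. Apache 2.4 replaces the `Order`/`Deny` lines with `Require` directives.

```apache
# Tag requests from a given crawler (bot name is an example only).
SetEnvIfNoCase User-Agent "ExampleBot" block_this_bot

# Deny that crawler access to the duplicate-content pages
# (filename pattern is a placeholder; adjust to your URLs).
<FilesMatch "^duplicate-.*\.html$">
    Order Allow,Deny
    Allow from all
    Deny from env=block_this_bot
</FilesMatch>
```

Unlike a robots.txt rule, this is enforced by the server itself: a matching request gets a 403 response whether or not the crawler chooses to be polite.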