deelerdave - 3:32 am on Aug 16, 2005 (gmt 0)
In Msg 545 I described adding hundreds of pages of duplicate content to my site. These were generated by a PHP script that I thought would not result in indexed web pages, but many have shown up in Google, and my Google traffic dropped to near zero on June 16. Can I just delete these pages from my site, or do I also need to clear Google's cache of them?
If the latter, should I put noarchive in the robots meta tag and allow Google to spider my site for a while before I delete the pages? When I delete them, should I literally remove the pages, change the permissions to prevent access, or disallow the pages' root folder in robots.txt? Or is all of this a waste of time, and should I start fresh with a new domain on a new ISP?
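For reference, here is a rough sketch of the two mechanisms I'm asking about. The /duplicates/ folder name is just a placeholder for wherever the scripted pages actually live:

```
# robots.txt option — blocks crawling of everything under the folder
User-agent: *
Disallow: /duplicates/

# per-page option — robots meta tag placed in each page's <head>
<meta name="robots" content="noindex, noarchive">
```

My understanding is that the robots.txt rule only stops future crawling, while the meta tag has to be seen by the spider on a live page to take effect, which is why I'm asking about the order of operations.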