However, with our setup, the old URLs will remain valid. While this is good for avoiding link rot, it also results in orphaned pages. I'm concerned that, given the sheer number of these pages, Google will find a "small" minisite (a few thousand pages) and conclude that they still exist. I'm worried about being penalized.
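If the worry is the old URLs surviving as a duplicate minisite, one option, assuming an Apache server with mod_rewrite enabled (and using placeholder paths, since I haven't shown our actual structure), is to 301-redirect the retired section to its new home so the old URLs stop resolving as separate pages:

```apache
# .htaccess sketch — /old-section/ and /new-section/ are placeholders
RewriteEngine On
# Permanently redirect everything under the retired section
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

A 301 tells Googlebot the pages have moved permanently, so it should eventually drop the old URLs from the index rather than treating them as a few thousand still-live pages, and inbound links keep working, so the link-rot benefit isn't lost.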
Ever since we dropped 2-3 pages in the results after a weekend outage, we've been worried about being penalized in Google. What is the best way to fix this problem?
We're willing to ban googlebot in the robots.txt file for a month if that would clear the problem. However, with the crawl starting days before the dance, we'd have to decide before seeing whether our position returned naturally.
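For reference, banning only Googlebot would just take this in robots.txt — though it's a blunt instrument: it stops crawling, but it doesn't immediately flush URLs that are already indexed:

```
User-agent: Googlebot
Disallow: /
```

Other crawlers (Yahoo's, for instance) would continue unaffected, since the `Disallow` applies only to the named user-agent.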
Is there an easy way to get Google to flush the existing URLs while still doing a fresh crawl of our site? Should we just eat the month and try to recover?
Getting creamed in Google became traumatic when our #2 Yahoo ranking died the other day, so I'm willing to do anything to get back in Google's good graces. I'm going to put the site in my profile, but you shouldn't have to look at it to help with the situation. We aren't PR0'd, so at most we hit a temporary spam filter that we can grow out of.