---- Gone from SERPs for all keywords - after dropping an affiliate script
tedster - 6:38 am on Jul 1, 2009 (gmt 0)
300+ “Not found” crawl errors for the affiliate store's urls
Is there some easy pattern in those urls that would let you create one simple Disallow rule in robots.txt? If those urls once resolved, Google will "remember" them for a long time - but a robots.txt rule lets you handle the issue much more simply than fixing each 404 one by one.
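For example, if the affiliate store's urls all shared a common directory - /astore/ here is just a hypothetical path, substitute whatever prefix your script actually used - one rule would cover all 300+ of them. A minimal sketch:

```
User-agent: *
Disallow: /astore/
```

That tells compliant crawlers not to request anything under that path, so the "Not found" crawl errors should stop accumulating. Note that robots.txt only blocks crawling - it doesn't by itself remove urls that are already indexed.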