Outside of updating my robots.txt, does anyone have any suggestions for preventing Googlebot from requesting old, incorrect links? Over the past 2 weeks, I've received 100+ page requests from Googlebot for URLs that no longer exist, and the resulting page execution errors are filling up my memory cache.
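For context, the kind of robots.txt change I'm talking about looks like this (the paths here are just placeholders, not my actual URLs):

```
User-agent: Googlebot
# Block crawling of the retired section that generates the errors
Disallow: /old-section/
```

Note that a Disallow rule only stops future crawling; it doesn't remove URLs Google has already indexed, which may be why the requests keep coming.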
Is anyone else suffering from the same problem?