jmccormac - 4:48 pm on Nov 7, 2010 (gmt 0) [edited by: jmccormac at 4:52 pm (utc) on Nov 7, 2010]
I frequently get that message from Google about one of my sites. It carries the hosting history for every domain name in com/net/org/biz/info/mobi/asia back to 2000, plus stats for nameservers over the same period, so with roughly 300 million pages John Mu's advice seems the best approach. Naturally I haven't put all of those pages in the sitemaps, but there is still a relatively large number.

From search engine development work, broken rewriters can often create recursive page structures (the same page content served under many different URLs), and this is one of the things Google may be trying to avoid. The first thing to check is that any rewriter is working properly; then check the number of pages in your sitemap files. Prioritise the important ones, and freeze the ones that never change at a lower priority/importance.
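As a minimal sketch of spotting that rewriter problem, you can hash the body of each crawled page and flag any content that shows up under more than one URL. The function name and the sample crawl data here are hypothetical, just to illustrate the check:

```python
import hashlib

def find_duplicate_url_groups(pages):
    """Group URLs that serve byte-identical content.

    pages: iterable of (url, html_body) pairs, e.g. from a crawl log.
    Returns a list of URL groups where the same content appears under
    more than one URL, a common symptom of a broken rewriter.
    """
    by_hash = {}
    for url, body in pages:
        digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
        by_hash.setdefault(digest, []).append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]

# Hypothetical crawl sample: two rewritten URLs serve the same page.
sample = [
    ("/domain/example.com", "<html>history of example.com</html>"),
    ("/domain/example.com/", "<html>history of example.com</html>"),
    ("/domain/other.net", "<html>history of other.net</html>"),
]
print(find_duplicate_url_groups(sample))
```

If any group comes back with more than one URL, the rewriter is generating multiple paths to the same content and needs fixing before you worry about the sitemaps.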