partyark - 7:38 pm on Jun 10, 2013 (gmt 0) [edited by: partyark at 7:57 pm (utc) on Jun 10, 2013]
One extra phenomenon - on widgetworld.com (which I'm really trying hard to remove from the index) there is a list of 'Crawl Errors' in webmaster tools.
This has a bunch of 410'd pages, as I'd expect ... but it has got stuck. It hasn't added any since early May, when it reached 1,000, and the graph has been completely flat since then. It looks like there's some sort of hard limit on the number of pages in error.
UPDATE: The failure to grow the list of 410s coincides with adding a robots.txt directive to disallow crawling - that sort of makes sense: if Googlebot can't crawl the pages, it never sees the 410 responses. Except that it directly contradicts Google's own guidelines on removing URLs, which state that adding a robots.txt entry is a good thing.
I'm going to try to allow crawling and see if I can get more pages added to the Crawl Errors list.
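For anyone trying the same thing, the change is just lifting the blanket disallow in robots.txt - roughly this (the before/after rules here are made up for illustration; my actual file has more in it):

```
# Before: blocked Googlebot entirely, so it could never re-fetch
# the removed pages and see the 410 responses
User-agent: *
Disallow: /

# After: crawling allowed again, so the 410s can be discovered
# and (hopefully) added to the Crawl Errors list
User-agent: *
Disallow:
```

An empty Disallow line means nothing is blocked.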