Andylew - 8:24 am on Jul 7, 2011 (gmt 0)
The idea that a 'significant change' might prompt a different response to a reconsideration request has proven wrong. A site which has been completely removed and now returns 404s was still met with 'site still violates quality guidelines'. This adds further weight to the idea that reconsiderations are automated and rely on cached Google data.
With the site 404'd, Googlebot stops coming, so the pages stay in the index and cache and Google doesn't realise they have gone.
So the problem is that Googlebot is needed to register that the old violating pages have gone, and 404'ing the whole site loses the bot.
The next step is to completely change the URL structure so there is still an active site. It would look like a new one, with post-violation content changes, and that would tick the box of 'remove the pages which violate Google's quality guidelines'.
The idea in this case is that the site would be recognised as new and its pages would not have a black mark against them, while the black-marked pages would be 404'd. The other possibility is that once a site has been penalised, a site-wide penalty-check algorithm is run against every new page crawled. Time will tell.
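For anyone trying the same approach, here is a minimal sketch of what I mean by retiring the old structure while keeping the site live and crawlable. It is only an illustration using Python's standard library; the paths, content and port are made-up placeholders, not my actual site. The old violating paths return 404 (a 410 Gone would be an even stronger "this is gone" signal), while the new URL structure serves the post-violation content so Googlebot still has something to crawl.

# Sketch only: hypothetical paths and content, Python standard library.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Old URL paths that carried the guideline-violating content (hypothetical).
RETIRED_PATHS = ("/old-articles/", "/old-category/")

# New URL structure serving the rewritten, post-violation content (hypothetical).
NEW_PAGES = {
    "/articles/rewritten-page/": b"<html><body>Rewritten content</body></html>",
}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith(RETIRED_PATHS):
            # Old, black-marked URLs are gone; 410 Gone is a stronger signal
            # than 404 if you want to be explicit about permanent removal.
            self.send_response(404)
            self.end_headers()
        elif self.path in NEW_PAGES:
            # New URL structure stays live so Googlebot keeps crawling.
            body = NEW_PAGES[self.path]
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), Handler).serve_forever()

In practice you would do the same thing in your CMS or server config rather than a standalone script; the point is simply that the old URLs answer "gone" while the new ones answer with content, so the bot has a reason to keep visiting.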