1script - 4:00 pm on Sep 9, 2011 (gmt 0)
Let's talk when your site disappears permanently, shall we? What you say may all ring true in theory and in an ideal world. In practice, though, nothing is ideal. The algo is written by people. People make mistakes. The ban is handed out by people. People make mistakes. A reviewer might have a bad day or click the wrong button. I'm not saying that's what happened, I'm saying it's entirely possible.
I've spoken to people who had their sites disappear and their businesses collapse without any hope of returning. It happens, and it isn't right.
Also, all that talk about walking the edge: where does that come from? You sound like someone who knows exactly where the edge is. Me, personally: I'm not so sure. In all these years my "SEO efforts", meager as they were, were all aimed at avoiding big mistakes. Canonical issues - check! Crawlability - check! Etc. I thought that was safe - much safer than pushing the boundaries, which is what you describe. I never wanted to find the edge - it found me of its own accord.
Anyhow, getting back to the topic of this conversation: a silver lining in getting banned from Google must be that now, with Googlebot gone, I have TONS more bandwidth and CPU for the actual users. The sites are flying - a delight to browse - I have not seen them work this well in years.
The ratio of bandwidth consumed by Gbot to the traffic received from Google is not looking very favorable for Google - about 1/5th of the same ratio for Bing, for the one site I checked. I'm going to do more research on that - maybe there are some interesting conclusions to be drawn here, like throttling down Gbot access on other, not-yet-banned sites.
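For anyone who wants to run the same comparison on their own logs, here's a minimal sketch of how I'd compute it. It assumes the standard Apache/Nginx "combined" log format; the bot user-agent substrings and referrer hostnames are my assumptions, so adjust them for your setup:

```python
import re
from collections import defaultdict

# Assumed "combined" log format:
# host ident user [time] "request" status bytes "referer" "user-agent"
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} (?P<bytes>\d+|-) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Substrings to look for - assumptions, check against your own logs
BOTS = {"google": "googlebot", "bing": "bingbot"}
ENGINES = {"google": "google.", "bing": "bing."}

def crawl_vs_traffic(lines):
    """For each engine, return bytes served to its crawler divided by
    the number of visits referred by its search results - the
    'bandwidth cost per referral'. Lower means a better deal."""
    crawl_bytes = defaultdict(int)
    referred_hits = defaultdict(int)
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        size = 0 if m.group("bytes") == "-" else int(m.group("bytes"))
        agent = m.group("agent").lower()
        referer = m.group("referer").lower()
        for key, bot in BOTS.items():
            if bot in agent:
                crawl_bytes[key] += size
        for key, host in ENGINES.items():
            if host in referer:
                referred_hits[key] += 1
    return {k: crawl_bytes[k] / referred_hits[k]
            for k in BOTS if referred_hits[k]}
```

Feed it the access log line by line and compare the two numbers; a site where Gbot's cost-per-referral is several times Bing's is a candidate for throttling.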