Your argument is mostly about semantics, and I stated my position on semantics in my previous post.
On the other hand, the point about bots punishing content based on tag-hierarchy depth is worth a look:
Another issue with it that hasn't been mentioned is how many nodes a bot has to traverse before getting to the content of the page:
<table> 1 node, <tr> 2 nodes, <td> 3 nodes, <table> 4 nodes, <tr> 5 nodes... and so on.
Following on, the next <td> would be 6 nodes deep. You are using an example with at least two levels of table nesting, about which I'd like to make two comments. First: if a layout really requires nesting tables (remember that you can use the colspan and rowspan attributes), how many nested <div>s would you need to achieve the equivalent layout via CSS? (Not to mention how complex the CSS itself might get.) Second: where I said
As long as you keep your head over your shoulders
I mostly meant "as long as you aren't nesting tables for layout". Let me make this clear: using a table for layout and, inside it, some tables for tabular content seems fine to me (at least in some contexts); but if your layout really needs nested tables, then you should reconsider what kind of design you are doing (regardless of whether you use tables or CSS). Remember that webpages do not come with a 400-page user manual: keep it simple.
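To put concrete numbers on the depth point, here's a minimal sketch using Python's built-in html.parser (the markup samples are made up for illustration) that counts how deep the actual content sits in a nested-table layout versus a single table that uses colspan instead:

```python
# Sketch: measure the maximum tag-nesting depth of an HTML fragment.
# Uses only the standard library; the sample markup below is hypothetical.
from html.parser import HTMLParser

class DepthCounter(HTMLParser):
    """Tracks the deepest nesting level reached while parsing."""
    def __init__(self):
        super().__init__()
        self.depth = 0
        self.max_depth = 0

    def handle_starttag(self, tag, attrs):
        self.depth += 1
        self.max_depth = max(self.max_depth, self.depth)

    def handle_endtag(self, tag):
        self.depth -= 1

def max_depth(html):
    parser = DepthCounter()
    parser.feed(html)
    return parser.max_depth

# One table nested inside another: content is six tags deep.
nested = ("<table><tr><td>"
          "<table><tr><td>content</td></tr></table>"
          "</td></tr></table>")

# Same two-column effect with colspan: content stays three tags deep.
flat = ("<table>"
        "<tr><td colspan='2'>header</td></tr>"
        "<tr><td>a</td><td>b</td></tr>"
        "</table>")

print(max_depth(nested))  # 6
print(max_depth(flat))    # 3
```

So a single level of table nesting already doubles the traversal depth, while colspan/rowspan keeps it flat; this is the trade-off I was gesturing at above.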
Besides the nesting issue, your statement about bots punishing these usages (which have been the norm for several years) seems quite speculative. Can you point to some factual data supporting it? (Either an example of a site affected by this, or a statement from a search engine mentioning it.) Personally, I do think that a semantic approach should help bots understand a site's contents, and hence help its rankings; but I abstained from mentioning it because it's rather a personal impression, quite hard to prove or even argue about.