In partial answer to my own question:
I used the W3C's own link checker and it found no broken links, handling the base href properly, as did another site-wide online link checker. However, I then tried an online site-analysis tool at random.
It is on G's first page of results.
And it (falsely) claimed all my internal links were broken. They work fine; the analysis tool had simply appended each internal link to the base href URL, resulting in 404s.
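For anyone curious, the failure mode is easy to reproduce. Here's a minimal sketch (the URLs are hypothetical, not my actual site) comparing proper RFC 3986 resolution, which browsers and the W3C checker do, with the naive concatenation the tool appeared to be doing:

```python
from urllib.parse import urljoin

# Hypothetical example URLs -- my real domain and paths differ.
base = "https://example.com/articles/page.html"  # full page URL in <base href>
link = "/css/main.css"                           # root-relative internal link

# Correct RFC 3986 resolution: a root-relative reference replaces
# the base URL's entire path.
print(urljoin(base, link))  # https://example.com/css/main.css

# Naive concatenation, as the analysis tool seemed to do:
print(base + link)  # https://example.com/articles/page.html/css/main.css -> 404
```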
So there's one clear example of a bot getting confused by this. The first one I've noticed.
NB: I use the full page URL in the base href, with root-relative internal links.
PS: thebear: Using CSS for layout and ditching tables dramatically improved our load times (just scored 10 out of 10). We also moved the page text to the top of the HTML file so the bots see it first.
But I understand what you mean about multiple external CSS files and overdoing the styling.
You clearly have more confidence in SEs than I do. :)