Forum Moderators: open
Today I revalidated the site, and as usual the errors were few and trivial (lots of unescaped ampersands in pasted URLs, several character entities with no trailing semicolon). No big deal, browsers know how to deal with these errors.
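For anyone wondering what the ampersand fix looks like in practice, here is a minimal sketch (using Python's standard html module purely for illustration; the URL is made up) of escaping a pasted URL before it goes into an href:

```python
import html

# A URL pasted straight into an href, with a raw & between query parameters.
pasted = "results.php?cat=widgets&page=2"

# html.escape() turns & into &amp; (and also escapes < > " by default),
# which is what the validator expects inside attribute values.
escaped = html.escape(pasted)
print(escaped)  # results.php?cat=widgets&amp;page=2
```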
But checking the links showed a big mistake. I have an include file with links to the main sections of my site, written as relative links that assume the calling file sits in the same directory as the include. Recently I added a subdirectory (the first one), and all the files in it used the normal include, so those relative links now resolved against the subdirectory and pointed at files that don't exist there. Needless to say, Xenu reported every one of those links as broken!
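To make the failure mode concrete, here is a small sketch (the URLs and filenames are hypothetical) using Python's urllib.parse.urljoin to show how the same relative link from the include resolves differently once the calling page sits in a subdirectory, while a root-relative link survives the move:

```python
from urllib.parse import urljoin

# Two hypothetical pages that both pull in the same navigation include.
root_page = "http://www.example.com/index.html"
sub_page  = "http://www.example.com/subdir/article.html"

# Relative link as written in the include - it resolves against the calling page.
print(urljoin(root_page, "contact.html"))   # http://www.example.com/contact.html
print(urljoin(sub_page,  "contact.html"))   # http://www.example.com/subdir/contact.html  (broken)

# Root-relative link - it resolves the same way from anywhere on the site.
print(urljoin(root_page, "/contact.html"))  # http://www.example.com/contact.html
print(urljoin(sub_page,  "/contact.html"))  # http://www.example.com/contact.html
```

Switching the include to root-relative links (or adjusting the paths per directory) is the usual cure.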
Of course this should have been detected much sooner. Major changes (including additions) should always trigger a recheck (validation plus links). Just thought I would share the experience; it is less painful to learn from the mistakes of others than from one's own :)
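For anyone who wants to automate that recheck locally, here is a rough sketch (all names hypothetical, and no substitute for Xenu or the W3C validator) that walks a copy of the site and flags relative links whose target file does not exist:

```python
import os
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    """Collects the href attribute of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def check_local_links(site_root):
    """Report plain relative links whose target file does not exist on disk."""
    for dirpath, _dirnames, filenames in os.walk(site_root):
        for filename in filenames:
            if not filename.endswith((".html", ".htm")):
                continue
            page = os.path.join(dirpath, filename)
            parser = HrefCollector()
            with open(page, encoding="utf-8", errors="replace") as f:
                parser.feed(f.read())
            for href in parser.hrefs:
                # Skip external links, anchors, queries and root-relative paths;
                # only plain relative links are resolved against the page's directory.
                if "://" in href or href.startswith(("#", "/", "?", "mailto:")):
                    continue
                target = os.path.normpath(os.path.join(dirpath, href.split("#")[0]))
                if not os.path.exists(target):
                    print(f"{page}: broken relative link -> {href}")

check_local_links("site")  # hypothetical local copy of the site
```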
lots of unescaped ampersands in pasted URLs, several character entities with no trailing semicolon). No big deal, browsers know how to deal with these errors.
That is no reason to let these errors through.
The browsers you tested might have been able to 'deal with these errors', but what about all the other browsers out there? And, possibly more importantly, what about search engine spiders?
That is no reason to let these errors through.
Obviously, when I found them, I corrected them.
For those who view validation as a practical tool, rather than as a theological imperative, there is a hierarchy of errors. I maintain that a broken link is a more serious error than an unescaped ampersand :)
It would be a spider whose parsing algorithm couldn't handle non-terminated escape sequences. If you have ever written a parser, you'll know that dealing with errors is often the hardest bit - especially if you need to recover from them and carry on.
Basically the more errors you have, even simple ones, the more likely it is that a spider will just give up on your page.
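As a small illustration of how much recovery work even a lenient parser has to do, here is a sketch using Python's html module (which follows the HTML5 recovery rules; other parsers may be far stricter) on an entity with and without its trailing semicolon:

```python
import html

# A well-formed entity, one missing its trailing semicolon, and a plain raw &.
samples = ["Fish &amp; chips", "Fish &amp chips", "Fish & chips"]

for text in samples:
    # html.unescape() applies the HTML5 recovery rules: legacy named entities
    # such as &amp are recognised even without the semicolon. A stricter
    # parser could reject or mangle the same input instead.
    print(repr(html.unescape(text)))
```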