Forum Moderators: open
Was wondering if anyone could offer an opinion on the significance of HTML errors in Google rankings. I have a small site that appears to look and function fine in almost every browser / OS I've used. Yet, when testing the site's HTML syntax on NetMechanic, our pages show a lot of minor errors (e.g., "Error: missing </font> end tag before <table>"). Our tags, however, seem to be fine. Do these code syntax errors make any difference if users can't see them? Are these errors somehow picked up by robots and factored into Google's ranking algorithm?
Would not worry too much about small errors here and there, as long as your links can still be deep-crawled by Googlebot.
The only data point I'd add is Eric Brewer's '96 paper, which mentioned that 40% of pages have actual errors in them.
from Does Google reward valid code? [webmasterworld.com]
why is it that I tend to remember what Googleguy posts? ;)
"It is plausible that pages with many mistakes in the markup are more likely to be of lower quality than pages with no mistakes"
happy validating! ;)
It is important to sort out HTML typos: unclosed tags, nesting errors, tags closed in the wrong order, wrong attribute values, unquoted attribute values and so on. It will make your job easier in the long run if you keep your code as well-formed and as valid as possible.
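As a rough illustration of how a crawler-style parser can trip over the kind of unclosed-tag error quoted above, here is a minimal sketch (not anything Google or NetMechanic actually runs; the `TagChecker` class and its error messages are invented for this example) that uses Python's standard-library `html.parser` to flag tags that are never closed or closed in the wrong order:

```python
from html.parser import HTMLParser

# Elements that legitimately have no end tag, so we never push them.
VOID_TAGS = {"br", "hr", "img", "input", "meta", "link"}

class TagChecker(HTMLParser):
    """Hypothetical checker: tracks open tags on a stack and records
    tags that are left unclosed or closed out of order."""

    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop until we close `tag`, flagging anything still open above it.
            while self.stack[-1] != tag:
                self.errors.append(f"missing </{self.stack.pop()}>")
            self.stack.pop()
        else:
            self.errors.append(f"stray </{tag}>")

checker = TagChecker()
# The <font> below is never closed before <table>, mirroring the
# NetMechanic complaint quoted earlier in the thread.
checker.feed("<p><font size=2>text<table><tr><td>cell</td></tr></table></p>")
checker.close()
print(checker.errors)  # ['missing </font>']
```

Browsers silently recover from this sort of thing, which is why the page still looks fine to users, but a strict parser sees a real structural problem.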
I usually tick the Show Source and Verbose Output options, and especially the Show Outline option, when validating.