But who is to say that it will NEVER be a factor?
There are people here whose pages didn't rank because of silly, simple mistakes,
errors found and corrected only after the pages failed validation. -Larry
Do it for your visitors, not the bots. I make sure my pages validate and render consistently across the common browsers (IE, Firefox, Opera).
In my experience it gives little, if any, boost in rankings.
The previous design was your standard validated HTML 4.01 Transitional and CSS. After the makeover the site now validates as XHTML Strict, CSS, and AAA, which I'm pleased with.
Following the upload of the new site, 5 days later Google knocked the site completely out of its index. I refused to panic. Two days later, sure enough, the site showed back up in the index, but pages back from the front. Today (two weeks later) the site is back on the front page, sitting in the number 5 to 8 positions. So I imagine I will regain my number 1 slots soon.
It's been a while since I've posted here, but I thought my experience was worth sharing.
To answer your question, though: I have always believed in meeting the standards as much as possible, or as much as you can afford. If one is building a site for the long term, why wouldn't you build it right for longevity?
Perhaps the question you should really ask is how many errors you can allow in your HTML before they become a problem. For example, mis-matched (incorrectly nested) tags can lead to sections of text being ignored by the spider.
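As a minimal illustration (the keyword text here is made up), this is the kind of mis-nesting that can trip up a strict parser, and the corrected form:

```html
<!-- Mis-nested: <b> is closed after <p>, so a strict parser may drop
     or misattribute the bolded text when building the document tree -->
<p>Our main keyword is <b>blue widgets</p></b>

<!-- Corrected: tags close in reverse order of opening -->
<p>Our main keyword is <b>blue widgets</b>.</p>
```

The validator flags exactly this sort of error, which is easy to miss by eye in a large template.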
Lesson well learned: these were recently dropped pages, and clean HTML helps ranking.
I would also suggest running Xenu Link Sleuth over your site and checking for consistency of linking, anchor-text format, and so on. Look carefully at the error list and the reports, then scan your eye over the generated HTML sitemap for other obvious problems too.
However, one thing I have noticed that drastically affects ranking is protecting your site so hijackers and scrapers have less of an effect. That means making sure all of the following are in place:
Dedicated IP address
Pop out of frames script
Full URLs for all internal navigation
301 redirect non-www to www version of the domain
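"Full URLs for all internal navigation" means using absolute links rather than relative ones, so that a scraped or framed copy of your page still points back at your domain. A quick sketch (example.com is a placeholder):

```html
<!-- Relative link: on a scraped copy this resolves to the scraper's domain -->
<a href="/products.html">Products</a>

<!-- Full (absolute) URL: always points back to your own site -->
<a href="http://www.example.com/products.html">Products</a>
```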
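The non-www to www redirect is typically done in an Apache .htaccess file with mod_rewrite. A minimal sketch, assuming mod_rewrite is enabled and with example.com standing in for your domain:

```apache
# Redirect http://example.com/... to http://www.example.com/...
# with a permanent (301) redirect, preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

This keeps Google from treating the www and non-www versions as duplicate sites and consolidates link popularity onto one hostname.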
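For the pop-out-of-frames script, here is a minimal sketch of the usual approach. The function names and structure are my own for illustration; the original post does not specify an implementation:

```javascript
// Returns true when the given window object is being displayed
// inside someone else's frameset (i.e. it is not the top window).
function shouldBreakOut(win) {
  return win.top !== win.self;
}

// If framed, replace the framing page with this page's own URL,
// so visitors (and their bookmarks) land on your site directly.
function breakOutOfFrames(win) {
  if (shouldBreakOut(win)) {
    win.top.location.href = win.self.location.href;
  }
}

// In a browser you would call this on page load:
// breakOutOfFrames(window);
```

Note that modern pages more often rely on server-side framing protection, but a script like this was the standard approach at the time.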