
Forum Moderators: mademetop


How does HTML code meeting W3C standards affect a site's ranking?

8:42 am on Jul 13, 2008 (gmt 0)

Full Member

10+ Year Member

joined:Oct 26, 2007
votes: 0

Does anyone know how HTML code that meets W3C standards affects a web site's ranking?



10:20 am on July 13, 2008 (gmt 0)

Preferred Member

10+ Year Member

joined:July 25, 2006
posts: 460
votes: 0

If the pages are severely broken, such as having multiple <head> or <body> tags, it could conceivably affect a crawler's ability to determine what the page content really is, and thus indirectly affect ranking.
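As an illustration of the kind of breakage meant here, the sketch below flags duplicate <head> or <body> tags using Python's standard-library HTML parser. The sample markup and class name are hypothetical, and this is only a rough stand-in for what a real crawler does:

```python
from html.parser import HTMLParser

# Count how often structural tags appear; a well-formed page
# should have at most one <head> and one <body>.
class DuplicateTagChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = {"head": 0, "body": 0}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

# Hypothetical broken page containing two <body> sections.
broken = "<html><head></head><body><p>a</p></body><body><p>b</p></body></html>"
checker = DuplicateTagChecker()
checker.feed(broken)
duplicates = [t for t, n in checker.counts.items() if n > 1]
print(duplicates)  # -> ['body']
```

A crawler facing two <body> sections has to guess which one holds the real content, which is exactly the ambiguity described above.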

Other than that, however, W3C validation is considered by most not to be a ranking factor at all, and many of the world's most popular and highest ranked sites don't validate.

Validating pages can be sort of fun, and it does give you some confidence that your pages will look basically OK in any browser, but if you're considering embarking on a massive validation campaign solely in the expectation that it will improve ranking, I believe that effort is wasted.
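Short of running the full W3C validator, even a quick well-formedness check catches the grossest breakage. Here's a minimal sketch using Python's standard library; note this checks XML well-formedness, which is stricter than HTML and really only suits XHTML-style markup, and the sample pages are made up:

```python
import xml.etree.ElementTree as ET

def is_well_formed(page: str) -> bool:
    """Return True if the markup parses as well-formed XML."""
    try:
        ET.fromstring(page)
        return True
    except ET.ParseError:
        return False

good = "<html><head><title>t</title></head><body><p>Hello</p></body></html>"
bad = "<html><body><p>Unclosed paragraph</body></html>"  # missing </p>
print(is_well_formed(good))  # -> True
print(is_well_formed(bad))   # -> False
```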

Edit: Something else that could fall into the category of "badly broken" is text placed on pages by JavaScript, such as with document.write(). Although the text is in the JS, it doesn't become part of the page until the script runs, and crawlers don't run JS. If a page is constructed entirely with document.write (yes, I've seen entire sites built that way), search engines see it as a blank page.
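To see why document.write content can be invisible, compare what a crawler-style text extractor (one that parses HTML but never executes JavaScript) pulls from a static page versus a scripted one. This is a simplified sketch with Python's standard library, and the sample pages are hypothetical:

```python
from html.parser import HTMLParser

# Extract visible text the way a simple crawler might: parse the HTML,
# skip the contents of <script> tags, and never execute any JavaScript.
class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.in_script = False

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.chunks.append(data.strip())

def visible_text(page: str) -> list:
    parser = TextExtractor()
    parser.feed(page)
    return parser.chunks

static_page = "<html><body><p>Hello world</p></body></html>"
scripted_page = ('<html><body><script>'
                 'document.write("<p>Hello world</p>");'
                 '</script></body></html>')

print(visible_text(static_page))    # -> ['Hello world']
print(visible_text(scripted_page))  # -> []
```

The two pages render identically in a browser, but the non-JS extractor sees text on one and nothing on the other.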

The same used to be true of Flash content, but there are recent reports that Google is going to start trying to read Flash.

[edited by: SteveWh at 10:36 am (utc) on July 13, 2008]

9:41 pm on July 13, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 4, 2002
votes: 0

"severely broken" may be as little as missing an open quote or a close angle bracket. These sorts of errors can throw spiders off the track when indexing a web page.

You should never assume that spiders are as good at reading pages as browsers are. The browser has to produce a rendering acceptable to a human being. But the spider "simply" has to read and process millions of pages a day as fast as it can. Statistically, it can fail on many pages and still be working at 99.9999% accuracy.

You never know when Google or others are going to try out a beta version of their new spiders. And you never know what HTML coding errors will trip them up.

Best not to take the risk. Use HTML generators that work.

11:06 am on July 14, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
votes: 0

There's a fairly well-known SEO whose "contact" page fails to display in either Opera or Safari because of a simple error like that, and it has been that way for over a year now.
1:27 am on July 22, 2008 (gmt 0)

Junior Member

10+ Year Member

joined:June 6, 2008
posts: 97
votes: 0

Valid markup can't hurt, but you'll be hard-pressed to find many sites that pass. Heck, Yahoo.com doesn't.


Google.com doesn't either, and Apple.com fails as well.

Minor issues may not hurt you, but they could. Ideally you want valid markup and CSS. The more scripts running on your site, the more difficult this becomes to achieve. In my experience, the shopping carts I have worked with have been the hardest to validate, because some developers don't see the benefit and prefer not to spend extra time on minor details. (Note: some carts are excellent; I am only speaking about the systems I have worked with.)

