How does HTML code conforming to W3C standards affect a site's ranking?

     

malcolmcroucher

8:42 am on Jul 13, 2008 (gmt 0)

5+ Year Member



Does anyone know how HTML code conforming to W3C standards affects a web site's ranking?

Regards

Malcolm

SteveWh

10:20 am on Jul 13, 2008 (gmt 0)

5+ Year Member



If the pages are severely broken, such as having multiple <head> or <body> tags, it could conceivably affect crawlers' ability to determine what the page content really is, and thus indirectly affect ranking.
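
For instance (a made-up page, not taken from any real site), markup along these lines leaves a parser guessing which <head> and <body> are the real ones:

    <html>
    <head><title>Widgets for sale</title></head>
    <body>
    <p>Boilerplate from the site template.</p>
    <head><title>Second head pasted in by mistake</title></head>
    <body>
    <p>The actual product description lives here.</p>
    </body>
    </html>

A browser will quietly repair that, but a crawler may or may not make the same repair, and it has to decide which title and which body content to trust.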

Other than that, however, most people consider W3C validation not to be a ranking factor at all, and many of the world's most popular and highest-ranked sites don't validate.

Validating pages can be sort of fun, and it does give you some confidence that your pages will look basically OK in any browser, but if you're considering a massive validation campaign only because you expect it to improve ranking, I believe the effort would be wasted.

Edit: Something else that could fall into the category of "badly broken" is text placed on pages by JavaScript, such as document.write(). Although the text is in the JS, it doesn't become part of the page until the script runs, and crawlers don't run JS. If a page is constructed entirely with document.write (yes, I've seen entire sites built that way), search engines see it as a blank page.
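
For illustration (a hypothetical page, nothing real), every word a visitor sees here comes from the script, so a crawler that doesn't execute JavaScript gets an empty body:

    <html>
    <head><title>Everything via document.write</title></head>
    <body>
    <script type="text/javascript">
    document.write("<h1>Welcome</h1>");
    document.write("<p>All of the visible text is injected by the script.</p>");
    </script>
    </body>
    </html>

To a non-JS crawler, the body above contains no text at all.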

The same used to be true of Flash content, but there are recent reports that Google is going to start trying to read Flash.


victor

9:41 pm on Jul 13, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"severely broken" may be as little as missing an open quote or a close angle bracket. These sorts of errors can throw spiders off the track when indexing a web page.

You should never assume that spiders are as good at reading pages as browsers are. The browser has to produce a rendering acceptable to a human being. But the spider "simply" has to read and process millions of pages a day as fast as it can. Statistically, it can fail on many pages and still be working at 99.9999% accuracy.

You never know when Google or others are going to try out a beta version of their new spiders. And you never know what HTML coding errors will trip them up.

Best not to take the risk. Use HTML generators that work.

g1smd

11:06 am on Jul 14, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



There's a fairly well-known SEO whose "contact" page fails to display in either Opera or Safari because of a simple error like that, and it has been that way for over a year now.

Excellira

1:27 am on Jul 22, 2008 (gmt 0)

5+ Year Member



Valid markup can't hurt, but you'll be hard-pressed to find many sites that pass. Heck, Yahoo.com doesn't:

[validator.w3.org...]

Google.com doesn't either:
[validator.w3.org...]

Apple.com fails:
[validator.w3.org...]

Minor issues may not hurt you, but they could. Ideally you want valid markup and CSS. The more scripts running on your site, the more difficult this becomes to achieve. From past experience, the shopping carts I have worked with have been the most difficult to validate, because some developers don't see the benefit and prefer not to spend the extra time on minor details. (Note: some carts are excellent; I am only speaking about the systems I have worked with.)
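
To be concrete about "minor details", here are invented snippets of the sort of thing the validator flags in cart and script output (the file names and values are made up):

    <a href="cart.php?add=123&qty=2">Add to cart</a>
    <font color="red">Clearance sale</font>
    <script language="JavaScript">var total = 0;</script>

The bare "&" in the URL should be "&amp;", <font> is deprecated presentational markup that belongs in the CSS, and the script tag is missing the required type="text/javascript" attribute while relying on the deprecated language attribute. None of these breaks the page in a browser, which is exactly why developers are tempted to leave them alone.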

 
