| 2:48 am on Jun 13, 2006 (gmt 0)|
I believe that until most sites are compliant with W3C standards, strict compliance does not matter.
It matters more whether your site is "compliant" with the real (white hat) practices of most other sites.
| 2:55 am on Jun 13, 2006 (gmt 0)|
The W3C validator will help you debug and find "goofs" that could screw up your rankings. Of course, none of the big boys are W3C compliant.
| 4:35 am on Jun 13, 2006 (gmt 0)|
One simple mistake and you "fail". Incidentally, I just checked google.com and yahoo.com, and neither passed. G has something like 45 errors, and Yahoo has many more, according to the w3.org HTML validation check.
So I doubt it matters too much.
| 5:16 am on Jun 13, 2006 (gmt 0)|
I've seen no real evidence that validating pages affects PageRank or SERP positions.
BUT .. who is to say that it will NEVER be a factor?
There are people here whose pages didn't rank because of silly, simple mistakes:
errors found and corrected only after the pages failed validation. -Larry
| 8:08 am on Jun 13, 2006 (gmt 0)|
Perhaps a bit off topic, but if you run Google AdSense on your site, their code will not validate. I emailed Google about that and got the standard reply that I could not alter their code. Their code doesn't validate; just make your site user and spider friendly.
| 1:13 pm on Jun 13, 2006 (gmt 0)|
I imagine that bad coding could affect spidering, or it could affect Googlebot. I cleaned up some code yesterday on one of my pages I was having issues with, and it's in the index this morning. I had a bad <a=href tag.
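For anyone curious, the broken tag looked something like this (URL changed to a placeholder), with the fixed version below:

    <!-- broken: the stray = and missing quotes stop parsers from seeing a link at all -->
    <a=href "http://www.example.com/widgets.html">widgets</a>

    <!-- fixed -->
    <a href="http://www.example.com/widgets.html">widgets</a>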
| 2:48 pm on Jun 13, 2006 (gmt 0)|
>> What are everyone's thoughts about a W3C compliant site?
Do it for your visitors, not the bots. I make sure my pages validate and render consistently in the common browsers (IE, Firefox, Opera).
In my experience it gives little, if any, boost in rankings.
| 2:57 pm on Jun 13, 2006 (gmt 0)|
In my sector, almost none of the top sites are compliant. Heck, most of them don't even have a doctype. Everything I build is compliant, but that is for me.
BTW, being compliant won't always catch everything; even the W3C checker misses things. Run Xenu Link Sleuth to be doubly sure all your links work right.
| 4:39 pm on Jun 13, 2006 (gmt 0)|
I recently did a complete website upgrade and completely lost my rankings within five days. The upgrade replaced only the old design, keeping all the content in its existing place.
The previous design was your standard validated HTML 4.01 Transitional with CSS. After the makeover, the site validates as XHTML Strict, CSS, and AAA, which I'm pleased about.
Following the upload of the new site, five days later Google knocked it completely out of its index. I refused to panic. Two days after that, sure enough, the site showed back up in the index, but pages back from the front. Today (two weeks later) it is back on the front page, sitting in positions 5 to 8, so I imagine I will regain my number 1 slots soon.
It's been a while since I've posted here, but I thought my experience was worth sharing.
To answer your question, though: I have always believed in meeting the standards as much as possible, or as much as you can afford. If you're building a site for the long term, why wouldn't you build it right for longevity?
| 4:57 pm on Jun 13, 2006 (gmt 0)|
I think not validating code is a bit like doing a job and not cleaning up after yourself. Regardless of whether it helps the SERPs, doing things right can only help in the long run.
| 5:29 pm on Jun 13, 2006 (gmt 0)|
Having valid markup will not in itself help ranking, but conversely, invalid markup can have a negative effect.
Perhaps the real question is how many errors you can allow in your HTML before they become a problem. For example, mismatched (incorrectly nested) tags can lead to sections of text being ignored by the spider.
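To illustrate with a made-up snippet (not from any particular site):

    <!-- mismatched nesting: the <b> is closed outside the <p> it was opened inside -->
    <p><b>important keyword text</p></b>

    <!-- correctly nested -->
    <p><b>important keyword text</b></p>

A forgiving browser will usually render both the same way, but a parser that takes the markup literally can throw away the text inside the broken pair.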
| 6:29 pm on Jun 13, 2006 (gmt 0)|
Or links, as in my case.
| 9:01 pm on Jun 13, 2006 (gmt 0)|
For sites that don't use META descriptions but make extensive use of nav menus that appear before the content in the source, Google can miss the content completely, depending on whether the content is in a P or H tag, whether it is wrapped in an A HREF, or whether an H tag accidentally goes unclosed. It will then snippetize your nav links or copyright text instead, making it more likely that those pages turn supplemental due to identical description snippets. BTW, even if you do use META descriptions, if one is shorter than ~50 characters, Google will fish for text in the source.
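For reference, here is a sketch of the kind of META description that avoids the fishing behavior (the wording is invented; the point is that it is unique per page and comfortably longer than ~50 characters):

    <meta name="description" content="A unique, page-specific summary of this page's actual content, long enough that Google uses it as the snippet.">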
| 12:30 am on Jun 14, 2006 (gmt 0)|
Bad code could hurt you. I see plenty of evidence of that with malformed title tags or malformed links.
Good code might help a little, or may have no effect.
I don't take the risk. I check the code for errors and fix them all.
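A malformed title tag, for illustration (invented example):

    <!-- unclosed title: a parser may swallow whatever follows into the title text -->
    <title>Blue Widget Guide

    <!-- fixed -->
    <title>Blue Widget Guide</title>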
| 12:43 am on Jun 14, 2006 (gmt 0)|
The bad href tags I found prove that point. I wonder how it all works with Big Daddy and XHTML.
| 1:19 pm on Jun 14, 2006 (gmt 0)|
Wow, cleaning up the HTML really helped the SERPs for my site. I noticed today that I shot to the top for certain keywords. Google was struggling to follow links because of bad href tags; now the pages are all indexed and ranking very well. I saw a large boost in Yahoo too.
Lesson well learned: these were recently dropped pages, and clean HTML helps ranking.
| 1:28 pm on Jun 14, 2006 (gmt 0)|
Every page whose code I have ever cleaned up has risen at least a little within a few days to weeks.
I would also suggest running Xenu Link Sleuth over your site and checking for consistency of linking, anchor text format, and so on. Look carefully at the error list and the reports, then scan your eye over the generated HTML sitemap for other obvious problems too.
| 1:52 pm on Jun 14, 2006 (gmt 0)|
I build everything to XHTML 1.0 Strict.
In my opinion, following standards will do more for your website in the search engines than you would imagine...
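For anyone wanting to try it, this is the minimal skeleton that passes the validator as XHTML 1.0 Strict (title and content are placeholders):

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
    <head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
    <title>Your page title</title>
    </head>
    <body>
    <p>Your page content.</p>
    </body>
    </html>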
| 4:15 pm on Jun 14, 2006 (gmt 0)|
I validate all pages that I have designed or redesigned, which covers about 32 client sites. I can't say that has helped rankings, but I have noticed that a page will drop in rank if I failed to validate it and there were errors.
However, one thing I have noticed that drastically affects ranking is protecting your sites so hijackers and scrapers have less of an effect. That involves making sure all of the following are in place:
Dedicated IP address
Pop-out-of-frames script (a minimal sketch follows this list)
Full URLs for all internal navigation
301 redirect from the non-www to the www version of the domain
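The pop-out-of-frames script is the classic one-liner, placed in the head of every page; a minimal sketch:

    <script type="text/javascript">
    // if this page has been loaded inside someone else's frameset,
    // replace the top window with this page's own URL
    if (top != self) {
        top.location.replace(self.location.href);
    }
    </script>

The non-www to www 301 is done server-side (mod_rewrite on Apache, for example) rather than in the page itself.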
| 2:04 pm on Jun 19, 2006 (gmt 0)|
Making sure pages are compliant has helped me break into the top 10 in the SERPs (Google, Yahoo, MSN) for almost every keyword for that particular site this week.
I am beginning to work on my other sites today to get them compliant as well.