Forum Moderators: Robert Charlton & goodroi
Having said that, most of the new sites I have created recently were fully validated, and all of them have been sandboxed, so who knows?
Will it help you with search engines? Probably not.
I was shocked at the results. Not ONE site in the top 10 validates. Only 2 out of the top 10 even had a doctype. Most had a very high number of errors.
An excellent experiment would be to take that site in the #6 or #7 position and clean up the code (validate it). Give it about 30-60 days and see what happens.
Keep in mind that less than 5% (an estimate only) of websites will pass HTML 4.01 Transitional validation.
My point is, given the results I got, I am wondering whether Google is looking at validation as SEO work. Maybe even as over-optimization.
I don't think I would consider Google to be looking at validation from an SEO standpoint; it doesn't really apply.
I look at it from a different perspective. How clean a path can I provide the spider in getting from point A to point Z? Is my content being indexed properly and completely? If I have errors, it may not be. So, eliminating the errors makes that path a straight shot. No detours, no guessing, no problems in the HTML. That's the first step in optimizing your HTML. Once you've done that, then you can concentrate on the SEO stuff.
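To make the "clean path" idea concrete, here is a minimal sketch of the kind of check a spider-friendliness pass might do: flagging mismatched or unclosed tags with Python's built-in `html.parser`. This is an illustration only, not a W3C validator, and the `TagBalanceChecker` class and its error messages are my own invention.

```python
# Minimal sketch: flag mismatched/unclosed tags that could trip a naive parser.
# NOT a full W3C validator -- just an illustration of "clean path" checking.
from html.parser import HTMLParser

# Void elements never take a closing tag, so they never go on the stack.
VOID = {"br", "img", "hr", "meta", "link", "input", "area", "base", "col"}

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []    # currently open tags, innermost last
        self.errors = []   # human-readable problems found

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Anything opened after this tag but never closed is an error.
            while self.stack[-1] != tag:
                self.errors.append(f"unclosed <{self.stack.pop()}>")
            self.stack.pop()
        else:
            self.errors.append(f"stray </{tag}>")

    def close(self):
        super().close()
        # Whatever is still open at end-of-document was never closed.
        self.errors.extend(f"unclosed <{t}>" for t in self.stack)

checker = TagBalanceChecker()
checker.feed("<div><p>hello<b>world</p></div>")  # the <b> is never closed
checker.close()
print(checker.errors)  # ['unclosed <b>']
```

A real validator checks far more (doctype conformance, allowed attributes, nesting rules), but even this level of checking catches the errors most likely to send a parser down a "detour."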
Since I make sure mine validate, this might explain why they do so well in those search engines.
I would still like to see a sector where the top ten in Google is populated predominantly by validated sites. Until then, I am concerned that validation could read as over-optimization.
I am starting to wonder if Google sees validation as SEO work and discounts the page.
Why should they? They want good sites at the top, not always strictly e-commercial ones, just the most relevant. The latter may be built by my grandma, and I doubt she would care about validation.
It would seem illogical to me if validation meant anything.
I am concerned as to it being overoptimization.
I'm curious to know what over-optimization is in this instance. If it's a highly competitive industry you are looking at in Google, there is a strong possibility that links are the determining factor here. ;)
Okay, so now we have 10 sites that all have strong inbound links. None of them validate. Let's take the worst one (in positions 5, 6 and 7) and validate it to see what happens. We'll remove all of the presentation markup and move it into an external stylesheet. In turn, that will reduce the HTML-to-text ratio of the page. In turn, that will boost other on-page factors. Which in turn should cause that page to possibly move up a notch or two. That's just a wild guess on my part. ;)
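The ratio argument above can be made concrete. Here is a rough sketch of measuring how moving presentation markup into an external stylesheet changes the proportion of visible text to markup; the helper function, the tag-stripping regex, and the sample snippets are all illustrative, not any standard metric.

```python
import re

def text_to_html_ratio(html: str) -> float:
    """Visible-text characters divided by total markup characters (rough heuristic)."""
    text = re.sub(r"<[^>]+>", "", html)       # strip tags (naive, fine for a sketch)
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    return len(text) / len(html)

# Inline presentation markup, 2005-style.
before = '<p><font face="Arial" size="2" color="#333333">Secret fishing spots</font></p>'
# Same content after the styling moves to an external CSS rule.
after = '<p class="body">Secret fishing spots</p>'

print(round(text_to_html_ratio(before), 2))  # 0.26
print(round(text_to_html_ratio(after), 2))   # 0.5
```

The content is identical in both versions; only the markup shrinks, so the page's text density nearly doubles. Whether any engine actually weighted that in 2005 is exactly what this thread is debating.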
Unfortunately that won't happen, unless of course it is one of your sites and you are following this topic and know what sector is being reviewed. ;)
If you think about it, the pages most likely to validate are those of web professionals, not the pages of people who are experts in their various sectors and are just trying to put up a page with the information. The only validation those people are likely to do is make sure it looks okay in whatever browser they use.
As a searcher, I don't care if a website on secret fishing spots has HTML that validates, I want a site where the secret fishing spots validate.
Google.com doesn't validate
>>>If you think about it, the pages that are most likely to validate are those of web professionals<<<
I think both of these statements prove my point. Only web professionals validate, so it might trigger an algo.
[edited by: texasville at 8:31 pm (utc) on Nov. 28, 2005]
Wow! That was a really quick experiment. Care to share with us the steps that you took? ;)
nope.
...oh ok. Any jumps in your ranking are more likely due to the many other factors in play. I don't believe it's worth using valuable processor time on such an intensive task as validating the web when downgrading pages with many broken links is not yet done. I have converted a very static PR5 HTML site, one that attracts many backlinks, to valid XHTML. The effect was as predicted: nothing. The same goes for a number of other sites.
Building professional, accessible websites with valid HTML requires skilled users. But the Web is not just about standards for those skilled users; it's more about the little guy who can compete with the big guns.
[edited by: Johan007 at 8:30 pm (utc) on Nov. 28, 2005]
I have converted a very static PR5 HTML site, one that attracts many backlinks, to valid XHTML. The effect was as predicted: nothing. The same goes for a number of other sites.
Not really a valid test, I think. Did you also correct the semantics in your markup? Or, did you just remove some presentation attributes to squeeze by the validation?
Validation is just one part of the equation.
Only web professionals validate, so it might trigger an algo.
Only web professionals are likely to run validation software, but that does not mean that other pages will not validate. Many of the WYSIWYG editors produce web pages that usually validate, as do many CMS programs. Not to mention, many of those web professionals work at authority sites that produce great content.
It would be just as stupid to trigger on validating HTML as it would be to trigger on pages that do not validate. And contrary to what you might read around here, Googlers ain't stupid.
Not really a valid test, I think. Did you also correct the semantics in your markup? Or, did you just remove some presentation attributes to squeeze by the validation?
Erm... please, who do you think you are? :)
No, it's not a scientific test. But it was a messy table-based website with invalid tags before, and it is now a WAI Double-A accessible, CSS-valid, tableless website with changeable CSS stylesheets. The markup was changed to use <li> elements for menus, plus h1, p, and acronym tags, the works! I could go on.
And yet it had zero effect on all SERPS.
What Google cares about for ranking is the information shown to the user. Whether the page validates is of the same importance as keywords in HTML comments: it is ignored.
As someone else mentioned, why waste the processor time validating when it is something that simply doesn't matter. There are a lot more telling factors to look at that would be nowhere near as expensive.
Texasville, I don't buy it. How can you get a rankings boost from not paying any attention to how your code looks? Doesn't make sense to me.
[google.com...]
Do you think that particular error would have any impact on ranking?
I do.
I would still like to see a sector where the top 10 is predominantly validated sites.
Since most sites don't validate, that would not make much sense and would likely be a huge exception.
I know you think it would make you feel better, but I think you should relax. It is HIGHLY unlikely google is handing out penalties for having a site that validates.
That being said, as others mentioned, I like to try to prevent the search engines from tripping up, but I have never tried to get a site to validate 100%. It's not as if a new browser comes out every day that I need to future-proof my site for right now. I will have plenty of time to respond and react if it is ever necessary to make adjustments for some new options or upgrades.
Bottom line is I think you are worried about nothing. You are not likely to find what you are looking for, because statistically speaking it is probably extremely rare. Sure, you could go through hundreds of thousands of top-ten rankings validating sites, but I would think that might be a colossal misuse of your time.