Does Google prefer validated pages?
W3C XHTML validation
flapane
msg:3876039
7:25 pm on Mar 21, 2009 (gmt 0)

Hi,
while having layout issues in some browsers, I decided to validate every page in order to rule out every possible problem.
I was wondering whether Google prefers a validated website over a non-validated one in the result rankings, or whether validation only serves to avoid glitches and strange rendering problems.

 

nomis5
msg:3876089
9:28 pm on Mar 21, 2009 (gmt 0)

I don't think Google knows, and I'm even surer that they don't care. If your page works in IE and Firefox you have 90% of the market.

flapane
msg:3876108
10:01 pm on Mar 21, 2009 (gmt 0)

I thought that maybe crawler bots could notice this.
Thanks anyway.

jbinbpt
msg:3876122
10:59 pm on Mar 21, 2009 (gmt 0)

I would think that any search engine would have an easier time with validated pages. The one thing we do know is that they give weight to proper use of the <h> tags.

If a page is not validating, why would you think it would be properly spidered? Proper code gives the spiders direction.

You should try and clean up the code as much as possible. Start at the top of the page. Validation errors cascade, so cleaning one up may resolve many.
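
A minimal sketch of the heading structure being described (titles invented for illustration):

  <h1>Page topic</h1>
  <h2>First section</h2>
  <p>Section text...</p>
  <h2>Second section</h2>
  <p>Section text...</p>

One h1 for the page topic, h2 for the sections beneath it, and so on down, rather than skipping levels or faking headings with styled text.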

flapane
msg:3876129
11:09 pm on Mar 21, 2009 (gmt 0)

Yeah, I couldn't figure out how much stuff had been nested after 10 years of modifications in some pages... there were even uppercase tags, proprietary tags and so on... and they were all hidden very well.

g1smd
msg:3876134
11:20 pm on Mar 21, 2009 (gmt 0)

Google says that validation is not a factor in either the scoring or the ranking of content pages.

However, do be aware that if your code contains certain types of coding errors, there is a possibility that at least part of the page content may fail to be spidered at all. If it isn't spidered, it can't be indexed or ranked.
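
A classic example of the kind of error being described, on a hypothetical page: a comment that never gets closed can swallow the rest of the content.

  <p>This paragraph gets spidered.</p>
  <!-- section divider -- >
  <p>This paragraph may not: "-- >" (note the space) never closed the
  comment, so a parser can read everything from "<!--" onward as one
  long comment.</p>

A validator flags this immediately, while a browser may still render something plausible, which is why it can go unnoticed.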

webtress
msg:3876210
4:36 am on Mar 22, 2009 (gmt 0)

Last I checked, Google's home page isn't valid; it would be like the pot calling the kettle black.

buckworks
msg:3876220
5:31 am on Mar 22, 2009 (gmt 0)

Fully valid code (or as close as you can realistically get) can't possibly hurt, and I believe there can be ranking benefits, at least indirectly.

Remember that one of the things Google notices is how other sites respond to you. Over time, the difference between providing a glitch-free user experience for "90% of the market" and aiming for 99% could quite conceivably result in one site acquiring a stronger link profile than another. That's not guaranteed, nothing is ever guaranteed in SEO, but considering that competitive SEO is often a game of inches, no advantage should be tossed aside lightly.

Never turn down a chance to do something better than the other guy.

flapane
msg:3876321
11:26 am on Mar 22, 2009 (gmt 0)

Probably the best answer, buck.
Thank you all.

daveVk
msg:3876322
11:47 am on Mar 22, 2009 (gmt 0)

"validation only serves to avoid glitches and strange rendering problems?"

If it causes rendering problems you will probably be aware of it; if it causes indexing problems, will you know? Minor violations are probably not an issue, but a missing end tag, for example, could easily result in the structure of the page being misinterpreted.
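
A small made-up example of that missing end tag:

  <div id="content">
    <p>Article text...</p>
    <!-- the </div> that should close #content was never written -->
  <div id="sidebar">
    <a href="/partners">partner links</a>
  </div>

With the end tag missing, an error-tolerant parser will typically decide the sidebar lives inside the main content block rather than beside it, so the structure a spider extracts is not the one the author intended.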

nealrodriguez
msg:3876422
4:59 pm on Mar 22, 2009 (gmt 0)

If you've got the time to validate, go for it; however, I have seen many sites ranking for competitive terms without coming close to validating. That doesn't mean you shouldn't validate; it may be the facet of webmastership where you can excel above the other guy.

flapane
msg:3876428
5:08 pm on Mar 22, 2009 (gmt 0)

I was already editing and converting HTML pages to XHTML, so I couldn't give up on validating all my pages too, but heck, it was such a long and boring job...

snickles121
msg:3876635
11:04 pm on Mar 22, 2009 (gmt 0)

For me, the easiest way to validate and fix validation problems in web pages was to use HTML Tidy. It only takes a few seconds per page to validate and fix errors with this program. You can use the free version of Eversoft and open your web pages in it; it has the HTML Tidy program built into it.
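
For anyone who prefers the command line, HTML Tidy also runs standalone; a rough sketch, with an invented file name (flags vary a little between Tidy releases, see tidy -h):

  tidy -q -e index.html

lists errors and warnings only, while

  tidy -q -m -asxhtml index.html

fixes what it can and rewrites the file in place as XHTML.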

I'm not sure how much weight Google places on validated or unvalidated pages (if you validate Google's homepage, they have many errors), but it can't hurt and mostly takes little effort.

g1smd
msg:3876663
12:03 am on Mar 23, 2009 (gmt 0)

Google are coding for speed of delivery and minimising their bandwidth usage.

They haven't got to consider being spidered and indexed by search engines.

You have.

webtress
msg:3876715
4:38 am on Mar 23, 2009 (gmt 0)

Validation should be done to ensure consistent rendering of pages across different browsers, not to impress Google. Obviously they couldn't care less.

BradleyT
msg:3877071
5:59 pm on Mar 23, 2009 (gmt 0)

For me it's just a matter of pride in a thankless job.

The content writer who just completed her degree in multimedia web design creates absolutely horrible markup that I always have to go and clean up. I told her she should ask for her money back and she thought I was joking :/

pageoneresults
msg:3877078
6:09 pm on Mar 23, 2009 (gmt 0)

No, Google prefers pages it can crawl. They don't have to be valid. There could be hundreds of errors and/or warnings and all will appear fine. It is the magnitude of those that counts. Some errors are somewhat fatal. If you can view cache, usually a broken piece of HTML that Google couldn't process will be evident in the information it was able to retrieve for cache. I don't see it too much, but when I do, it is a rather funky mistake that Google's error routines couldn't cope with.

Validating your pages just ensures that you don't have to worry about the above happening. At least you'll know all of your elements are in proper working order. That for me is one of the better feelings of doing this. I see those green checkmarks in my Developer Toolbar showing valid HTML/XHTML/CSS and I'm one happy camper.

I've also spent the last few years building a toolset that mimics the bots and browser. We've spent an untold number of hours performing updates to deal with FAILed syntax in web pages. When we first launched, we were getting errors almost every day because we didn't think about this, or that. After 2 years of that crap, I think we've finally got it! In the process though, I did learn a bit about how bots traverse code and what can make or break them. I've seen some pretty nasty HTML get processed by our bot. I'm sure Google is light years ahead of us in that area.
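
As a tiny illustration of why broken syntax is hard on a bot, a deliberately broken fragment (invented for this post):

  <b>bold run with no end tag
  <p>Is this paragraph bold too?</p>
  <table><tr>stray text<td>cell</td></tr></table>

An unclosed <b> may or may not be carried into the following paragraph, and the stray text sitting before the <td> is typically hoisted out of the table entirely; which answer you get depends on the parser's error-recovery rules, so two bots can easily disagree.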

flapane
msg:3877085
6:25 pm on Mar 23, 2009 (gmt 0)

Yeah, I agree, it is a thing of pride.
Now I am sure that if I click on the W3C logo on any page, it shows me the green check; it should be a nice thing for more experienced visitors, btw.
In doing this job, I finally understood why so many people prefer plain Notepad to a WYSIWYG editor, at least for this kind of thing.
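
For reference, the logo snippet the W3C hands out looks roughly like this (recalled from memory, so check the exact markup on validator.w3.org); the uri=referer part makes the validator check whichever page the visitor clicked from:

  <p>
    <a href="http://validator.w3.org/check?uri=referer">
      <img src="http://www.w3.org/Icons/valid-xhtml10"
           alt="Valid XHTML 1.0" height="31" width="88" />
    </a>
  </p>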

pageoneresults
msg:3886362
2:02 pm on Apr 6, 2009 (gmt 0)

Well, I guess you have your answer. Google doesn't care. And the SEO Gurus say it doesn't matter. That must mean that Google doesn't prefer validated pages. I'm off to break a few hundred thousand pages to see what happens. :)

flapane
msg:3886365
2:07 pm on Apr 6, 2009 (gmt 0)

Let us know the results :D
