
Do AAA, W3C validations help rank websites better?

     
7:03 am on Jul 7, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:July 6, 2009
posts: 58
votes: 0


Do AAA, W3C validations help rank websites better?
9:22 am on July 8, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:July 6, 2009
posts:58
votes: 0


Observation. Why do you think it does, other than "it should"?

From the way you answered, I just thought someone had sneaked you the Google algo.
9:58 am on July 8, 2009 (gmt 0)

Full Member

10+ Year Member

joined:Mar 31, 2005
posts:337
votes: 0


If you actually could *directly* improve your rankings by having valid code or a Bobby-approved site then don't you think Google would tell us that?

It would be such an easy thing to add to their 'Webmaster Quality Guidelines' and it's the sort of thing they'd love to do... "We reward good practice". I can hear Matt Cutts now.

They don't care. Google.com fails validation with 40 errors and 2 warnings. They're still using <font> tags!

If you work on having better code and accessibility then this might have an eventual, knock-on effect on rankings - for example because you get 'approval' links. But we're talking about the butterfly effect here - NOT a clear, reliable correlation between the two.

Can you seriously imagine an algorithm that demotes sites that fail validation? It counts the number of errors, decides how serious each one is, then adjusts ranking based on that? That's like saying that Google's algorithms would penalise for bad grammar and incorrect sentence structure.

2:52 pm on July 8, 2009 (gmt 0)

Moderator

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 9, 2001
posts:5730
votes: 92


Re bad grammar ...

No one knows all the things Google might be weighing and measuring and comparing, but bad grammar and sloppy sentence structure wouldn't exactly be "signals of quality", would they?

Re empty (null) ALT text:

Someone earlier grumbled that including ALT="" to please the validator just wastes characters.

That person needs to understand the practical value of ALT="" .

ALT="" tells a screen reader or braille display that the image does not add functional information to the page and can be safely ignored. That reduces the amount of superfluous information that a non-sighted user must wade through ... and that seriously improves the user experience for those users.

Null ALT text is suitable for images of lines, borders, decorative doodads, spacer images, images which are contained in a link which already contains text, and so on.
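
To make that concrete, here's a minimal sketch of the sort of thing meant (the file names are invented purely for illustration):

<!-- Purely decorative image: the empty alt tells assistive technology to skip it -->
<img src="fancy-border.gif" alt="" width="400" height="8" />

<!-- Image inside a link that already has text: alt="" stops the link text being announced twice -->
<a href="/widgets.html">Blue widgets <img src="arrow.gif" alt="" /></a>

<!-- Informative image: here the alt text should describe the content, not be left empty -->
<img src="widget-diagram.gif" alt="Exploded diagram of a blue widget" />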

3:36 pm on July 8, 2009 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 11, 2008
posts:1511
votes: 176


Twas me to whom you refer, but forsooth thou hast mistaken me.

I wasn't complaining about wasted characters; I was indicating it does nothing from an SEO perspective. I merely noted it marginally increased code bloat, but mostly as an aside.

Missed ALTs really do not do anything that I can detect, SEO-wise, at least at the moment. And that's what the OP was asking: do AAA and W3C compliance aid rankings? The answer is NO.

Is AAA and W3C compliance good practice? Usually.

Will a W3C-compliant page AVOID serious structural problems that depress rankings? Of course.

Will closing all your tags, avoiding <font> and target="_blank" increase your rankings? Not that I can detect, and certainly nothing Google themselves have even remotely suggested. And I'm sure they would - as noted by a previous poster, this is exactly the kind of stuff they would stick in their SEO guide.

For clarity, I'm not arguing that accessibility isn't a good aim in its own right. Only that W3C compliance is not a ranking factor.

5:23 pm on July 8, 2009 (gmt 0)

Full Member

10+ Year Member

joined:Mar 31, 2005
posts:337
votes: 0


bad grammar and sloppy sentence structure wouldn't exactly be "signals of quality", would they?

Well, that depends on a few things...

Is it user generated content or not? The majority seem to be borderline illiterate these days, and even those that aren't frequently use slang, txt speak and generally cut corners to express themselves.

Also, what sort of error are we talking about: 'incorrect according to a university professor' or 'would fail a junior school English exam'?

Much of the good advice I've read on forums was from people for whom English was a second language. Quality advice from peer-acknowledged experts, but still bad grammar and sloppy sentence structure.

8:11 pm on July 8, 2009 (gmt 0)

Preferred Member

10+ Year Member

joined:Apr 1, 2003
posts:438
votes: 0


Having read the often vociferous and scathing comments in this thread, a quick reminder folks: however right you think you are about something, it's still only your OPINION.

The only thing we can say for sure about any aspect of the Google algo... is that nobody knows for sure!

With that in mind, wouldn't it be a good idea to share opinions courteously and keep an open mind?!

Just a thought...

8:57 pm on July 8, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 27, 2001
posts:1138
votes: 0


Just a thought...

...well, no, because it results in wish-fulfilment fantasies based on unmitigated twaddle being discussed as though they were just as valid as battle-tested experience. But carry on, 'twas ever thus.

As a matter of interest, google.com = 40 errors, 2 warnings.

9:24 pm on July 8, 2009 (gmt 0)

Moderator

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 9, 2001
posts:5730
votes: 92


Not all validation errors or warnings are equally serious. None of Google's alleged errors are the sort that would stop a spider in its tracks; they're minor issues like ampersands in URLs, unquoted attributes, deprecated tags and the like.
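
For illustration only (these are invented snippets, not Google's actual markup), the kinds of minor issue being described look like this:

<!-- Raw ampersand in a URL: the validator wants &amp; -->
<a href="/search?q=widgets&hl=en">Search</a>

<!-- Unquoted attribute values: flagged as an error under an XHTML doctype -->
<img src=logo.gif width=100 height=50 alt="Logo">

<!-- Deprecated presentational markup: not allowed under a Strict doctype -->
<font size="2" color="#666666">Small grey text</font>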
9:58 pm on July 8, 2009 (gmt 0)

Preferred Member

10+ Year Member

joined:Apr 1, 2003
posts:438
votes: 0


I have often assumed W3C validation to XHTML 1.0 Strict, along with things like VeriSign certificates and Rackspace or other top-tier hosting, to be pretty good signs of quality for Google to separate the wheat from the chaff.

I've no scientific "battle-tested" (I thought this forum was about google SEO, not fighting the Taliban?!) proof, but it certainly hasn't done me any harm.

BTW, Stever, are you saying we can all benefit by being more narrow-minded and rude? Interesting idea.

[edited by: suggy at 10:10 pm (utc) on July 8, 2009]

9:59 pm on July 8, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 27, 2001
posts:1138
votes: 0


>>Not all validation errors or warnings are equally serious. None of Google's alleged errors are the sort that would stop a spider in its tracks; they're minor issues like ampersands in URLs, unquoted attributes, deprecated tags and the like.

Well, I agree entirely with you, buckworks, but the point (made initially in the question, and then with increasingly didactic answers to one's own question) was that validation (and AAA compliance) helps with ranking, which is as simplistic and misleading as saying that using correct grammar helps with ranking (a point you yourself commented on).

10:10 pm on July 8, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 27, 2001
posts:1138
votes: 0


>>BTW, Stever, are you saying we can all benefit by being more narrow-minded and rude? Interesting idea.

I was responding more to your whole post, but feel free to throw up as many straw men as you wish. I've always found that not eating cute Golden Retriever puppies on Fridays has never done me any harm with ranking, by the way.

To be serious, I construct well-formed pages because it avoids problems in different browsers. I write coherent prose on my sites because it attracts visitors.

Indirectly, those things may have an effect on my ranking. To say that validated or AAA-compliant code (or correct grammatical usage) has a direct effect on ranking is bluster of the highest order that can easily be disproved by experience, testing or simply taking the time to study the rankings.

But no, we should sit here and simper about what search engines might do and how AdSense earnings could be affected and how any one of 97 penalties could be applied to a website, with total disregard for reality. I'm sorry if that sounds rude (I would prefer blunt) but in days gone by these forums were less vacuous and 1000 times better for it.

8:13 am on July 9, 2009 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 11, 2008
posts:1511
votes: 176


If I've come across as rude, I apologise. But as stever suggests, being accommodating to bad theory does everyone a disservice, and is how webmaster myths get started.

Some things are opinion. Some things are verifiable. This is one of the latter. I invite everyone to test it.

Write some perfectly valid pages, get them to rank, then make some minor non-valid changes such as the following:
Add target="_blank" to link code
Use <font> tags
Drop alt="" from images

IT MAKES NO DIFFERENCE TO RANKING. REALLY, IT DOESN'T.
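
To spell out the sort of tweak meant above (an invented page, purely illustrative), here is the same markup before and after:

<!-- Valid under XHTML 1.0 Strict -->
<a href="/widgets.html">Blue widgets</a>
<img src="spacer.gif" alt="" />
<span class="offer">Special offer</span>

<!-- The "minor non-valid" version: fails Strict validation, but the content is unchanged -->
<a href="/widgets.html" target="_blank">Blue widgets</a>
<img src="spacer.gif" />
<font color="red">Special offer</font>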

Now, again, accessibility is a worthy aim in its own right. Any webmaster concerned with user experience should make this a priority. It certainly helps with conversion, which is one of the most important metrics any monetised site should focus on, along with traffic.

10:05 pm on July 9, 2009 (gmt 0)

Moderator

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 9, 2001
posts:5730
votes: 92


To say that ... has a direct effect on ranking is bluster of the highest order

Stever, it's just as much bluster for you to assert definitively that it has no effect.

The only thing we can state definitively is that falling short of full validation will not stop a page from ranking well if it does enough other things right.

then make some minor non-valid changes

I invite you to try the same experiment with some major non-valid changes, something drastic that could stop a spider in its tracks, and report back to us in a few weeks.

10:28 pm on July 9, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 27, 2003
posts:844
votes: 0


I agree: if the site's not validating and it's stopping spiders, then of course it's going to hurt your rankings.

However, I have yet to see a site rank any better for validating than a non-validating site (where the non-validating code does not affect spider crawls).

I've tested this myself on various sites and that's what I've come up with, so I no longer spend hours trying to fix something. I'll run a spider checker to make sure the page is readable and, if so, to heck with validation.

If Google isn't validating their own cheesy plain page, I'm certainly not going to worry about it.

And for those saying Google would tell you if it helped: yeah, right, and I saw the tooth fairy last night. Google might give generic examples but they will NEVER give exact examples, for obvious reasons.

Many people take whatever Matt Cutts (AKA GoogleGuy) says as the new testament, when really he is just a propaganda mouthpiece at this point.

10:50 pm on July 9, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member suzyuk is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Oct 1, 2002
posts:5199
votes: 0


I have a problem with statements like this:
If Google isn't validating their own cheesy plain page, I'm certainly not going to worry about it.

though don't take that quote personally; I've seen it in one way or another over the years...

because that's their own page, they are in a position to manipulate its rank in order to fulfil a self-satisfying prophecy, are they not? And don't forget its inbound links... copy their page and I bet you don't rank so well.

do not use a manipulated site (i.e. a very big one) to make your case

1:02 am on July 10, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 27, 2003
posts:844
votes: 0


do not use a manipulated site (i.e. a very big one) to make your case

OK, I just used them because it's ironic. Pick out any of the top sites in any industry and more often than not the ones ahead in the SERPs do not have validated pages.

There is not one shred of evidence that validated pages help you in the SERPs. Unless of course a page isn't validating because it can't be crawled, which is another problem altogether.

If however, people feel the need to validate their pages, then more power to them.

4:20 am on July 10, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


I believe it was in that long video from WordCamp that Matt Cutts said something like this [paraphrase]: "Normal people write code with errors, it just happens. 40% of all pages have syntax errors and Google can't afford to penalize these pages."

He went on to say that in the future they might look at including validation and accessibility signals into the algo, but the results that generates would still need to pass many tests -- and the end users would still provide the final measure.

My own preference is to validate as much as possible, and to understand why the code does not validate when it doesn't - being sure that the errors will not obstruct crawling. Sometimes putting out completely valid code is not practical in the present moment. Errors that you don't even know about are where the gremlins can hide.
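
A couple of hypothetical examples of the kind of gremlin that can do real damage, as opposed to a harmless validation nit:

<!-- Malformed comment below: depending on the parser, everything up to the next closing comment marker can be swallowed -->
<!-- start sidebar --
<div id="sidebar">Important internal links here</div>

<!-- Missing closing quote: the href value can run on and eat the markup that follows -->
<a href="/products.html>Products</a> <a href="/contact.html">Contact</a>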

8:14 am on July 10, 2009 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 11, 2008
posts:1511
votes: 176


I invite you to try the same experiment with some major non-valid changes, something drastic that could stop a spider in its tracks, and report back to us in a few weeks.

That's a straw man, and you should know it. Of course structural problems are an issue; no one has said otherwise. The question isn't "Do major flaws in coding affect ranking?"; the OP's question was:
Do AAA, W3C validations help rank websites better?

The logically equivalent question is: do non-validating pages automatically rank worse? Put a <font> tag in, invalidate your page, and the answer is clearly NO.

Now, here are some selected quotes from myself that I hope will clarify my position regarding the importance of structure and accessibility, versus the statement that validation does not matter.

AAA and W3C compliance give your site structure. Thus, it will rank better than unstructured sites. However, pure SEO also requires structure. It, too, will rank better than unstructured pages.

Pure SEO may not validate. Re-writing it to validate might make it worse, SEO-wise. It will NOT give additional ranking simply because it validates; Google is not a validator.


The ability to structure pages is, however, critical.

Is AAA and W3C compliance good practice? Usually.

Will a W3C-compliant page AVOID serious structural problems that depress rankings? Of course.


Now, again, accessibility is a worthy aim in its own right.

8:58 am on July 10, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:July 6, 2009
posts:58
votes: 0


What would you do if you owned your own search engine?
You would give a point for good structure.
If I owned one, I would do that.

So why would Google do something different?

10:35 am on July 10, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 29, 2002
posts:981
votes: 1


So why would Google do something different?

You're assuming that HTML expertise correlates with expertise in other areas (e.g. breeding budgerigars). Clearly this is not the case.

Websites are often run by people who have very little HTML knowledge but who do have in-depth expertise and knowledge about their area of interest. That's why Google would do something different.

Linus Torvalds' homepage does not validate. Does this indicate he's not an expert on Linus Torvalds, or just that he's not an expert on HTML?

11:17 am on July 10, 2009 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 11, 2008
posts:1511
votes: 176


What would you do if you owned your own search engine?
You would give a point for good structure.

Dammit, no one is arguing differently.

If you are unsure of how to structure a page, W3C validation shows you have done nothing wrong, though there is no certainty you have done everything right.

If you follow AAA, you will have a very nice page, with good structure and lots of helpful tags that spiders eat up. IT WILL UNDOUBTEDLY BE PRESENTING CONTENT IN A GOOGLE-FRIENDLY WAY. But it will NOT be getting extra points, over and above the very good points it will inherently receive.
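
For the sake of illustration (an invented page, not a recipe), this is the kind of structured, accessible markup being described:

<h1>Blue Widgets</h1>
<p>Hand-made blue widgets, shipped worldwide.</p>

<h2>Sizes and prices</h2>
<!-- summary and scope help non-sighted users navigate the table -->
<table summary="Blue widget sizes and their prices">
  <tr><th scope="col">Size</th><th scope="col">Price</th></tr>
  <tr><td>Small</td><td>5.00</td></tr>
</table>

<!-- Descriptive alt text and link text that makes sense out of context -->
<img src="blue-widget.jpg" alt="A small blue widget on a white background" />
<a href="/order.html">Order blue widgets</a>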

I feel this thread is degenerating due to an imprecision in terms, or a lack of willingness of participants to read and understand the contributions of others.

To be very pedantic, W3C compliance and AAA do not confer EXTRA points. However, they are an excellent aid to maximising the value of your content, and frankly anyone who thinks they help ranking should focus on them, as they will not have grasped the more subtle nuances of pure SEO (even just "whitehat" SEO).

Anyone who wants to push the envelope and try new things to increase their ranking SHOULD NOT WORRY IF THE PAGE DOES NOT VALIDATE. They should realise something is wrong when ranking and/or traffic tanks. But these are the experiences that improve your understanding, making you better in the long run.

10:44 pm on July 10, 2009 (gmt 0)

Moderator

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 9, 2001
posts:5730
votes: 92


they are an excellent aid to maximising the value of your content

I'd agree with that. IMHO that is a good way to put it.

they will not have grasped the more subtle nuances of pure SEO

Not so sure I'd agree with that, but it would depend what you mean by "subtle nuances of pure SEO". Care to elaborate?

That's straw man, and you should know it.

Nope, not a straw man, just an extreme case of what we're talking about.

imprecision in terms

Yes, that's a challenge.

In this context, the imprecision is that issues which keep a page from validating fully could vary from trivial to truly serious.

The answer to the OP's question, "Do AAA, W3C validations help rank websites better?" would depend a great deal on what condition the page was in to start with and what sort of errors got cleaned up on the way to validation.

SHOULD NOT WORRY IF THE PAGE DOES NOT VALIDATE

Yes ... as long as they understand WHY it isn't validating and have cleaned out any true "gremlins".

7:32 am on July 11, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 27, 2001
posts:1138
votes: 0


If we are going to expand the discussion into whether validation, accessibility or decent grammar and linguistics are intrinsically "good" or "bad" for a site, I don't really think that there is any discussion to be had, since just about everyone here would agree.

As far as those relate directly to SEO, though, I would argue (with tongue lodged firmly in cheek) that the habit of people such as the original poster of placing sitewide external links to the validation and accessibility websites might have far more direct influence on their rankings than their beloved well-formed (x)html.
