However, there are degrees of non-validation: missing alt attributes in img tags is one thing (essentially a non-issue),
whereas unclosed tags are something else altogether (seriously bad), and there is a huge sea in between the two.
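To make the contrast concrete, here's a made-up snippet showing both kinds of error (filenames and text invented for illustration):

    <!-- Minor: the img tag is missing its required alt attribute -->
    <img src="logo.gif" width="100" height="50">

    <!-- Serious: the <b> tag is never closed, so every browser has to
         guess where the bold ends - and they may guess differently -->
    <p>Welcome to our <b>site</p>
    <p>This paragraph may render bold in some browsers.</p>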
My AdSense and Amazon code are just full of errors when I validate them, especially the Amazon code.
AdSense: the tags it supplies don't all validate as properly closed.
Amazon: there is a character in the code that is producing a lot of errors in the validation.
So is it OK to leave them like that?
So missing img alt tags are a non-issue? What about all those visually impaired people sitting there listening to [spacer dot gif] [spacer dot gif] over and over until their ears bleed?
I do, however, agree that although validation should be strived for, it isn't the end of the world if it isn't achieved (in most cases).
Doesn't give any of us smaller hominins a reason not to care.
Also possible, as topr8 hints, that those sites have employed PhD geeks to insert exactly the right bugs for various purposes.
That's not a reason to insert bugs at random. You can only know whether a bug matters by doing extensive testing.
Best to treat HTML bugs exactly as you would any other typo: if you really mean to spell a word wrongly, do it. But that's not a reason to be illiterate.
Google is never going to choke on a site because an image is missing an alt attribute - but there are plenty of circumstances where a page might render perfectly in IE but be garbage to the bots.
An example - putting your meta tags and title in the <body> instead of the <head>.
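A sketch of that mistake and its fix (page content invented for illustration):

    <!-- Broken: the bots may never see this title and description -->
    <html>
    <body>
    <title>Widget Catalogue</title>
    <meta name="description" content="Hand-made widgets">
    ...

    <!-- Correct: the title and meta tags belong inside the head -->
    <html>
    <head>
    <title>Widget Catalogue</title>
    <meta name="description" content="Hand-made widgets">
    </head>
    <body>
    ...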
Play it safe and make your code validate. FWIW, AdSense code seems to validate fine. Amazon and other affiliate links tend to use raw & characters in their URLs; replace each & with the &amp; entity and it should validate.
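Something like this - the URL below is made up, not a real affiliate link:

    <!-- Invalid: raw ampersands in the query string -->
    <a href="http://www.example.com/exec/obidos?tag=mysite&camp=1234">Buy it</a>

    <!-- Valid: each & escaped as &amp; (the browser still sends a plain &) -->
    <a href="http://www.example.com/exec/obidos?tag=mysite&amp;camp=1234">Buy it</a>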
Other than that, there is no search engine bonus given for validating (currently at least). There is no negative score for using a non-standard attribute, for instance.
There are hidden benefits to validation. The discipline involved tends to extend into many areas of development - even spelling and grammar in the content and copy. It imparts a rigor and disdain for sloppiness that serve you very, very well. And more and more over time, by learning what is and isn't valid, your mark-up and page templates, your designs in general, tend to get better and better.
That sounds really counter-intuitive, especially if you've just been through the rigmarole of validating an old site for the first time. But think of it this way:
Whenever your markup strays from the standard, the browser has to guess what you meant. By and large, most common browsers tend to make the same guesses, but that's not always the case, so you have to test your page in each one. So let's say you have 100 pages to check, each with 10 moderate markup errors, and you're testing in IE 5, IE 5.5, IE 6, Firefox, Safari, Netscape 4.x, Opera and Lynx. You've just burdened yourself with 8000 checks. Don't forget you'll now have to repeat those checks every time you make a change, just so you can be reasonably sure you haven't really screwed up your pages.
Suppose you were going for valid markup all along. You'll find that browser behaviour is suddenly more consistent, so you need to check far fewer browser versions. You'll find that when you make a change, you only need to check the validator, and maybe one or two browsers, before you're confident that nothing broke.
In short, the validator is a guide. It'll hold your hand, reassure you, and keep you from looking silly (after all, nothing screams amateur more than markup appearing on the page, or a style vanishing abruptly half way through a paragraph).
Very well put. I'm also noticing that with that rigour comes steadily decreasing page HTML size; it just keeps dropping, and I seem to be averaging about 4 kB or less per page now. Not as a result of validation itself, but I think validation does just what tedster says: it decreases the randomness factor and makes you more and more in control of what you are creating. More control seems to translate to tighter code from what I see.
The debugging benefits asquithea notes are also not trivial: only once a page is known to be valid can you be sure that a rendering error is the result of a browser bug. Knowing this can save you hours, if not days, of debugging time. It also teaches you what the browser bugs are, so you can avoid them in the future.
I'm also seeing very tightly coded pages rank extremely well, probably because the content starts almost immediately - though it's hard to say for sure.
Of course, the downside of that disdain is that I can't work with bad code any more; I always have to do a full rewrite.
steadily decreasing page HTML size
I recently had quite a thrill. I designed a website for a client - HTML 4.01 Strict - and when they approved the template, I laid in the content for the home page and linked up all the graphics for that page... and the total page weight, including images, CSS and all, was just under 9 kB. Oh my, am I getting to be a code-anorexic?
I threw in a nifty js rollover effect at that point, because there was plenty of room to play!
Ah tedster, glad you're still at the top of your game ... I've always considered you the uncrowned king of tight coding!
>>What about all those visually impaired people sitting there listening to [spacer dot gif] [spacer dot gif] over and over until their ears bleed?
Haha, good point fwordboy. Two things to say about that ...
(1) What! Are people still using spacer.gif? Because if they are, then valid code is way down the list of their problems :)
(2) I admit to never having heard a talking web browser, or whatever software is used in that case; however, I would think the software is to blame if it cannot recognise spacer graphics and starts reading their filenames aloud - solutions should be real-world, not ideal-world.
Yet another reason to use valid code: in this case, the valid code is alt="", which tells the screen reader to say nothing at all and simply skip the image.
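For example (filenames invented for illustration):

    <!-- Decorative image: the empty alt tells the screen reader to skip it -->
    <img src="spacer.gif" width="10" height="1" alt="">

    <!-- Meaningful image: the alt text carries the content -->
    <img src="q2-chart.gif" alt="Sales rose 20% in Q2">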
Re page size: I'd say one test that you're getting better at this stuff is when you can create a page that is more complex than your older pages, yet has a significantly smaller file size.