Forum Moderators: open
http://www.aol.com [htmlhelp.com]
http://www.yahoo.com [htmlhelp.com]
http://www.msn.com [htmlhelp.com]
http://www.microsoft.com [htmlhelp.com]
http://www.lycos.com [htmlhelp.com]
http://www.ebay.com [htmlhelp.com]
http://www.disney.com [htmlhelp.com]
http://www.excite.com [htmlhelp.com]
http://www.google.com [htmlhelp.com]
http://www.amazon.com [htmlhelp.com]
Is there some technical reason not to validate?
Is "laziness" a technical issue?
I personally have one feature (a set of <layers> tags) on each of my employer's pages that doesn't validate in HTML4... and I put it there so my CSS page layout functions properly in Netscape 4. I'd bet Netscape 4 is a big reason for non-validation in any site that uses anything more complex than plain-vanilla HTML/XHTML.
I also don't have a DTD specified... as I mentioned above: laziness. I wasn't aware of the significance of the declaration when I first built the site, and I don't want to go through the hassle of updating all the non-conforming pages unless I have to...
Lots of things are required for rigorous coding that don't amount to anything at all in most major browsers, but they do add to the download time.
However, your point is well made. I noticed that the AOL home page even has some bad <td> tags, unclosed font tags and bad nesting! Now that's just plain sloppy.
Like I thought, I failed. I didn't even get an "F". Man, sometimes making cross-browser compatible sites sucks!
Will the logos displayed from W3C influence the decision of the Yahoo editors? Who knows, but I want every possible thing in my corner. Will proper code actually make the site sell better? Probably not, if the page displays properly in all browsers, but I don't want to have to worry about that aspect. Will boosting my ego by seeing those logos put money in my pocket? Never. But I sleep better at night.
Take this little typo, which caused another person to lose hair from scratching their head for too long. Instead of a <TR> tag, they had typed <TH>, which caused the entire table to display the font in bold. They looked at their code for over a week and finally gave up and decided just to live with it. (??) I was looking at their page for another reason, but they asked me simply how this could happen. So, I looked at the source and just happened to spot the TH. A quick run to [validator.w3.org...] would have saved them hours of frustration and maybe a little Propecia.
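A reconstruction of that typo (the exact markup is my guess, but the mechanism is standard HTML: <th> is a header cell, and header cells render bold by default -- browser error recovery varies, but in this poster's case the whole table went bold):

```html
<!-- The typo: a <th> (header cell) where a <tr> (row) was intended. -->
<table>
  <th><td>This text shows up bold</td></th>
</table>

<!-- What was meant: -->
<table>
  <tr><td>Normal weight text</td></tr>
</table>
```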
ALWAYS VALIDATE!
also, many html editors use colour codes to make errors easier to spot - some people might rely on this, combined with browser preview.
Those sites may even use an in-house (or alternative) validation system which is perfectly adequate for all practical uses but does not check to the same level as the validator you used. The one you used has spotted errors but they may be totally insignificant.
On the other hand, they may have used the same validator as you; found the errors, and decided they were of no consequence.
is there any evidence that a DTD reference is actually necessary?
Why bother validating your page as long as it works and looks OK in the major browsers??
Who cares? Do you really think asking Joe Bloggs not to shop at amazon because the site doesn't comply to W3C standards is going to have any effect??
Even the people that hire designers to design the sites aren't going to care 'cos they don't have the technical knowledge in the first place.
leftmargin="0" topmargin="0" marginwidth="0" marginheight="0"
in the <body> tag. Is this due to the "" around the attrib? This is what it said was wrong...
Error: there is no attribute LEFTMARGIN for this element
Other than that I missed some ALT="" attributes.
>> Why bother validating your page as long as it works and looks OK in the major browsers?
>> Who cares? Do you really think asking Joe Bloggs not to shop at amazon because the site doesn't comply to W3C standards is going to have any effect?
Personal thing here; like netcommr stated, it makes me feel like I did my part right.
Brian
the second reason is that if the site validates you can be pretty sure that you aren't going to get nasty surprises when the client's brother checks it out using Netscape 2 on an Atari
there isn't any point worrying about getting 100% compliance with any given validator... as long as you are making a conscious decision to use invalid markup for a specific purpose
as for appearance being the only thing that matters...I think that is totally wasting the possibilities the medium offers
GWJ-The issue is that if you look at the HTML spec, there isn't a leftmargin (etc) attribute for the BODY tag. You did it right by quoting the attribute values, but that doesn't help if there isn't an attribute.
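For what it's worth, the usual standards-friendly substitute is a CSS rule instead of the proprietary attributes (a sketch, not a guaranteed drop-in fix; as noted elsewhere in this thread, Netscape 4's CSS support is spotty, so test before you drop the old attributes):

```html
<!-- Instead of <body leftmargin="0" topmargin="0" marginwidth="0" marginheight="0"> -->
<style type="text/css">
  /* margin covers IE's leftmargin/topmargin; padding covers what
     Netscape's marginwidth/marginheight were doing */
  body { margin: 0; padding: 0; }
</style>
```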
Wrong question. What you should ask is "Is there some technical reason TO validate?"
There is a perception (right or wrong) that it takes more effort to produce perfect code that validates. Since the imperfect code works well enough (I know that's not actually true all the time, but it is the perception, and it is true MOST of the time), perfect code is seen as a waste of time (everyone familiar with the 80/20 Rule?)
If browsers would refuse to display sites written in dodgy code, then validation would quickly become important.
>> The only thing you can do is make your code meet the standard and hope that they wrote the search engine spider to do the same
Or you can use the far more popular "That'll do" method :)
If the standard didn't keep changing, maybe people would stick to it. Also, it has to be easier to produce perfect code than to just turn out dross. Otherwise, due to feckless, thoughtless, worthless human nature, no-one will bother. You MUST make it at least as easy to produce perfect code as it is to produce any old rubbish, or it'll never happen.
Currently, to get hold of a copy of the standard documentation, you actually have to make an effort to:
1) Go and find it
2) Then read and understand it
3) Then apply it
This is far too much to demand of most authors, because they do it part-time, or as a hobby, or can't or won't spare the time to learn how to do it right. For those of you who have already made the effort, or have the sort of brain that doesn't mind, great. For everyone else, forget it
Also, a significant number of designers just flat-out don't know how to do it, or even that they can
I've seen perfectly validated sites that look terrible in certain browsers... The sites mentioned, for the most part, look good no matter what you view them with.
The truth is... you can spend hours making sure your code validates, and all you're going to end up with is a much larger HTML file that does exactly the same thing it did before.
That in mind... I don't think the designers are red faced at all... I think they've got a nice addition to their resume and a well designed, functional, website to give their client ;-)
GWJ: leftmargin="0" topmargin="0" marginwidth="0" marginheight="0" comes up as an error, but Netscape doesn't read leftmargin and topmargin - you need marginwidth and marginheight for the page to look good in Netscape. Don't bother; I always use it and have no problems with either Yahoo or Looksmart.
I think if browsers did comply to standards, so would designers, but you can really get away with bad html and it still looks good in browsers...
I didn't want to imply that I don't test my pages by actually viewing them in all 3 browsers; I always do that. When writing the code I use IE just because it's easier for me, old habit. But I always test it in NS4.6, 4.7, 6, IE 5.5, Opera 5, and a glance in WebTV viewer (always ugly). I bet those who have only seen the web on WebTV and then finally get a PC are just amazed. I figure I have most cases covered by those choices. I use both NS 4.6 and 4.7 because they render quite differently in many cases.
As for why to use a DTD statement: this was not so important in the past, when all you basically had was html. But now, with wireless formats and new markup such as xml, you need to declare what your code is. In the future this will probably be a requirement for the document to even display in some viewers. Here are a few snippets from the standards...
"Markup declarations can affect the content of the document...examples are attribute defaults and entity declarations"
"Processors may signal an error if they receive documents labeled with versions they do not support"
"provide a grammar for a class of documents"
Although the above basically is for xml, you get an idea of the purpose and why they are needed. It is a good idea to get into the habit of adding them now before you absolutely have to.
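For reference, here is what adding one looks like in practice -- these two declarations are the W3C's own published DOCTYPEs, so you can copy them verbatim as the first line of a page:

```html
<!-- HTML 4.01 Transitional -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
  "http://www.w3.org/TR/html4/loose.dtd">

<!-- XHTML 1.0 Transitional -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
```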
------------
TallTroll, your statement I quoted below kind of rubs me the wrong way.
"This is far too much to demand of most authors, because they do it part-time, or as a hobby, or can't or won't spare the time to learn how to do it right. For those of you who have already made the effort, or have the sort of brain that doesn't mind, great. For everyone else, forget it
Also, a significant number of designers just flat-out don't know how to do it, or even that they can"
Well, for those that can't, maybe they need to find another job. And for those that won't learn, I hope they do something else. Should the standards be lowered just so everyone can do it? That just lowers the potential and quality of the internet. Would you want to get on an operating table and have a doctor who just barely passed do his thing on your heart? Or would you get on a plane worked on by an ignorant person? I know these are extreme examples, but I hope you get my point. Don't just get by in life, man; go for the brass ring.
Personally, I don't want the net to just be a bunch of pages with everything centered down the middle of the page with cute bunny rabbits running across the page and way too many little animated gifs. Making things a little tougher does make those that can usually strive a little harder which produces quality and achieves a higher level of greatness. I want to see what is possible, not just what can be easily done. If those in the past had only done what was easy, you would not have a car to drive. In fact you would not even have computers or an internet.
well yes...but that's the designers that keep asking about problems with their sites each time a new generation of browsers arrive
I prefer to design for the web since I can't be bothered to spend every waking hour learning about each new browser version when it arrives
YMMV
This is simply not the case. Of course it is good to get as close to standards as possible, but sometimes you have to bend the rules in order to achieve your objective. Ensuring that every tiny detail passes the validator doesn't mean that your pages are fantastic, nor does it mean that you are guaranteed a number one spot in the search engines.
I'm not saying that validating your pages is a bad idea - I think that's great! However, there are some people who are more concerned about getting their pages looking good in all browsers and pulling in the punters first. If they have time then they might run it through a validator just to make sure the code is OK.
As far as why Google doesn't validate (saw that in another thread), I know that with 100+ million searches per day, we try to save every byte we can. Having a lean home page saves us moolah.
I don't think it's optional whether you validate a site or not. I do think it's optional on whether you take the advice of the validator for specific reasons.
In Google's case, making their home page validate would add approximately 100 to 200 bytes to their output. Multiply that by 100 million, and yes, that does turn into a huge amount of bandwidth for no obvious advantage. The things on their home page where they didn't follow the rules were fairly minor--missing DTD, some unquoted attributes, a missing alt attribute, and some missing type attributes.
But realize that Google is in a unique position on the web. As the #1 search engine, they don't have to worry about getting their site indexed by search engines.
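For a rough sense of scale, a back-of-the-envelope calculation (the 150-byte figure is just the midpoint of the 100-200 byte estimate above, and the traffic number is the thread's "100+ million searches per day" -- assumed inputs, not real measurements):

```python
# Back-of-the-envelope estimate with assumed numbers, not Google's real figures.
extra_bytes_per_page = 150           # midpoint of the 100-200 byte estimate above
searches_per_day = 100_000_000       # "100+ million searches per day"

extra_bytes_per_day = extra_bytes_per_page * searches_per_day
extra_gigabytes_per_day = extra_bytes_per_day / 1_000_000_000
print(extra_gigabytes_per_day)       # 15.0 -- about 15 GB of extra transfer per day
```

Tiny per-page, but it adds up to real money at that volume, which is the point being made.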
Every time I eliminate non-validating code, I've just struck one small blow against the browser incompatibility problems.
I use the built-in validator in Homesite, which is great for letting me customize the validation criteria. I run it on every page while I work as a matter of habit, even if I'm just making small tweaks. And I especially validate before I save the page. (I dig this validator because it will check unsaved pages right in the middle of a workflow -- a nice feature)
This habit saves me lots of aggravation by catching little bugs, cut-and-paste errors, etc., before they get wrapped into my page and forgotten -- and especially before I start screaming about "those buggy browsers!"
Of course, my pages don't validate either, because ad networks and statistical monitoring services prohibit me from fixing the errors in their code or porting it to XHTML. Complaining about it just gets me an "It works right now, so who cares about next week?" response.
>> The only thing you can do is make your code meet the standard and hope that they wrote the search engine spider to do the same. If that doesn't scare you into doing it right, nothing will.
(-Xoc 10-24-2001)
That's a good point that needs emphasizing. More people do need to think in terms of a spider being a browser.
Just as the browser downloads and parses the page content to display to you - so does the spider download content to "display" to the indexer.
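To make that concrete, here is a toy link extractor in Python standing in for a spider (the malformed sample markup is hypothetical). Notice how an unquoted href containing a space silently loses half the URL -- a browser might paper over this, but a parser takes what the markup actually says:

```python
from html.parser import HTMLParser

class LinkSpider(HTMLParser):
    """Toy link extractor, a stand-in for a search-engine spider."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

spider = LinkSpider()
# Unquoted attribute values end at the first space, so the first
# href is truncated to "/products" and "list.html" is lost.
spider.feed('<a href=/products list.html>Products</a> '
            '<a href="/about.html">About</a>')
print(spider.links)
```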
>Is there some technical reason not to validate?
I've had this site validated many times. Where did I botch it? While working around IE6 errors caused by the doctype.
Then again, maybe they don't validate correctly ;)
Can anyone shed light on other spiders that may have trouble with XHTML links? Or even other code types (XML, or the swanky new Flash link indexing etc)
<devils advocate>
This also raises an interesting question in my mind:
If you were running an SE, would you emphasise indexing for quality (making the spider conform to published standards, and reject poorly coded sites), or quantity (spend time figuring out how to get it to read content and links, no matter how badly scrambled... perhaps at the expense of the comparatively small number of well coded sites that DO validate, due to lack of dev time)
Google's home page currently claims an index of 1,610,476,000 pages. Comments?
</devils advocate>
They switched back, Xoc, because of so much upset [webmasterworld.com...]
They really took it on the chin over that. It threatened to overshadow XP entirely.
I'll post the link from that thread here, too, [opera.com ], because it is relevant here.
The irony of Microsoft's claim to standards-support is complete when you check the MSN.com site for compliance with the XHTML standard. Anyone can go to the W3C's standards validation service at [validator.w3.org...] and type in www.msn.com. The document returned demonstrates clearly that not a single document on their site adheres to W3C specifications, and many of their documents do not use XHTML at all, e.g. [careers.msn.com...]