Embarrassing that major sites don't validate
Xoc

Msg#: 1150 posted 9:49 pm on Oct 23, 2001 (gmt 0)

Go to [htmlhelp.com] and validate the home page of each of the 10 top web properties, listed below. The results are embarrassing. Not a single one of these sites validated without a glitch. Most do not have a DTD specified (a sample declaration follows the list). What is so hard about validating that these sites can't do it? Is there some technical reason not to validate? Someone explain it to me.

http://www.aol.com [htmlhelp.com]
http://www.yahoo.com [htmlhelp.com]
http://www.msn.com [htmlhelp.com]
http://www.microsoft.com [htmlhelp.com]
http://www.lycos.com [htmlhelp.com]
http://www.ebay.com [htmlhelp.com]
http://www.disney.com [htmlhelp.com]
http://www.excite.com [htmlhelp.com]
http://www.google.com [htmlhelp.com]
http://www.amazon.com [htmlhelp.com]
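
For reference, the DTD declaration these pages omit is a single line at the top of the document. A minimal example for HTML 4.01 Transitional, the flavor most of these home pages would target:

  <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
      "http://www.w3.org/TR/html4/loose.dtd">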

 

mivox

Msg#: 1150 posted 9:56 pm on Oct 23, 2001 (gmt 0)

::chuckle::

Is there some technical reason not to validate?
Is "laziness" a technical issue?

I personally have one feature (a set of <layers> tags) on each of my employer's pages that doesn't validate in HTML4... and I put it there so my CSS page layout functions properly in Netscape 4. I'd bet Netscape 4 is a big reason for non-validation in any site that uses anything more complex than plain-vanilla HTML/XHTML.
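
A hypothetical sketch of that kind of dual markup (the attribute names follow Netscape's 4.x documentation; the content is invented): the proprietary <layer> element positions content for Netscape 4, a CSS-positioned <div> does the same job everywhere else, and no HTML 4 DTD will accept the <layer>:

  <layer left="0" top="0" width="150">
    <!-- navigation seen only by Netscape 4 -->
  </layer>
  <div style="position: absolute; left: 0; top: 0; width: 150px;">
    <!-- the same navigation for CSS-capable browsers -->
  </div>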

I also don't have a DTD specified... as I mentioned above: laziness. I wasn't aware of the significance of the declaration when I first built the site, and I don't want to go through the hassle of updating all the non-conforming pages unless I have to...

Xoc

Msg#: 1150 posted 10:17 pm on Oct 23, 2001 (gmt 0)

I could understand if it was some buried page way down in the site, but we're talking about the home page of the site!

MaliciousDan

Msg#: 1150 posted 10:19 pm on Oct 23, 2001 (gmt 0)

This validator seems to work about as well as most of the others I've seen... and that's not a good thing. It's not that they can't spot errors; it's that they flag "errors" that aren't actually wrong. Maybe things would be different if MS and NS rendered pages the same way...

tedster

Msg#: 1150 posted 10:30 pm on Oct 23, 2001 (gmt 0)

Sometimes on a very "fat" page, I've trimmed all kinds of things from the code that are, strictly speaking, a requirement. Alt attributes for every image. Quote marks around alphanumeric attributes.

Lots of things are required for rigorous coding that don't amount to anything at all in most major browsers, but they do add to the download time.
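
For instance (an illustrative tag, not taken from any of the sites named), the rigorous and the trimmed versions of one image:

  <img src="logo.gif" alt="Acme Widgets" width="120" height="60">
  <img src=logo.gif width=120 height=60>

Both render identically in the major browsers, but only the first validates - and strictly it's the missing alt that HTML 4 treats as an error; unquoted simple values like these are legal until you move to XHTML.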

However, your point is well made. I noticed that the AOL home page even has some bad <td> tags, unclosed font tags and bad nesting! Now that's just plain sloppy.

caine

Msg#: 1150 posted 10:48 pm on Oct 23, 2001 (gmt 0)

I generally go for making sure the page is accepted in IE 4+, NN 3+ and Opera. As for W3C validation, I would like to see someone who writes to that spec and does well with the search engines, given that the validator itself is a pretty crude search engine.

TheLynxEffect

Msg#: 1150 posted 11:31 pm on Oct 23, 2001 (gmt 0)

I don't think I even want to validate my site. Oh well, here goes.....

.......

.......

Like I thought, I failed. I didn't even get an "F". Man, sometimes making cross-browser compatible sites sucks!

netcommr

Msg#: 1150 posted 12:42 am on Oct 24, 2001 (gmt 0)


I just published a site today. It's a small project I started last Thurs., just a 10-pager for a client to target a specific area (set of terms) of his market. I tried that validator posted above and used the "Entire Site" option. Guess what: no errors on any page. Why? Because I do this as standard practice. This site MUST be accepted by Yahoo and Looksmart or I don't get paid. I want the HTML and CSS logos from W3C on that site when I submit it to the directories. I take pride in the pages I put on the web. I never want to hear I did a poor job, at least from a coding standpoint.

Will the logos displayed from W3C influence the decision of the Yahoo editors? Who knows, but I want every possible thing in my corner. Will proper code actually make the site sell better? Probably not, as long as the page displays properly in all browsers, but I don't want to have to worry about that aspect. Will boosting my ego by seeing those logos put money in my pocket? Never. But I sleep better at night.

Take this little typo, which caused another person to lose hair from scratching their head for too long. Instead of a <TR> tag, they had typed <TH>, which caused the entire table to display its font in bold. They looked at their code for over a week, finally gave up, and decided just to live with it. (??) I was looking at their page for another reason, but they asked me simply how this could happen. So I looked at the source and just happened to spot the TH. A quick run through [validator.w3.org...] would have saved them hours of frustration and maybe a little Propecia.
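
A hypothetical reconstruction of that typo (the table contents are invented for illustration):

  <table>
    <th>                    <!-- typo: should be <tr> -->
      <td>January</td>
      <td>$1,200</td>
    </th>
  </table>

Browsers of the day quietly error-corrected markup like this, leaving content in a header cell - which renders bold - while a validator would have flagged the misplaced <th> on the first pass.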

ALWAYS VALIDATE!

stavs

Msg#: 1150 posted 1:29 am on Oct 24, 2001 (gmt 0)

You can run a page through the validator and it will tell you if there are errors in your code - BUT it won't tell you how the page will look in IE and Netscape. I assume most people use browser preview to spot errors before publishing.

Also, many HTML editors use colour coding to make errors easier to spot - some people might rely on this, combined with browser preview.

Those sites may even use an in-house (or alternative) validation system which is perfectly adequate for all practical uses but does not check to the same level as the validator you used. The one you used has spotted errors but they may be totally insignificant.

On the other hand, they may have used the same validator as you; found the errors, and decided they were of no consequence.

Is there any evidence that a DTD reference is actually necessary?

knighty

Msg#: 1150 posted 2:10 pm on Oct 24, 2001 (gmt 0)

Just to play devil's advocate here...

Why bother validating your page as long as it works and looks OK in the major browsers?

Who cares? Do you really think asking Joe Bloggs not to shop at Amazon because the site doesn't comply with W3C standards is going to have any effect?

Even the people that hire designers to build the sites aren't going to care, because they don't have the technical knowledge in the first place.

GWJ



 
Msg#: 1150 posted 2:55 pm on Oct 24, 2001 (gmt 0)

I ran my pages through just to see; it really did not like:

leftmargin="0" topmargin="0" marginwidth="0" marginheight="0"

in the <body> tag. Is this due to the "" around the attrib? This is what it said was wrong...

Error: there is no attribute LEFTMARGIN for this element

Other than that I missed some ALT="" attributes.

Why bother validating your page as long as it works and looks OK in the major browsers?
Who cares? Do you really think asking Joe Bloggs not to shop at Amazon because the site doesn't comply with W3C standards is going to have any effect?

Personal thing here, like netcommr stated, makes me feel like I did my part right.

Brian

Eric_Jarvis

Msg#: 1150 posted 3:18 pm on Oct 24, 2001 (gmt 0)

knighty...the first reason for running a site through a validator is to pick up typos and markup mistakes...this is extremely useful

the second reason is that if the site validates you can be pretty sure that you aren't going to get nasty surprises when the client's brother checks it out using Netscape 2 on an Atari

there isn't any point worrying about getting 100% compliance with any given validator...as long as you are making a conscious decision to use invalid markup for a specific purpose

as for appearance being the only thing that matters...I think that totally wastes the possibilities the medium offers

Xoc

Msg#: 1150 posted 3:42 pm on Oct 24, 2001 (gmt 0)

Remember that every stupid little search engine spider out there is really another web browser. Not Netscape, not IE, not even Opera. It's a totally different web browser. How do you know that your HTML looks good to it??? You can't look to see what it is seeing. The only thing you can do is make your code meet the standard and hope that they wrote the search engine spider to do the same. If that doesn't scare you into doing it right, nothing will.

GWJ - The issue is that if you look at the HTML spec, there is no leftmargin (etc.) attribute for the BODY tag. You did right by quoting the attribute values, but that doesn't help when the attribute itself doesn't exist.
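
For what it's worth, the validating route to the same zero margins is CSS; a minimal sketch (Netscape 4's patchy CSS support is the usual reason people keep the proprietary body attributes as well):

  <style type="text/css">
    body { margin: 0; padding: 0; }
  </style>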

TallTroll

Msg#: 1150 posted 4:15 pm on Oct 24, 2001 (gmt 0)

>> Is there some technical reason not to validate? Someone explain it to me.

Wrong question. What you should ask is "Is there some technical reason TO validate?"

There is a perception (right or wrong) that it takes more effort to produce perfect code that validates. Since the imperfect code works well enough (I know that's not actually true all the time, but it is the perception, and it is true MOST of the time), perfect code is seen as a waste of time (everyone familiar with the 80/20 rule?).

If browsers would refuse to display sites written in dodgy code, then validation would quickly become important.

>> The only thing you can do is make your code meet the standard and hope that they wrote the search engine spider to do the same

Or you can use the far more popular "That'll do" method :)

If the standard didn't keep changing, maybe people would stick to it. Also, it has to be easier to produce perfect code than to just turn out dross; otherwise, given feckless, thoughtless, worthless human nature, no-one will bother. You MUST make it at least as easy to produce perfect code as it is to produce any old rubbish, or it'll never happen.

Currently, to get hold of a copy of the standard documentation, you actually have to make an effort to:

1) Go and find it
2) Then read and understand it
3) Then apply it

This is far too much to demand of most authors, because they do it part-time, or as a hobby, or can't or won't spare the time to learn how to do it right. For those of you who have already made the effort, or have the sort of brain that doesn't mind, great. For everyone else, forget it

Also, a significant number of designers just flat-out don't know how to do it, or even that they can

tedster

Msg#: 1150 posted 7:56 pm on Oct 24, 2001 (gmt 0)

Here's a resource for creating a custom DTD [htmlhelp.com] which helps when you want/need to use non-standard markup such as the EMBED tag or the leftmargin attribute in a BODY tag.
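
The general shape of the approach, as a rough sketch (the URL is hypothetical - you host the edited copy yourself): take the stock W3C DTD, add the proprietary attributes to the <!ATTLIST BODY ...> declaration (lines like leftmargin CDATA #IMPLIED), and point your doctype at your copy:

  <!DOCTYPE HTML SYSTEM "http://www.example.com/dtd/custom.dtd">

The validator then checks your pages against the extended grammar instead of the standard one; see the linked article for the details.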

IanKelley

Msg#: 1150 posted 8:39 pm on Oct 24, 2001 (gmt 0)

I think someone already said this but... Most of those sites don't validate because the simple truth is that web builders don't design for W3C... they design for browsers.

I've seen perfectly validated sites that look terrible in certain browsers... The sites mentioned, for the most part, look good no matter what you view them with.

The truth is... you can spend hours making sure your code validates, and all you're going to end up with is a much larger HTML file that does exactly the same thing it did before.

That in mind... I don't think the designers are red-faced at all... I think they've got a nice addition to their resume and a well-designed, functional website to give their client ;-)

ulstrup

Msg#: 1150 posted 9:00 pm on Oct 24, 2001 (gmt 0)

Like stavs, I'll ask for the value of a DTD in an HTML document... tedster gave some answer, but has anybody experienced higher rankings for pages with a DTD?

GWJ: leftmargin="0" topmargin="0" marginwidth="0" marginheight="0" comes up as an error, but Netscape doesn't read leftmargin and topmargin - you need marginwidth and marginheight for the page to look good in Netscape. Don't worry about it; I always use them and have had no problems with either Yahoo or Looksmart.

I think if browsers complied with standards, so would designers, but as it stands you can really get away with bad HTML and it still looks good in browsers...

netcommr

Msg#: 1150 posted 2:24 am on Oct 25, 2001 (gmt 0)


I can think of two main reasons I would want to validate any publicly viewable page. One was already stated by Xoc: you have no idea how your site will look to the spiders. Your best chance of making sure they see what you want them to see is to follow the standards - always remember, that is where the developers start when a spider is written. The second main reason I always validate is that I have no idea what the next version of the browsers will display or have compliance for. Look at the newbie on the block, Opera. It is the strictest of the three main browsers in use. It was not built, like IE, to handle poorly written code; it is fast because it works mainly off what the standards have set. Who knows how strict the next one will be.

I didn't want to imply that I don't test my pages by actually viewing them in all three browsers - I always do that. When writing the code I use IE, just because it's easier for me - old habit. But I always test in NS 4.6, 4.7 and 6, IE 5.5, Opera 5, and take a glance in the WebTV viewer (always ugly - I bet those who have only seen the web on WebTV and then finally get a PC are just amazed). I figure I have most cases covered by those choices. I use both NS 4.6 and 4.7 because they render quite differently in many cases.

As for why you should use a DTD statement: it was not so important in the past, when basically all you had was HTML. But now, with wireless formats and new markup such as XML, you need to declare what your code is. In the future this will probably be a requirement for the document to display at all in some viewers. Here are a few snippets from the standards...

"Markup declarations can affect the content of the document...examples are attribute defaults and entity declarations"

"Processors may signal an error if they receive documents labeled with versions they do not support"

"provide a grammar for a class of documents"

Although the above is written for XML, it gives you an idea of the purpose of these declarations and why they are needed. It is a good idea to get into the habit of adding them now, before you absolutely have to.
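
For illustration, the declarations being described look like this at the top of an XHTML 1.0 Transitional document (the XML declaration line is optional when the encoding is UTF-8):

  <?xml version="1.0" encoding="ISO-8859-1"?>
  <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
      "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
  <html xmlns="http://www.w3.org/1999/xhtml">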

------------
TallTroll, your statement quoted below kind of rubs me the wrong way.

"This is far too much to demand of most authors, because they do it part-time, or as a hobby, or can't or won't spare the time to learn how to do it right. For those of you who have already made the effort, or have the sort of brain that doesn't mind, great. For everyone else, forget it

Also, a significant number of designers just flat-out don't know how to do it, or even that they can"

Well, for those that can't, maybe they need to find another job. And for those that won't learn, I hope they do something else. Should the standards be lowered just so everyone can do it? That just lowers the potential and quality of the internet. Would you want to get on an operating table and have a doctor who just barely passed do his thing on your heart? Or would you get on a plane worked on by an ignorant person? I know these are extreme examples, but I hope you get my point. Don't just get by in life, man - go for the brass ring.

Personally, I don't want the net to be just a bunch of pages with everything centered down the middle, cute bunny rabbits running across the screen, and way too many little animated gifs. Making things a little tougher makes those who can strive a little harder, which produces quality and achieves a higher level of greatness. I want to see what is possible, not just what can easily be done. If those in the past had only done what was easy, you would not have a car to drive. In fact, you would not even have computers or an internet.

Eric_Jarvis

Msg#: 1150 posted 11:28 am on Oct 25, 2001 (gmt 0)

Ian_Kelley: "I think someone already said this but... Most of those sites don't validate because the simple truth is that web builders don't design for W3C... they design for browsers."

well yes...but those are the designers that keep asking about problems with their sites each time a new generation of browsers arrives

I prefer to design for the web since I can't be bothered to spend every waking hour learning about each new browser version when it arrives

YMMV

knighty

Msg#: 1150 posted 3:07 pm on Oct 25, 2001 (gmt 0)

Way too many people here are assuming that lack of a DTD tag at the top of the page = bad, unprofessional design.

This is simply not the case. Of course it is good to get as close to the standards as possible, but sometimes you have to bend the rules in order to achieve your objective. Ensuring that every tiny detail passes the validator doesn't mean that your pages are fantastic, nor does it mean that you are guaranteed a number one spot in the search engines.

I'm not saying that validating your pages is a bad idea - I think that's great! However, some people are more concerned with getting their pages looking good in all browsers and pulling in the punters first. If they have time, they might then run them through a validator just to make sure the code is OK.

Xoc

Msg#: 1150 posted 4:12 pm on Oct 25, 2001 (gmt 0)

I'll make a reference to this thread, [webmasterworld.com ], where GoogleGuy (Google's representative to WebmasterWorld) responded to this issue.

"As far as why Google doesn't validate (saw that in another thread), I know that with 100+ million searches per day, we try to save every byte we can. Having a lean home page saves us moolah."

Yes, when you are getting 100 million hits, it makes sense to try to reduce bandwidth. As with all rules, you should know the rules before you break them.

I don't think it's optional whether you validate a site or not. I do think it's optional whether you take the advice of the validator, for specific reasons.

In Google's case, making their home page validate would add approximately 100 to 200 bytes to their output. Multiply that by 100 million, and yes, that does turn into a huge amount of bandwidth for no obvious advantage. The things on their home page where they didn't follow the rules were fairly minor - missing DTD, some unquoted attributes, a missing alt attribute, and some missing type attributes.
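
Back-of-the-envelope: 150 bytes × 100,000,000 page views is roughly 15 gigabytes of extra transfer a day, or around five terabytes a year, purely to satisfy a validator.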

But realize that Google is in a unique position on the web. As the #1 search engine, they don't have to worry about getting their site indexed by search engines.

tedster

Msg#: 1150 posted 5:21 pm on Oct 25, 2001 (gmt 0)

Here's another thought: designing pages that validate helps move the industry closer to the day when there won't be major cross-browser differences. Problems have been built right into the major browsers just to try to render something from sub-standard HTML. That practice helped the web explode in the beginning, but it also created some lousy work habits for authors.

Every time I eliminate non-validating code, I've just struck one small blow against the browser incompatibility problems.

I use the built-in validator in Homesite, which is great for letting me customize the validation criteria. I run it on every page while I work as a matter of habit, even if I'm just making small tweaks. And I especially validate before I save the page. (I dig this validator because it will check unsaved pages right in the middle of a workflow -- a nice feature)

This habit saves me lots of aggravation by catching little bugs, cut-and-paste errors, etc., before they get wrapped into my page and forgotten -- and especially before I start screaming about "those buggy browsers!"

tedster

Msg#: 1150 posted 5:23 pm on Oct 25, 2001 (gmt 0)

<added> getting a page to validate is also a great way to learn more. A validator is like having a teacher right at your elbow. </added>

gmiller

Msg#: 1150 posted 9:20 pm on Oct 25, 2001 (gmt 0)

Pages that aren't validated tend to be rather fragile... and fragile pages are job security for the people who get paid to fix their own mistakes.

Of course, my pages don't validate either, because ad networks and statistical monitoring services prohibit me from fixing the errors in their code or porting it to XHTML. Complaining about it just gets me an "It works right now, so who cares about next week?" response.

Brett_Tabke

Msg#: 1150 posted 6:10 am on Oct 26, 2001 (gmt 0)


"The only thing you can do is make your code meet the standard and hope that they wrote the search engine spider to do the same. If that doesn't scare you into doing it right, nothing will." (-Xoc 10-24-2001)

That's a good point that needs emphasizing. More people need to think in terms of a spider being a browser. Just as the browser downloads and parses the page content to display it to you, so does the spider download content to "display" to the indexer.

>Is there some technical reason not to validate?

I've had this site validated many times. Where did I botch it? While working around IE6 errors caused by the doctype.

Xoc

Msg#: 1150 posted 6:28 am on Oct 26, 2001 (gmt 0)

In the latest MSN update that just happened, they switched to using XHTML. They almost validate - they still made a couple of mistakes.
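
The near-misses in an HTML-to-XHTML conversion are usually small ones; some illustrative examples (not taken from MSN's actual source):

  <br>                 becomes  <br />
  <input type=text>    becomes  <input type="text" />
  <IMG SRC="a.gif">    becomes  <img src="a.gif" alt="" />

Lowercase element names, quoted attribute values, and explicitly closed empty elements cover most of what an SGML-trained hand gets wrong in XML.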

Xoc

Msg#: 1150 posted 9:13 pm on Oct 28, 2001 (gmt 0)

Hmmmm. For some reason MSN is back to HTML, no DTD, and many more HTML errors.

TallTroll

Msg#: 1150 posted 11:08 am on Oct 29, 2001 (gmt 0)

Google not reading XHTML links? [webmasterworld.com]

Then again, maybe they don't validate correctly ;)

Can anyone shed light on other spiders that may have trouble with XHTML links? Or even with other code types (XML, or the swanky new Flash link indexing, etc.)?

<devils advocate>
This also raises an interesting question in my mind:

If you were running an SE, would you emphasise indexing for quality (making the spider conform to published standards and reject poorly coded sites), or for quantity (spending the dev time figuring out how to read content and links no matter how badly scrambled, perhaps at the expense of the comparatively small number of well-coded sites that DO validate)?

Google's home page currently claims an index of 1,610,476,000 pages. Comments?
</devils advocate>

Brett_Tabke

Msg#: 1150 posted 11:35 am on Oct 29, 2001 (gmt 0)

That's another thread, Troll - start another one if you want.

They switched back, Xoc, because of so much upset [webmasterworld.com...]

They really took it on the chin over that. It threatened to overshadow XP entirely.

Xoc

Msg#: 1150 posted 4:29 pm on Oct 29, 2001 (gmt 0)

I responded about that in that [webmasterworld.com] thread. XHTML is not the issue.

I'll post the link from that thread here, too, [opera.com ], because it is relevant here.

The irony of Microsoft's claim to standards-support is complete when you check the MSN.com site for compliance with the XHTML standard. Anyone can go to the W3C's standards validation service at [validator.w3.org...] and type in www.msn.com. The document returned demonstrates clearly that not a single document on their site adheres to W3C specifications, and many of their documents do not use XHTML at all, e.g. [careers.msn.com...]

Also, I looked in more detail at the AOL site. Can you believe that they have their <title> tag outside the <head> section of their HTML? It's true! And that is the #1 web property on the Internet! I find that repulsive.
