
CSS Forum

Does validation failure force quirks mode in IE6?
I don't want quirks mode
dtleahy · msg:3152686 · 4:33 pm on Nov 10, 2006 (gmt 0)

Hi,

I have 3 questions, which I made bold within the post. Any help with any or all of these questions will be greatly appreciated.

I want IE6 to render my ASP pages the same as Firefox 2. Well, close anyhow. The same pages currently render very differently (everything "works" in Firefox, but there are problems in IE6).

From what I have read, I don't want to trigger quirks mode in IE6, or else the page will render as if the browser is IE 5 (or worse.)

1.) How do you know for sure if a page has triggered quirks mode? When I try to validate the page using the W3 validator, I get, "This page is not Valid XHTML 1.0 Transitional!"

In Dreamweaver MX, I chose XHTML compliance (XHTML DOCTYPE, rather than HTML DOCTYPE) for all of my ASP pages. Why? I don't know any better. I'm unable to figure out when to choose which doctype, in spite of reading up on it a bit. 2.) Why choose XHTML 1.0 Transitional versus HTML 4.01 Transitional?

When I try to validate the XHTML, I get validation errors. 3.) Do validation errors automatically trigger quirks mode in IE6? (Some of my coding "errors" are intentional, such as intentionally leaving out the alt attribute on image elements that are structural, not something for screen readers or where I want a tooltip. If I absolutely MUST repair every error that the W3C validator finds, just to make sure I'm not triggering quirks mode and spoiling the IE6 rendering, I will do it - but only with that gun to my head.)

Thanks!

Dennis


DanA · msg:3152736 · 5:12 pm on Nov 10, 2006 (gmt 0)

1) Quirks mode is triggered when a complete doctype is not the very first thing sent to the browser; in IE6, anything ahead of the doctype (the classic culprit is an XML prolog) drops the page into quirks mode.
2) Why not choose HTML 4.01 Strict or XHTML 1.0 Strict?
3) Problems with the validator do not trigger quirks mode; they may only trigger wrong interpretations of the code.
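To illustrate point 1, a hedged example of the classic trap: the XML declaration below is optional when XHTML is served as text/html, and including it is exactly what flips IE6 into quirks mode.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">

Delete the first line and IE6 returns to standards mode.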

Fotiman · msg:3152737 · 5:12 pm on Nov 10, 2006 (gmt 0)


1.) How do you know for sure if a page has triggered quirks mode? When I try to validate the page using the W3 validator, I get, "This page is not Valid XHTML 1.0 Transitional!"

If you don't have a complete DOCTYPE declaration, including the URI, then IE will be in quirks mode. If you use a fully qualified DOCTYPE, it won't be. Any of these will keep IE out of quirks mode:


<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
"http://www.w3.org/TR/html4/strict.dtd">

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">


2.) Why choose XHTML 1.0 Transitional versus HTML 4.01 Transitional?

Most developers should NOT be using XHTML. If you're not sure if you need XHTML, then you don't need it (you would know if you did). There's a good read here on the topic:
[hixie.ch...]


3.) Do validation errors automatically trigger quirks mode in IE6?

No.

mattur · msg:3152744 · 5:17 pm on Nov 10, 2006 (gmt 0)

1.) How do you know for sure if a page has triggered quirks mode?

See FAQ: Choosing the best doctype for your site [webmasterworld.com] and the reference links for info about doctypes and triggering layout modes. The Firefox Web Developer extension toolbar displays a tick icon to indicate rendering mode, along with lots of other useful features. Install it now!

2.) Why choose XHTML 1.0 Transitional versus HTML 4.01 Transitional?

Unless you have a specific reason for using XHTML, you should use HTML4.01 Trans or Strict. See Why most of us should NOT use XHTML [webmasterworld.com]

3.) Do validation errors automatically trigger quirks mode in IE6?

No, but HTML errors can cause layouts to go wonky (e.g. failing to close an element).

You should use the W3C validator to QA your markup, and fix any errors. It's the first step in debugging layout problems.

It's not difficult to write valid code once you've got the hang of it, and it will save you time and effort in the long run. You may also find this article useful Bulletproof HTML: 37 Steps to Perfect Markup [sitepoint.com]

(BTW: where an image needs no description, don't leave out the alt attribute; set it to blank: alt="")

dtleahy · msg:3152953 · 7:53 pm on Nov 10, 2006 (gmt 0)

Thanks very much DanA, Fotiman, and mattur,

Great, succinct info and great links!

I have altered the Dreamweaver template to reflect the switch sitewide from XHTML to HTML.

The current declaration:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">

After updating and republishing the page, validation went from around 200 errors and warnings, down to a handful. I fixed those (including alt=""), and have only a single error left. My <form> has no "ACTION", because onSubmit the form is processed by a javascript validation routine, which in turn submits the form if there are no client-side errors found. I'm not sure I can kill that error... (maybe something as simple as: ACTION=""?)

So, this has been an excellent lesson in doctypes, gave me a better understanding of what triggers quirks mode, and shows me that I need to look elsewhere (in code) for a reason why the page renders properly in FireFox 2.0, and incorrectly in IE6.

Thanks again!

Dennis

Fotiman · msg:3152962 · 8:00 pm on Nov 10, 2006 (gmt 0)

Note, you should not rely on JavaScript form validation. That is, you should always include an action and assume that the client does NOT have JavaScript enabled. Any client-side validation can then be added as an enhancement on top of that. Validate your form server side.
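A minimal sketch of that arrangement (process.asp and checkForm() are hypothetical names, not anything from this thread):

<form action="process.asp" method="post" onsubmit="return checkForm(this);">
<p><input type="text" name="email"> <input type="submit" value="Submit"></p>
</form>

function checkForm(form) {
// hypothetical client-side check; returning true lets the normal submit proceed
if (form.elements["email"].value == "") {
alert("Please enter an email address.");
return false;
}
return true;
}

With JavaScript enabled, checkForm() runs first and can block a bad submit; with it disabled, the form posts straight to process.asp, where the server-side validation must repeat every check anyway.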

rocknbil · msg:3153158 · 11:41 pm on Nov 10, 2006 (gmt 0)

Although Fotiman's recommendation is more important, you can squelch this validation error as you said: ACTION="".

If you're puzzling over how to do this AND still use client-side JavaScript, the answer is to return false from your validation routine. If JavaScript is active, it will stop the default submit; if it's disabled, users can proceed as intended.

<input type="submit" onClick="return someValidation(this.form);" value="submit">

function someValidation(form) {
// validation code
if (validationOK) { form.submit(); }
return false;
}

Another approach is to return false on the form itself (a programmatic form.submit() bypasses the onsubmit handler, so the script-driven submit still goes through):
<form action="" onsubmit="return false;">

Another grain of info on quirks mode: in Firefox, right-click anywhere on a page (including THIS forum! :-) ) that's not on an element and select View Page Info.

This will reveal the render mode of the page; it will clearly say Quirks mode.

dtleahy · msg:3155592 · 10:03 pm on Nov 13, 2006 (gmt 0)

Fotiman,

Yes, I was/am only using the JavaScript validation on the client-side as the initial form validation. When it goes server-side, it all gets checked again. In fact, since JavaScript is not a language I know well, I was not really sure how to scrutinize the characters that users are allowed to enter into text fields, to make sure they are not inputting "drop table users" or something like that. I do know how to do this server side, and I am doing so.

rocknbil,

Thanks for the excellent added information. I will implement that "ACTION=...", and that should be the final hurdle to get the page to validate.

Interestingly, my CSS had a couple of dumb mistakes, and that is what made me think the page was triggering quirks mode. Now I know it wasn't, but I gained valuable knowledge about DOCTYPEs as well as what quirks mode really is.

Thanks again!

Dennis

Fotiman · msg:3155628 · 10:43 pm on Nov 13, 2006 (gmt 0)


Yes, I was/am only using the JavaScript validation on the client-side as the initial form validation. When it goes server-side, it all gets checked again.

Actually, no it won't... because it will never get to the server side the way you have it now. If you don't have an action attribute, then the form will never submit. You could do this:

action=""

Which essentially means submit the form back to itself. But if you don't have any action, then the form will never submit when the user has JavaScript disabled. So make sure your form has an action.

dtleahy · msg:3155838 · 3:20 am on Nov 14, 2006 (gmt 0)

But if you don't have any action, then the form will never submit when the user has JavaScript disabled. So make sure your form has an action.

Hi Fotiman,

Point well taken.

However (and this is probably a whole other discussion), I am one of those people who does not believe that every site has to "work" without JavaScript, or with cookies disabled, or display well at 320x200 or whatever tiny resolution. I can see where it is in the site owner's best interest for many sites, maybe even most sites, to "work" even when the visitor has crippled or disabled some functionality, or is using a reader, or has a tiny device. But not every site scales well or is very useful with features crippled, and in forcing the design to work on a tiny device and/or with features disabled, the site owner may not be getting a site that "works" really well for 99% of their target audience. This is something I discuss with a site owner first.

Take care,

Dennis

Fotiman · msg:3156272 · 3:33 pm on Nov 14, 2006 (gmt 0)


But not every site scales well or is very useful with features crippled, and in forcing the design to work on a tiny device and/or with features disabled, the site owner may not be getting a site that "works" really well for 99% of their target audience. This is something I discuss with a site owner first.

No offense intended here, but honestly this just points at a lack of understanding. It is entirely possible to design a site that will scale without problems when JavaScript is not available. Just because a site is accessible to all does NOT mean it can't also "work" really well for 99% of the target audience.

I would encourage you to do some research on "unobtrusive javascript" and "progressive enhancement".

dtleahy · msg:3156474 · 6:36 pm on Nov 14, 2006 (gmt 0)

I would encourage you to do some research on "unobtrusive javascript" and "progressive enhancement".

Thanks, Fotiman, I have bookmarked a few sites, and checked out several. I have to say that in the big picture, unobtrusive JavaScript is probably the way to go, but (maybe due to my ignorance) it still seems like the site owner will be forced to pay for more development time to implement a fully functional site for the visitors that choose to browse without using JavaScript.

A few comments:

Scalability: working from the lowest common denominator
Maybe it is ignorance on my part, but it seems like there is a much bigger investment in time for a site that serves up a set of pages that will scale well on a cellphone browser and still look world-class when served on desktop PCs. I would think it might be cheaper for the web site owner to pay for a different set of pages for the micro browser crowd. (My gut reaction to the current crop of micro devices is that they are "quaint", like a Commodore 64. However, it probably won't be long before they grow up, and then web developers will not only need to include them as the lowest common denominator, we will want to.)

Cookies optional?
What about the site visitor to your ecommerce site that prefers to surf with cookies disabled? I suppose there are ways to code everything server side, but again, this extra development time has to be paid for by the site owner - are they getting a smart return on their investment?

JavaScript: vital, fluff, or somewhere between?
Why has it become best practice to build sites that will degrade gracefully in every respect if the user turns JavaScript off? What about the user that decides to turn images off or sound off or CSS off? I wouldn't want to project manage a silent music site, or a text-based art site. I just never understood why JavaScript seemed to be the first technology to be deemed "fluff" by many, and expendable. (Well, it may have been the first, but Flash fell harder.)

Again, this could be ignorance on my part, and there may be ways to code every site function on the client side without using JavaScript, and I just don't know how to do it. I'll give an example: I have an ecommerce site in progress where the user can search (filter) and sort (order) results. I give the user a primary and a secondary sort, presented in 2 clusters of radio buttons. When the user selects a radio button from one cluster, the illogical choices are disabled in the other cluster. This is done with a JavaScript function. With JavaScript turned off, the user could choose to order the results first by price descending, and second by price ascending. The interface thus makes no sense to the user - they don't know how the results will be sorted. Of course, my program logic cannot allow a query like that, so I will force a default of ignoring the secondary sort. This happens in the background, and the user would not know why the results are not sorted by ascending price (which may have been his intent.)

This is not a vital, life-or-death example, but it is fresh in my mind. The only way I know how to do it client side is by JavaScript, and it seems like a crazy thing to do server side (which would probably be painfully slow on a cellphone browser.)

============================================

Again, I would think that the decision to pay for the extra development time to make sites accessible on micro devices, or to remain "fully" functional without JavaScript, cookies, CSS, a video screen or monitor, would be a decision for the site owner.

Dennis

rocknbil · msg:3156839 · 2:28 am on Nov 15, 2006 (gmt 0)

However ... I am one of those people that does not believe that every site has to "work" without javascript, or with cookies disabled ...

Just wanted to make it clear I voiced the solution to squelch the validation error, and that only. Display issues can be argued ad infinitum. But functional issues, yes, you need to address these. A site SHOULD function well without JavaScript or cookies, **especially** a form.

Typical users who don't understand these technologies only hear one thing amid the hysteria of viruses, spyware, and How Your Hard Drive Gets Erased: "To be safe, TURN IT OFF!" More and more users are surfing with these disabled. So you may overlook display, but don't overlook function. It needs to work without JavaScript.

It's really pretty easy. Just understand that "return false" tells the browser "don't do what you normally do, let JavaScript manage it." If JavaScript is disabled, the browser's default behavior takes over.
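A generic illustration of that rule (fallback.html and doFancyThing() are placeholders, not code from this thread):

<a href="fallback.html" onclick="doFancyThing(); return false;">Do the thing</a>

With JavaScript on, doFancyThing() runs and the return false cancels the navigation; with it off, the browser simply follows the link to fallback.html.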

swa66 · msg:3156860 · 2:56 am on Nov 15, 2006 (gmt 0)

2.) Why choose XHTML 1.0 Transitional versus HTML 4.01 Transitional?

Unlike some earlier posters, I do believe there is real value in XHTML 1.0 Transitional. First, it gives you more warnings on errors in the validator (I religiously check any page after every edit).
Sure enough, MSIE runs decades behind, but that's no reason to stick with older standards.

And it allows real xml oriented tools to be used.

The documents comparing HTML4 vs. xhtml typically compare against a strict xhtml and there I agree: the browser side is not ready for that.

But if you do use it: be very strict about checking your pages, and the doom scenarios those documents depict won't hold for you.

A browser should not dump you in quirks mode on a strict xhtml doctype, no matter what (it should pop an error if something is wrong), but quirks mode is ingrained because of the past, where e.g. images aligned with text baselines and a transition was needed, so you still get it every so often.

Basically I add a link to the bottom of my pages pointing to [validator.w3.org...]; that way verifying for yourself becomes an easy habit right after you view the page.
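A common form of such a link (this uses the validator's uri=referer feature, which validates whichever page the click came from):

<p><a href="http://validator.w3.org/check?uri=referer">Validate this page</a></p>

Click it from any page on the site and that page's URL is sent to the validator.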

Fotiman · msg:3157395 · 3:39 pm on Nov 15, 2006 (gmt 0)


I have to say that in the big picture, unobtrusive JavaScript is probably the way to go, but (maybe due to my ignorance) it still seems like the site owner will be forced to pay for more development time to implement a fully functional site for the visitors that choose to browse without using JavaScript.

I can understand why you might think that, but once you get in the habit of developing in an unobtrusive way I think you'll find that development times may even go down. This line of thinking might be true for "graceful degradation", where development prioritizes presentation, and then has to dumb it down. "Progressive Enhancement", on the other hand, keeps the content as the main focus, and then adds on enhancements. The content is what's really important anyway. Just pretend you're building a website for 1990 to start, and then enhance it once the content is in place. :)


Scalability: working from the lowest common denominator
Maybe it is ignorance on my part, but it seems like there is a much bigger investment in time for a site that serves up a set of pages that will scale well on a cellphone browser and still look world-class when served on desktop PCs.

(same argument as above)


I would think it might be cheaper for the web site owner to pay for a different set of pages for the micro browser crowd. (My gut reaction to the current crop of micro devices is that they are "quaint", like a Commodore 64. However, it probably won't be long before they grow up, and then web developers will not only need to include them as the lowest common denominator, we will want to.)

It's quite easy to develop for the lowest common denominator. Just use pure, semantic markup to describe your content. The markup is the foundation. Start with a clean slate, and don't add any presentation (CSS) or behavior (JavaScript) until the core foundation is in place.

For example, suppose I wanted to create an article. The core foundation might look something like this:


<div class="articlehead">
<h1>Progressive Enhancement</h1>
<h2>Easier Than You May Think</h2>
<p class="dateposted">11/15/2006</p>
<p class="author">Fotiman</p>
</div>
<div class="articlebody">
<p>...</p>
<p>...</p>
</div>

This is pure markup. From there I can add presentation and behavior. For example, maybe I want to add some special animation effect so that the article body appears like a window shade (slides down). Instead of adding that script directly inline, I would keep my markup clean and attach it via an external script. There was no additional cost to create the initial semantic markup, and now I've got something that works for all types of browsers and can be easily enhanced.
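A minimal sketch of that kind of external attachment, with hypothetical names (enhance.js, and a crude delayed reveal standing in for a real sliding effect):

// enhance.js, included via <script type="text/javascript" src="enhance.js"></script>
window.onload = function () {
var divs = document.getElementsByTagName("div");
for (var i = 0; i < divs.length; i++) {
if (divs[i].className == "articlebody") {
divs[i].style.display = "none"; // hide first...
(function (el) { // ...then reveal; a real shade effect would animate the height with a timer
setTimeout(function () { el.style.display = "block"; }, 500);
})(divs[i]);
}
}
};

Browsers without JavaScript never run this, so they simply see the article as plain markup; the enhancement layer does no harm.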

Separation of content, presentation, and behaviors is a good goal.


Cookies optional?
What about the site visitor to your ecommerce site that prefers to surf with cookies disabled? I suppose there are ways to code everything server side, but again, this extra development time has to be paid for by the site owner - are they getting a smart return on their investment?

Typically, I think you'd be better off using session variables instead of cookies for this kind of state. Session data lives on the server and lasts only for the duration of the visit, rather than being written to a file on the client. One caveat: most session mechanisms (classic ASP included) still use a small cookie just to carry the session ID, so visitors who block all cookies would need a cookieless, URL-based session scheme.


JavaScript: vital, fluff, or somewhere between?
Why has it become best practice to build sites that will degrade gracefully in every respect if the user turns JavaScript off?

Ah, but it hasn't! Note, you said "degrade gracefully". Progressive Enhancement, though similar, puts the focus on the content first. Here's my favorite quote regarding Progressive Enhancement vs. Graceful Degradation:

These two concepts influence decision-making about browser support. Because they reflect different priorities, they frame the support discussion differently. Graceful degradation prioritizes presentation, and permits less widely-used browsers to receive less (and give less to the user). Progressive enhancement puts content at the center, and allows most browsers to receive more (and show more to the user). While close in meaning, progressive enhancement is a healthier and more forward-looking approach.
- From the Yahoo article on Graded Browser Support:
[developer.yahoo.com...]


What about the user that decides to turn images off or sound off or CSS off?

This is why you start with pure semantic markup. The site should still make perfect sense even when it can't be enhanced with CSS. And images should have alt attributes so their content still comes across when the browser can't show images.


I wouldn't want to project manage a silent music site, or a text-based art site.

Why not? If the content is a picture of artwork, then provide pictures of artwork with meaningful alt text. For example:
<img alt="Painting of a rose" src="rose.jpg">

Just because the user can't see the picture, doesn't mean he/she shouldn't be able to tell what it is. Maybe you want to enhance the site so that the pictures appear in a nice slide show, but your core markup will still be accessible.


I just never understood why JavaScript seemed to be the first technology to be deemed "fluff" by many, and expendable. (Well, it may have been the first, but Flash fell harder.)

Probably because there are a larger percentage of user agents (web browsers, search engines, etc.) that have JavaScript disabled (or no JavaScript capabilities at all) than there are user agents that have images or CSS disabled. Don't think of it as fluff though... think of it as enhancement.


Again, this could be ignorance on my part, and there may be ways to code every site function on the client side without using JavaScript, and I just don't know how to do it. I'll give an example: I have an ecommerce site in progress where the user can search (filter) and sort (order) results. I give the user a primary and a secondary sort, presented in 2 clusters of radio buttons. When the user selects a radio button from one cluster, the illogical choices are disabled in the other cluster. This is done with a JavaScript function. With JavaScript turned off, the user could choose to order the results first by price descending, and second by price ascending. The interface thus makes no sense to the user - they don't know how the results will be sorted.

The interface still makes sense. But the user has chosen to make a selection that doesn't.


Of course, my program logic cannot allow a query like that, so I will force a default of ignoring the secondary sort. This happens in the background, and the user would not know why the results are not sorted by ascending price (which may have been his intent.)

If the user's intent was to sort ascending, then he/she should not have selected both ascending and descending. But let's go ahead and assume the user has JavaScript disabled, and that they suffer from a low IQ. :-) You could, by default, have a short paragraph of text to explain to the user what will happen if they pick 2 contradictory search parameters. The enhancement, then, would be to (see the sketch after this list):

1. Add JavaScript to disable illogical choices
2. Add JavaScript to remove the explanation paragraph, since it's no longer needed.
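A rough sketch of both steps, with hypothetical names throughout (a paragraph with id="sortHelp", and radio clusters named primary and secondary whose values match; none of this comes from the actual site):

// sort-enhance.js, attached externally
window.onload = function () {
var help = document.getElementById("sortHelp");
if (help) { help.style.display = "none"; } // step 2: the explanation is only needed without JavaScript

var form = document.forms[0];
var primary = form.elements["primary"];
var secondary = form.elements["secondary"];
for (var i = 0; i < primary.length; i++) {
primary[i].onclick = function () {
// step 1: grey out the contradictory choice in the other cluster
for (var j = 0; j < secondary.length; j++) {
secondary[j].disabled = (secondary[j].value == this.value);
}
};
}
};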


This is not a vital, life-or-death example, but it is fresh in my mind. The only way I know how to do it client side is by JavaScript, and it seems like a crazy thing to do server side (which would probably be painfully slow on a cellphone browser.)

In this particular case, a simple explanation to the user of what will happen if they make idiot choices should suffice. But you could also have the server check to see if the user made idiot choices and redirect them back to the form with a message that says "Don't be an idiot! You can't sort that field both Ascending AND Descending! Choose again!". If it's happening server side, it's not going to be any slower for a cellphone browser than for a PC browser, so I'm not sure what you mean by that comment.


Again, I would think that the decision to pay for the extra development time to make sites accessible on micro devices, or to remain "fully" functional without JavaScript, cookies, CSS, a video screen or monitor, would be a decision for the site owner.

And again, there is no added development time. It's a change in the way you handle the development process. By creating pure markup right up front, you're creating a fully functional site right from the start.

Fotiman · msg:3157411 · 3:52 pm on Nov 15, 2006 (gmt 0)

swa66

First it gives you more warnings on errors in the validator (I religiously check any page after every edit).

Only if you are creating more warnings or errors. For example, in HTML this is fine:

<img src="foo.png" alt="foo" >

That's not an error. If you convert to XHTML, then it becomes an error because it's not closed. That doesn't make it an error in HTML though. While I generally like the fact that XHTML is a stricter language, creating HTML pages that don't adhere to XHTML strictness does NOT mean they have errors or warnings.
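For comparison, the XHTML-valid form of the same tag self-closes:

<img src="foo.png" alt="foo" />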


Sure enough MSIE runs decades behind, but that's no reason to stick with older standards.

It is, if you want to properly serve your pages to IE users.


The documents comparing HTML4 vs. xhtml typically compare against a strict xhtml and there I agree: the browser side is not ready for that.

I have no clue what you mean by that. Strict simply means that the page does not include certain presentational markup... it has NOTHING to do with browser readiness! Browsers can easily display either Strict or Transitional documents. Can you explain what you mean?


A browser should not dump you in quirks mode on a strict xhtml doctype, no matter what (it should pop an error if something is wrong), but quirks mode is ingrained because of the past, where e.g. images aligned with text baselines and a transition was needed, so you still get it every so often.

Wrong. You will only get quirks mode if you don't specify a complete DOCTYPE, as I mentioned in a previous post. Doesn't matter if it's Strict or Transitional.

If you're serving your pages as type text/html, then you should be using an HTML DOCTYPE, not XHTML. And if you want to serve XHTML, you should be serving it as type application/xhtml+xml, which will not work for IE users. It's plain to see, 99.9% of the time you should be using an HTML DOCTYPE.

Again:
Why most of us should NOT use XHTML [webmasterworld.com].

[edited by: Fotiman at 3:56 pm (utc) on Nov. 15, 2006]

swa66 · msg:3157974 · 11:51 pm on Nov 15, 2006 (gmt 0)

If you're serving your pages as type text/html, then you should be using an HTML DOCTYPE, not XHTML. And if you want to serve XHTML, you should be serving it as type application/xhtml+xml, which will not work for IE users. It's plain to see, 99.9% of the time you should be using an HTML DOCTYPE.

And who are YOU to tell ME what standard to follow? I'm not telling you what to do, so stop telling me (and others while you're at it) what to do.

FWIW: the documents you quote/point to presume that people will not recheck their documents when they switch from text/html to application/xhtml+xml, and presume they do not check them even when served as text/html.
Where is the proof supporting the statement that they do not check their pages and will not do so when switching?

Both approaches work just fine, use what works for you.
Live and let live.

dtleahy · msg:3158665 · 4:02 pm on Nov 16, 2006 (gmt 0)

Fotiman,

Thank you very much for taking the time to give me a detailed explanation. It is appreciated!

On the next site I build, I'm going to try to embrace a progressive enhancement mindset. I'm in too deep on my current project (an ecommerce site) to make the switch on this one. I will, however, make sure the site is fully functional if a user does not have JavaScript enabled, and I am using session variables.

I bring the baggage of the mindset of a desktop developer with me, and I am still really a newbie at web development. But, I'm not too old to learn new tricks.

Unfortunately for anyone searching through these threads, this thread contains about 3 different threads, but I guess I caused that by asking 3 questions. (Each had important tangents.)

Thanks again!

Dennis

Fotiman · msg:3158732 · 4:52 pm on Nov 16, 2006 (gmt 0)

swa66, continue on serving your pages however you'd like. It's not a matter of me telling you which standard to use. It's a matter of knowing what your content is and using the appropriate DOCTYPE for that content. Since there are almost no legitimate cases to use XHTML, I stand by my claim that 99.9% of web developers should be using the HTML 4.01 DOCTYPE.


FWIW: the documents you quote/point to presume that people will not recheck their documents when they switch from text/html to application/xhtml+xml, and presume they do not check them even when served as text/html.

How are you serving your XHTML pages now? Are you serving it as text/html? If so, then you've just admitted that your page contains HTML, in which case the logical DOCTYPE to use is the HTML DOCTYPE.

If you are serving your XHTML pages correctly, as application/xhtml+xml, good for you... no one using IE will be able to see your content, but presumably you don't care about that.

If you want to argue a good case for using the wrong DOCTYPE, I'm all ears. But as of yet, I've not heard any reasons why it would make sense.

swa66 · msg:3158893 · 6:47 pm on Nov 16, 2006 (gmt 0)

We're hijacking a thread in a forum that's about CSS, not HTML, but I'll try to focus on just the CSS-related parts of my reasons:

One of the things I really do not like about HTML4 is the implied closing of tags, as in the example below:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
 <head>
  <title>Title</title>
  <style type="text/css">
   p ol {color: green}
  </style>
 </head>
 <body>
  <p>
   ipso 1
   <ol>
    <li>no green here!
   </ol>
   ipso 2
  <p>
   ipso 3
 </body>
</html>

This validates as HTML 4.01, and the CSS validates, but the <li> stays unstyled, because a </p> is implied before the <ol>, so there is never a valid target for the CSS. Yet the validator gives not even a warning about the implied tag.

Now, I know the tag soup above is wrongly indented, but nothing will tell me so, or even give me a hint about it.

Neither is there a warning that ipso 2 isn't in a paragraph at all and as such will not respond to
p {color: red}
in the stylesheet either.

Validate it as XHTML and you get errors telling you that you cannot have an <ol> inside a <p>, and the author immediately sees his mistaken indentation and can fix it as well.

If they use html4 ... they come here and ask for help, because it's impossible to figure out why the thing isn't listening to their CSS.
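For reference, one possible repaired structure (note the selector must change too: since an <ol> can never validly sit inside a <p>, the rule p ol will never match anything):

<p>ipso 1</p>
<ol class="greenlist">
 <li>green here now</li>
</ol>
<p>ipso 2</p>
<p>ipso 3</p>

with the stylesheet rule rewritten as ol.greenlist {color: green}.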

If that means browsers have to eat an <img ... /> (and they all do, without complaining), so be it. Moreover, I see it as an incentive for the browser makers to start supporting it better.

The original basis of HTML was that authors use correct syntax while browsers are liberal about what they accept and render. For me, using _validated_ xhtml delivered as text/html is a perfect application of that philosophy.

Apologies to the poster who asked for help understanding why his CSS didn't work with something similar to the above; his thread shamelessly inspired my example.

Fotiman · msg:3158929 · 7:21 pm on Nov 16, 2006 (gmt 0)

swa66, I appreciate your argument. In fact, I too was thankful for the explicit closing-tag requirements that XHTML added, and its stricter interpretation. It may surprise you to know that I jumped on the XHTML bandwagon long ago, for pretty much the same reason you've given above.

However, with time and experience comes a better understanding. I now know that lists can't be contained in <p>'s. I no longer need the XHTML validator to tell me that, because experience has taught me not to do it. I did find that working with XHTML forced me to create cleaner code, but now I no longer need to rely on the validator for that purpose.

I understand your argument. I hope, though, that someday you will come to rely less on the validator and more on your own skills, and that you'll switch back to HTML. The lessons you've learned from working with XHTML can still be applied to HTML.
