Forum Library, Charter, Moderators: LifeinAsia & httpwebwitch

Professional Webmaster Business Issues Forum

Why XHTML 1.1/CSS?
I know it's better, but...
mep00




msg:793767
 9:38 pm on Jan 13, 2004 (gmt 0)

Let's face it, I know it's better, you know it's better, but how does one convince their client/employer that it's better?

I recently read an interview at WaSP with Ryan Carver. He points out that "...search engine benefits greatly outweighed 1% of the audience seeing an unstyled (but still completely usable) site." Some other reasons might include better accessibility, easier maintenance, faster loading/less bandwidth, and future compatibility.

Those are all good reasons, but other than SEO, it might be hard to convince the bean counters. What other reasons can be given to convince them? My main interest is in how to convince the small guy that my clean code is worth spending more money on than the other guy's garbage with a pretty face, but more general reasons are also worth noting.

As far as the "pretty face" goes, that's what sub-contractors/freelance designers are for.

[edited by: Travoli at 2:27 pm (utc) on Jan. 14, 2004]
[edit reason] delinked [/edit]

 

danieljean




msg:793768
 3:16 am on Jan 14, 2004 (gmt 0)

I like all the reasons you gave. You could flesh some of them out for the bean counters:

-Accessibility: few websites do a good job of serving the needs of the visually impaired, so there's a market that will likely stay very loyal. There is also a regulatory risk: if government decides you need to make your site accessible (or if courts suddenly decide you have to), it will cost you.

-Maintainability: usually 80% of the cost of an IT project. Since maintenance at 80% is four times the initial 20%, a 10% saving on maintenance represents 40% of the initial cost of the project. Probably much less for web content, though.

-Bandwidth: meh, 1G is pretty cheap these days. However, download time is still crucial, and my compliant pages usually load 1-2 seconds faster on a modem. If you're hovering around 8 seconds, that can mean not losing an estimated 33% of surfers (though this only applies to those using modems).

-Future compatibility: it's likely that in 1-2 years the browser landscape will change. Those who don't think so have a very short memory or haven't been around long: remember when Netscape was the dominant player?

I'll add one more: making web-authors happy. The only competent web-authors I know will put up a fight if you so much as suggest that pages don't need to validate. As a programmer, I like to work with languages that have a future, so I understand them: you couldn't pay me enough to work with some of the sh*t out there.

The cost of turnover can be estimated, and it is usually quite high. Try Googling "turnover cost estimates." An oft-used rule of thumb is one year's salary. An employee postponing the decision to leave by one year means a business can use that money for something else. Assuming a business can find a way to invest it at 10% and the salary of the web author is $50k, that's still $5k.

--

Basically, adding those to the search engine optimization mentioned by Carver, I can't understand why a business would choose not to adhere to standards.

[edited by: danieljean at 3:53 am (utc) on Jan. 14, 2004]

pageoneresults




msg:793769
 3:26 am on Jan 14, 2004 (gmt 0)

Maybe this one will add some fuel...

Buy standards compliant Web sites [w3.org]

DrDoc




msg:793770
 5:37 pm on Jan 14, 2004 (gmt 0)

Good link, pageone ;)

Hagstrom




msg:793771
 9:40 am on Jan 28, 2004 (gmt 0)

As I see it, only half the question has been answered - namely "Why CSS?"

But how about the first half of the question "Why XHTML 1.1?". Are there any benefits from XHTML as compared to strict HTML?

davidpbrown




msg:793772
 10:13 am on Jan 28, 2004 (gmt 0)

As I understand it, the application/xhtml+xml MIME type switches browsers into strict mode and does not allow them to compensate for sloppy coding... that is, the page falls over with errors.

My guess is that this might be useful if you're generating code from an XML source and want to know that what you're getting is what you intended.

Hagstrom




msg:793773
 10:24 am on Jan 28, 2004 (gmt 0)

As I understand it, the application/xhtml+xml MIME type switches browsers into strict mode and does not allow them to compensate for sloppy coding... that is, the page falls over with errors.

But isn't this true for HTML 4.01 Strict as well?

davidpbrown




msg:793774
 10:28 am on Jan 28, 2004 (gmt 0)

Not sure, but I expect you're right.

Maybe the question should be more: what will the benefits of XHTML 2.0 be, given that XHTML 1.1 is apparently a step towards it?

Also, HTML 4.01 isn't XML, which may be significant.

[edited by: davidpbrown at 10:30 am (utc) on Jan. 28, 2004]

pageoneresults




msg:793775
 10:30 am on Jan 28, 2004 (gmt 0)

It is, but XHTML is a much stricter level of compliance.

  • XHTML elements must be properly nested
  • XHTML documents must be well-formed
  • Tag names must be in lowercase
  • All XHTML elements must be closed

Also, if you are running any XML applications, XHTML is a must. If you are not, then HTML 4.01 Strict will suffice. I like to validate to XHTML 1.1 when possible. Most of your HTML 4.01 Strict will validate as XHTML 1.1 with a few minor tweaks.
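To illustrate those rules with an invented fragment (not from any poster's actual site): the top version is legal HTML 4.01 tag soup, the bottom is the same content as well-formed XHTML.

```html
<!-- HTML 4.01 allows uppercase tags and implied closing tags -->
<UL>
  <LI>First item
  <LI>Second item<BR>
</UL>

<!-- XHTML requires lowercase names, proper nesting, and every element closed -->
<ul>
  <li>First item</li>
  <li>Second item<br /></li>
</ul>
```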

Hagstrom




msg:793776
 11:21 am on Jan 28, 2004 (gmt 0)

> Also, if you are running any XML applications, XHTML is a must

Yes, that's the only argument I have heard / found so far: you need XHTML if you want to use XSLT.

Personally I think the code looks ugly ( </p>, </li>, <img.../> and so on), so I'll try HTML 4.01 Strict instead.

Thanks.

pageoneresults




msg:793777
 12:18 pm on Jan 28, 2004 (gmt 0)

You're still going to need those closing tags to validate HTML 4.01 Strict...

</p>, </li>

That's the whole point behind Strict: writing absolutely valid code, no shortcuts, no hacks, just HTML in its purest and most error-free form.

DrDoc




msg:793778
 4:29 pm on Jan 28, 2004 (gmt 0)

Strict = saves some serious guesswork from the browser...

DrDoc




msg:793779
 4:31 pm on Jan 28, 2004 (gmt 0)

Here's W3C's own answer to the question: "Which should we use, HTML or XHTML, and why?"

[webstandards.org...]

pageoneresults




msg:793780
 6:16 pm on Jan 28, 2004 (gmt 0)

Great link, DrDoc! A must read for anyone getting ready to make the switch, or even those who have made the switch! And it's fresh: October 2003. Those boys and girls over at the W3C have been very active adding new content to their site these days.

mattur




msg:793781
 8:04 pm on Jan 28, 2004 (gmt 0)

As we all know by now ;) the transform from good html to good xhtml, or vice versa, is trivially simple, so the "being ready for the future" claim in the WaSP article is just plain FUD. I don't think the "easier to maintain" or "easier to teach" points are particularly convincing either.

So the choice of using xhtml or html comes down to one issue: do you or your audience want/need to use xml tools on your pages.

If not, it's better to use html. IE doesn't understand xhtml when sent "properly" (i.e. correct mime type) and relies on error handling to understand an xhtml page sent "improperly", so at best you can "properly" deliver xhtml to the subset of your audience using Opera and Gecko-based browsers, using browser-sniffing. And that subset won't notice anyway - unless the page stops working.

AIUI "properly" sent xhtml should not be parsed and displayed in a browser if it isn't well-formed. Fine if you can guarantee all your pages are, and will always be, valid and well-formed, but if you have multiple content authors or accept comments, trackbacks etc. then your pages could stop working in some browsers without your knowledge. I think this is how Mozilla behaves, but I haven't confirmed it (don't want to spread panic unnecessarily!).

Can anyone confirm that Moz barfs on xhtml that isn't well-formed, when sent with the application/xhtml+xml mime type?
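Mozilla's behaviour itself needs a browser to verify, but the underlying rule can be sketched with PHP's DOM extension (a hypothetical fragment, not anyone's real page): an XML parser, like the one browsers use for application/xhtml+xml, refuses input that isn't well-formed, where tag-soup handling would have quietly repaired it.

```php
<?php
// Well-formed XHTML parses under XML rules; tag soup does not.
$wellFormed = '<p>Hello <br /> world</p>';
$tagSoup    = '<p>Hello <br> world';   // unclosed <br> and <p>

$doc = new DOMDocument();
var_dump($doc->loadXML($wellFormed));  // parses fine
var_dump(@$doc->loadXML($tagSoup));    // fails: "not well-formed", like Mozilla's error page
```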

danieljean




msg:793782
 11:19 pm on Jan 28, 2004 (gmt 0)

I can actually disconfirm that. While working on getting up to XHTML (which only took an afternoon), I had various errors, and Moz (Firebird 0.7) handled them despite strict declarations.

Fischerlaender




msg:793783
 11:37 pm on Jan 28, 2004 (gmt 0)

Simply ask your client if they respect orthography in their (paper-based) advertisements. If they do, then ask them why they want to appear web-illiterate.

If they (or you) want some additional technical reasons:
* XHTML pages can be processed by XML tools. This means that your web pages can easily be transformed to other formats in the future.
* Standard compliant web pages are more likely to be displayed correctly on PDAs and other alternative web devices.
Both of the above reasons can be summed up as: XHTML is future-proof.

TheWhippinpost




msg:793784
 2:10 am on Jan 29, 2004 (gmt 0)

It comes down to browser-independence, or more precisely, universal content delivery. An XHTML Strict document is a much easier and more predictable format to parse, whereas table-formatted layouts are a nightmare.

As DrDoc states, it 'saves some serious guesswork from the browser'. Although I'd modify that by saying browsers, current and forthcoming... in all their differing guises.

So the question that faces authors today is whether they want their content to be readable only to an audience using the PC-based IE's of the world, or more widely through various other platforms and applications.

I know from knocking up an app in Perl which grabs info from sites in a niche area that, had those sites been compliant, doing the regexes would have been a breeze. It's only a matter of time before the view is taken that it's just not worth the time and money making complicated algos when there are other sites out there that are compliant and easy to parse.

mattur




msg:793785
 11:55 am on Jan 29, 2004 (gmt 0)

danieljean: are you serving the page with application/xhtml+xml or text/html mime type? I've just checked this on Mozilla 1.5. An xhtml page that isn't well-formed xml served with the application/xhtml+xml mime type doesn't render. Moz just displays an error message "XML Parsing Error: not well-formed..."

Fischerlaender: since it is so trivial to transform html to xhtml, should you ever need to use xml tools on html at a later date you can easily do so.

TheWhippinpost:
So the question that faces authors today is whether they want their content to be readable only to an audience using the PC-based IE's of the world, or more widely through various other platforms and applications.

Pure FUD ;)

mipapage




msg:793786
 12:38 pm on Jan 29, 2004 (gmt 0)

What Mattur said: If you don't serve your xhtml with the proper mime type, it gets parsed by the 'tag soup' parser and there are no rendering benefits whatsoever.

Here's a good read Sending XHTML as text/html Considered Harmful [hixie.ch].

If you are running PHP, try this to send the correct doctypes/mime types:

<?php
//error_reporting(0);
if ( stristr($_SERVER["HTTP_ACCEPT"], "application/xhtml+xml") ) {
    header("Content-type: application/xhtml+xml");
    echo '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
    "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">';
} else {
    header("Content-type: text/html");
    echo '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">';
}
?>



Basically, you want to give XML to the user agents that can handle it, and text/html to the user agents that can't.

If sent as XML, XHTML can be seen as a semantically rich application of XML. If sent as text/html, you are simply sending html. The former gets parsed by an XML parser, the latter by a tag-soup parser.
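That two-parsers point can be sketched with PHP's DOM extension (illustrative only, using a made-up fragment): the same bytes succeed under the forgiving tag-soup parser and fail under the XML parser.

```php
<?php
// One string, two parsers: loadHTML() repairs errors, loadXML() rejects them.
$soup = '<p>some text <b>badly nested</p></b>';

$doc = new DOMDocument();
var_dump(@$doc->loadHTML($soup)); // succeeds: tag-soup parser fixes it up
var_dump(@$doc->loadXML($soup));  // fails: XML parser refuses the mis-nesting
```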


Let's face it, I know it's better, you know it's better, but how does one convince their client/employer that it's better?

For me it's as simple as standards. Everyone has standards, and so do web designers: would you pay for substandard work?

Whether it's XHTML 1.0, 1.1, or HTML 4.01 doesn't matter; they are all specifications. If you meet (validate against any of) them you are up to standard; if not, you're writing tag soup.

[edited by: mipapage at 12:44 pm (utc) on Jan. 29, 2004]

danieljean




msg:793787
 12:43 pm on Jan 29, 2004 (gmt 0)

mattur- Firebird 0.7 is what I'm using. I haven't updated the headers sent via Tomcat- I thought the doctype declaration was enough.

If you are going to use Mozilla consider upgrading. Only geeks use that browser for now, and they can be expected to upgrade too.

Since 0.6, Firebird has been an overall superior browser, and I have even instructed my parents to use it instead of IE.
--

TheWhippinpost's comments about extracting data are a good reminder that not everyone is looking at our site with a browser. While it is easy to switch over, those who are XHTML compliant may see their information available on such sites as Froogle more often than others whose pages are harder to parse.

--

All in all, I think that this being such a simple switch, it really doesn't deserve this much anguish. Like CSS and other standards, there are benefits that become obvious only after we use them. I still know some people that refuse to learn CSS, and have spent more time arguing over it than it would have taken to learn it.

And yes, I still routinely deal with designers who can't figure out cross-browser table display bugs. When are we finally going to let go of this kludge?

davidpbrown




msg:793788
 12:56 pm on Jan 29, 2004 (gmt 0)

mipapage, yours was a good suggestion, but I'm thinking your script misses the XML line, which I understand can cause problems in IE.
<?xml version="1.0" encoding="iso-8859-1"?>

When first approaching this I had trouble when not including echo "\n";

The alternative script, which includes these, is:
<?php
if (stristr($_SERVER["HTTP_ACCEPT"], "application/xhtml+xml")) {
    $x = "XML";
    header("Content-Type: application/xhtml+xml; charset=iso-8859-1");
    echo '<?xml version="1.0" encoding="iso-8859-1"?>';
    echo "\n";
    echo '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN" "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">';
    echo "\n";
} else {
    $x = "normal";
    header("Content-Type: text/html; charset=iso-8859-1");
    echo '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">';
    echo "\n";
}
?>

If you have a use for knowing whether it is or isn't XML, you can use $x. E.g.:
<html xmlns="http://www.w3.org/1999/xhtml" <?php
if ($x == "XML") { echo "xml:"; }
?>lang="en-gb">

davidpbrown




msg:793789
 1:06 pm on Jan 29, 2004 (gmt 0)

Also, I'm thinking the suggestion that "Sending XHTML as text/html Considered Harmful" is a little misleading. It's only XHTML 1.1 that should not be served as text/html.

If you're going to do XHTML you might as well declare it as such.
As suggested here [webmasterworld.com]: ""HTML compatible" XHTML (as defined in appendix C of the XHTML 1.0 specification) may be served as text/html, but it should be served as application/xhtml+xml. This is probably the sort of XHTML you're writing now, so you could go either way."

(This is from a parallel thread in HTML and Browsers:
correct DOCTYPE for validating? [webmasterworld.com])

mipapage




msg:793790
 1:09 pm on Jan 29, 2004 (gmt 0)

davidpbrown,

I'm not sure that I totally follow you there, but that XML prologue isn't required. You can leave it out, and it all works fine.

The one thing that I have been meaning to check is this:

<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />

As I do not use the XML prologue, I declare my charset here. Does anyone know if I should be modifying the "text/html;" part of this meta depending on which MIME type I send?


Also I'm thinking, the suggestion that "Sending XHTML as text/html Considered Harmful" is a little misleading. It's only XHTML 1.1 that should not be served as text/html.

I completely agree with you. I thought that he went a bit overboard in that letter.

danieljean




msg:793791
 1:17 pm on Jan 29, 2004 (gmt 0)

I completely agree with you. I thought that he went a bit overboard in that letter.

Not to mention that in internet time, it is dated about a decade ago (September 2002).

davidpbrown




msg:793792
 1:23 pm on Jan 29, 2004 (gmt 0)

mipapage, yes... other way round... it's just I thought you might have it separately declared. I think there may be a problem with current IEs if you put the XML prologue before the DOCTYPE.

You can remove the http-equiv meta tag reference to Content-Type; you have it in your true HTTP header from the PHP, though you might want to add the charset to it.
I believe the http-equiv is superseded by true HTTP headers, as most servers don't do the conversion of http-equiv meta tags into headers.

mipapage




msg:793793
 2:18 pm on Jan 29, 2004 (gmt 0)

Not to mention that in internet time, it is dated about a decade ago (September 2002).

Sure, but his points are still valid.



davidpbrown,

Thanks! That was something I've been meaning to look into.

DrDoc




msg:793794
 6:13 pm on Jan 29, 2004 (gmt 0)

<?php
//error_reporting(0);
if ( stristr($_SERVER["HTTP_ACCEPT"], "application/xhtml+xml") ) {
    header("Content-type: application/xhtml+xml");
    echo '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
    "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">';
} else {
    header("Content-type: text/html");
    echo '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">';
}
?>

That should be:

<?php
//error_reporting(0);
if ( isset($_SERVER["HTTP_ACCEPT"]) AND stristr($_SERVER["HTTP_ACCEPT"], "application/xhtml+xml") ) {
    header("Content-type: application/xhtml+xml");
    echo '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
    "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">';
} else {
    header("Content-type: text/html");
    echo '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">';
}
?>

mipapage




msg:793795
 6:52 pm on Jan 29, 2004 (gmt 0)

Thanks DrDoc.

Why would I add that? Maybe it's obvious, but I'm a green php'er!

mep00




msg:793796
 9:20 pm on Jan 29, 2004 (gmt 0)

Why would I add that?

It does two things in the event HTTP_ACCEPT isn't set: it prevents an error message and forces the default condition (the "else" clause). What I don't understand is why HTTP_ACCEPT wouldn't be set.
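A quick sketch of what the isset() guard buys you (the missing header is simulated here by unsetting the key; a real client could simply omit Accept):

```php
<?php
unset($_SERVER["HTTP_ACCEPT"]);  // simulate a client sending no Accept header

// Without isset(), reading the missing key raises a notice;
// with it, the && short-circuits and we fall through to text/html.
$sendXhtml = isset($_SERVER["HTTP_ACCEPT"])
    && stristr($_SERVER["HTTP_ACCEPT"], "application/xhtml+xml");

var_dump($sendXhtml); // false: the "else" (text/html) branch runs
```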

© Webmaster World 1996-2014 all rights reserved