XHTML -- is now the time?
tedster · msg:580657 · 6:20 am on Mar 6, 2001 (gmt 0)

As of now, I haven't started writing strict HTML 4.0, but I know that over a year ago the W3C recommendation [w3.org] had already moved on to XHTML 1.0.

So practically, what does this mean? From what I've learned so far it looks like the jump from transitional 4.0 to strict 4.0 is a much bigger jump than going from strict 4.0 to XHTML.

Would it make more sense to go straight to XHTML? It looks like it's just a matter of writing strict HTML 4 and adding a small handful of extra rules. Closing every element, for instance -- writing <br></br> or <br /> instead of <br> seems a little odd, but it's no biggie. Well, maybe I'll need to update my editor first.

By the way, I found a very simple tutorial [w3schools.com]. I can use the W3C for looking things up, but I have a lot of trouble learning something new there.

So what is everyone doing, or planning to do? Strict XHTML requires CSS -- so external stylesheets seem to be the first order of business. From there, it seems a relatively simple jump to strict XHTML 1.0. Am I right?
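For concreteness, here's the kind of minimal page I have in mind -- just a sketch of those extra rules (the XHTML doctype and xmlns, lowercase tags, every element closed, attribute values quoted, presentation pushed out to an external stylesheet with a made-up name):

  <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
      "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
  <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
  <head>
    <title>XHTML 1.0 Strict sketch</title>
    <!-- all presentation lives in an external stylesheet -->
    <link rel="stylesheet" type="text/css" href="style.css" />
  </head>
  <body>
    <p>Every element is closed, even empty ones like <br /> and
    <img src="logo.gif" alt="logo" />; tags and attributes are
    lowercase, and attribute values are quoted.</p>
  </body>
  </html>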

 

mattur · msg:580717 · 9:37 am on Sep 6, 2002 (gmt 0)

the meta-utopia

Who thinks like that?

er... TBL, w3c, Zeldman, ...

And to be clear: XML is a great technology. But that does not justify the xhtml concept. Web pages are for people. SOAP interfaces are for machines ;)

Perhaps this divide is between relatively new folks who started out writing "ie4 only" sites with dodgy markup and now, with a reformed smoker's anti-tobacco zeal, are shouting "pages should work for everyone and tags should be closed - put it thru the validator", and the rest of us, who are saying "yes, actually we already knew that... but what matters most is that it works for your users without, say, needing javascript..."? :)

A colleague of mine used to rant about using w3c-valid html 2 (I think), back in the mid 90s. He always went on about how, when standards-compliant browsers came along, his pages would work and mine wouldn't. He put those little w3c html 2 valid tick buttons on his sites. He traded off working well back then for working well in the future. His pages are still all over the place in modern browsers; mine still work without any problems. So I have to admit I am probably slightly over-sceptical about "future compatibility" claims... ;)

alex_h · msg:580718 · 1:00 pm on Sep 6, 2002 (gmt 0)

XML is wonderful. Our core tech base is OO PHP 4.2 code (migrating from 4.0 to 4.2 was a bit painful, but worth it), sitting on top of a PostgreSQL database, with several design constraints to fit our model. However, we use different front ends for different projects.

Our standard front end is HTML 4.01 Transitional with CSS1. We provide different stylesheets for different browsers, sometimes modified based upon a cookie, etc.; we do a LOT in CSS. The PHP scripts generate those pages.
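To give you an idea -- this is a made-up sketch, not our actual code or filenames -- the <head> our PHP emits just points at whichever stylesheet the browser sniff (or cookie) picked:

  <!-- what the PHP might send to a modern browser -->
  <link rel="stylesheet" type="text/css" href="/css/standard.css">

  <!-- what it might send to NN4 instead -->
  <link rel="stylesheet" type="text/css" href="/css/nn4.css">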

Our administrative tool's 1.x version was HTML + Flash 5; the Flash 5 menu was drawn on the fly from XML output (from the PHP 4.0 toolkit). Our newer admin tool, 2.x, is Java 2 (1.4) on the client, but the server portion is still PHP. The admin program is an XML application written in PHP 4.2.

Without XML, we'd lose our minds. For a new project, we're contemplating exporting portions as web services through SOAP. XML is AWESOME.

However, for web pages, XML/XHTML is pointless. As for well-formedness -- well, I do it in my code, but I need to smack some of our developers around. Those of us who learned HTML years ago need to get out of the "tags should be in all-caps" habit; that has changed. Well-formed XHTML is easier for a browser to parse than dodgy HTML, which is a good thing: you are likely to have fewer problems with minor browser revisions.

However, don't tell us that XHTML is the future; it isn't. The innovations will all be on the server side until NN4 dies (never?!?).

I worry less about pixel-perfect appearance in NN4, but my UI NEEDS to work in NN4. Those of you who write off 5%-8% of your users? Well, that would cost us almost $100k/year; we can't do that.

Do what works for you, but writing off 8% of your users because of your standards-compliant Jihad is not a decision that I'll join. I work on robot-friendly sites because search engines help me make money. Throwing it away to be "compliant" is counter-productive.

Alex

bird · msg:580719 · 1:38 pm on Sep 6, 2002 (gmt 0)

Do what works for you, but writing off 8% of your users because of your standards-compliant Jihad is not a decision that I'll join. I work on robot-friendly sites because search engines help me make money. Throwing it away to be "compliant" is counter-productive.

The idea that XHTML forces you to abandon old browsers is plain old FUD.

I am one of those with 8% NS4 visitors, and I serve them validating XHTML 1.0 without a problem. There are a few small visual differences, but those are negligible. My visitors come for the information I offer, and don't really care whether the menu column uses a line spacing of 1.4 or 1.5 (if they even notice).

Dismissing XHTML 1.0 when your own site validates as HTML 4.01 is just silly. The two are essentially the same, and can be converted back and forth automatically in most cases.
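To make "essentially the same" concrete, here's a sketch of the whole mechanical difference (an invented fragment, not anyone's real page) -- tools like HTML Tidy can even apply these rewrites for you:

  <!-- HTML 4.01: uppercase tags, unquoted attributes, unclosed elements -->
  <P>Some text<BR>
  <IMG SRC=photo.jpg ALT=Photo>

  <!-- the same fragment as XHTML 1.0: lowercase, quoted, everything closed -->
  <p>Some text<br />
  <img src="photo.jpg" alt="Photo" /></p>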

No current browser really supports XHTML 2.0, so there's no reason to get worked up about it yet. Yes, the development of new browser features is slowing down, and that is a good thing. But defining now the standards that future browsers are supposed to support in three or five years is definitely better than the situation we had in the past. Do you really want each vendor to invent their own "line" tag?

madcat · msg:580720 · 2:25 pm on Sep 6, 2002 (gmt 0)

with a reformed-smoker's anti-tobacco-like zeal, are shouting "pages should work for everyone and tags should be closed

standards-compliant Jihad

So people are really just losing their minds chasing a mirage, rather than trying to give their information a greater audience by separating structure from presentation?

chameleon · msg:580721 · 3:36 pm on Sep 6, 2002 (gmt 0)

Wow! There is so much going on here, I don't know where to start or who to respond to first!

XHTML/XML are kinda pointless.

That might be one of the most ridiculous things I've heard someone say in a long time. That's like saying "Having everyone in England speak the same language is pointless."

XHTML forces you to use the proper "grammar" of HTML. If we could get everyone to use it, we might finally be able to get rid of all those maddening IE vs. Netscape compatibility issues.

Think of it this way:

Imagine how hard it must be for the folks at Microsoft, Netscape and Opera to develop their browsers knowing that 98% of the coders out there are using invalid HTML. They need to take into account the fact that some bonehead might put <b><i>Wow!</b></i> when they really mean <b><i>Wow!</i></b>. They've had to go out of their way to make sure that broken code gets displayed okay.

If everyone wrote XHTML, that wouldn't be valid. Think about how much more reliable and fast the browsers could be.

I think this dispute about XHTML is really caused by the fact that the purity of coding one can envision with XHTML and CSS is more promise than reality at this point.

Well, yes, that's true -- to a point. For most web sites you can probably do everything you need with XHTML and CSS today. Support for advanced functionality is spotty (as is support for even a few "core" things in NN4), but unless we developers keep pushing those limits, what reason will Microsoft and Netscape have to add more support?

To support your point you add "and if Amazon used xhtml then everyone could programmatically steal all their content really easily!"

Maybe I'm a bit dense, but how does XHTML make stealing content really easy? XHTML tags look just like well-written HTML tags. There's no context like you have with XML. How would parsing through all of that be any easier than parsing through standard HTML?

BTW, this is a great topic! Way to get everyone involved!

bird · msg:580722 · 4:03 pm on Sep 6, 2002 (gmt 0)

To support your point you add "and if Amazon used xhtml then everyone could programmatically steal all their content really easily!"

Actually, that's exactly what Amazon wants you to do. They have even gone the extra mile to provide a pure XML feed [associates.amazon.com], in order to make it especially easy and painless for you to "steal their content".

There goes another "argument"... ;)

mivox · msg:580723 · 6:49 pm on Sep 6, 2002 (gmt 0)

Do what works for you, but writing off 8% of your users because of your standards-compliant Jihad is not a decision that I'll join.

Perhaps you missed my post earlier about building a valid XHTML Transitional site that both functions and looks fine in NN4?

Saying that XHTML doesn't work in NN4 is just not true. You should quit trumpeting it as though it were.

PeterD · msg:580724 · 8:32 pm on Sep 6, 2002 (gmt 0)

Great thread. I'm still trying to come to some sort of conclusion about XHTML. If the problem is that people write bad HTML, won't they just write XHTML that is just as bad? And if it's that the strictness of XHTML simply doesn't allow bad coding (which I would find surprising -- people are able to find bad ways to do anything), won't a lot of less-motivated people just stick with something very forgiving, like HTML 4.01 Transitional? (I assume we'll never see the day when people can't look at pages written in HTML 4 on the desktop -- even if they won't be able to on web-enabled vacuum cleaners!)

madcat · msg:580725 · 8:54 pm on Sep 6, 2002 (gmt 0)

Yeah... there will always be different camps. Just pick the right method for you and don't worry about anyone else. Try to know the past so you can make intelligent choices for the future -- that's nothing new; nobody can win this argument today.

and on and on...

[edited by: madcat at 11:20 pm (utc) on Sep. 6, 2002]

Pushycat · msg:580726 · 9:58 pm on Sep 6, 2002 (gmt 0)

FWIW, I just finished converting a 200+ page website to XHTML 1.0 Transitional. What I found was that it's just as easy to write bad code in XHTML as it is in HTML. What makes ALL the difference is validation. After doing the conversion AND validating the pages, I now have 100% valid XHTML pages that also work in NN4.

Aside from the personal satisfaction it gave me, it also gives me peace of mind knowing that my pages should work well and look nice regardless of what browser my visitors are using.

If you're looking for some more tangible benefit, I anticipate having to spend less time dealing with website issues because now I am confident that whatever the problem is it can't be my code so I can just tell the complainer that it must be their browser and be done with it. :)

gph · msg:580727 · 11:00 pm on Sep 6, 2002 (gmt 0)

The separation of content from display is a meaningless point.

The content is separated, it's in the database.

I'm a novice and know nothing of databases, but it seems to me that content stored that way can only ever be separate. No matter the vehicle, if the content differs from page to page, it can't be housed or altered globally.

Separating display from content has to be a step forward for any site with similar pages.

Regarding backward compatibility: as a novice I've found it "relatively" simple to create XHTML 1.0 Strict pages that look and work fine in 4x browsers, using CSS to alter layout. Just comparing what was used to create pages in the past with what can be done now was enough to convince me, not to mention the relative assurance of forward compatibility.
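In case it helps other novices, the main trick I leaned on is that the 4x browsers ignore @import, so they only ever download the simple stylesheet (a sketch with made-up filenames, not my real site):

  <link rel="stylesheet" type="text/css" href="basic.css" />
  <style type="text/css">
    /* NN4 doesn't understand @import, so only newer browsers
       fetch the full layout rules (made-up filename) */
    @import url("layout.css");
  </style>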

mattur · msg:580728 · 12:40 am on Sep 7, 2002 (gmt 0)

how does XHTML make stealing content really easy

xhtml is html reformulated as xml. xml, and hence xhtml, can be easily retrieved and processed using an xml parser such as MS's MSXML 4.0. Or any other parser. You can do the same with plain old html (screen scraping); xhtml just makes it a bit easier.

That's why the w3c have put an "x" in front of "html". The idea is that the semantic web will have all these machines semi-autonomously pulling x(ht)ml files and processing them without human intervention, using the semantic hints in the markup. That's why the limitations inherent in shared meta-information spaces are relevant. That's why the w3c have reformulated html as xhtml: xhtml is the "bridge" between html and the proposed semantic web's xml.

They have even gone the extra mile to provide a pure XML feed

That's right. As I said above: web pages for humans, SOAP for machines. We don't need xhtml. SOAP/xml-rpc are ideal solutions for this kind of thing.

I s'pose if you are unfamiliar with xml then xhtml is a straightforward introduction. If you are a developer then you're probably already familiar with xml, because it is everywhere - already.

I think the w3c are blinkered - AIUI they're attempting to retrofit the existing dominant technology for new purposes. I want to see the Next Big Thing that replaces the web. It'll probably have its own protocol, and the clients will still support/subsume http, ftp, gopher etc, in the same way that browsers subsumed (sort of) gopher and ftp. But it'll be something new. If you're struggling to get folks to upgrade their browser, why not instead get them to install a new client that does something new when they start it?

I'm doing my bit to be The One who thinks up the Next Big Thing: I've assembled some resources including several PCs, a LAN, an internet connection, literally dozens of O'Reilly books, a copy of How Buildings Learn, The Matrix (Collector's Edition) on dvd and a pretty-damn-big widescreen Sony telly. I'll keep you all updated on my progress. So far, all I've done is watch The Matrix. 137 times. So far. But I'm hoping for a major breakthrough any day now... in the meantime my money's on Dave Winer coming up with something ;)

copongcopong · msg:580729 · 4:54 am on Sep 7, 2002 (gmt 0)

This was my earlier post in another thread here...

Either you're part of the solution (and go with the standards) or... part of the problem (and that is why browsers still have their quirks modes).

Browsers' quirks modes remain, and will remain, for badly coded sites...

This is a battle between the coders who believe that, in reality, standards will not fit all of their target audience, due to old browsers and browsers' differing implementations of the standards... and the coders who believe in the standards even though, in reality, 100% is not achievable -- which I personally believe.

Well, I am one of those coders (dreamers) who believe in the standards... why?

It would be better to have a common goal and hit it partially than to have no goal at all and code for a specific browser.

And before I forget... YES, it's time for XHTML. It is not tomorrow; we do not need to wait for the browsers to catch up with the code... they know their faults and they should be working on them!

peace. :)
