|XML on the Web Has Failed|
Article on xml.com
| 1:46 pm on Aug 7, 2004 (gmt 0)|
|Syndicated feeds are wildly popular, but they're not a success for XML. XML on the Web has failed: miserably, utterly, and completely. |
I read this article with interest. Essentially, it argues that XML on the web doesn't work because of character-encoding issues when serving XML over HTTP, because most XML content currently consumed client-side (RSS feeds in particular) is invalid, and because the tools don't follow the XML spec's rules on error handling. It doesn't address server-side use of XML that is transformed into another format (eg. HTML) before reaching the end user.
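To make the encoding point concrete, here's a minimal sketch (hypothetical feed bytes, not from the article): per RFC 3023, a feed served as text/xml with no charset parameter must be treated as us-ascii, no matter what the XML declaration inside it says - so a perfectly good UTF-8 feed breaks as soon as it contains a non-ASCII character.

```python
# Sketch of the charset mismatch the article describes (assumed example data).
# RFC 3023: "text/xml" with no charset parameter defaults to us-ascii,
# overriding the encoding named in the XML declaration.

feed_bytes = '<?xml version="1.0" encoding="utf-8"?>\n<title>Café</title>'.encode("utf-8")

# What the publisher intended:
print(feed_bytes.decode("utf-8"))  # decodes cleanly

# What a strictly conforming consumer must do when the HTTP header
# said only "Content-Type: text/xml" with no charset:
try:
    feed_bytes.decode("ascii")
except UnicodeDecodeError as e:
    print("fatal: not valid us-ascii:", e)
```

The XML declaration is telling the truth, but the transport-level default wins - which is exactly why so many technically correct feeds fail in strict consumers.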
So do you agree with the sentiment, or is XML really a failure on the web?
| 5:33 pm on Aug 7, 2004 (gmt 0)|
Well, I look at it like this. True, XML over HTTP does have some issues. Especially if you use ASP: when you load the returned XML into your DOMDocument or FreeThreadedDOMDocument, it gets forced into UTF-16.
Also, there are so many xml feeds out there that aren't validated against a DTD or Schema.
Now, XML/XSL client-side, well, let's not get into that. That is utter failure right there.
I do believe that XML has made great improvements to online systems such as travel. Especially in the realm of affiliate marketing, if you have a good xml feed, then things are great.
The beauty of it is, I can have an application that is built in Java, but then have another program interface with it.
XML is portable across systems. It's great!
| 10:45 pm on Aug 7, 2004 (gmt 0)|
You're certainly right about the failure of XML/XSL on the client side - common sense killed that right off. Also, IE's continued lack of support for application/xhtml+xml is helping to kill XML-style XHTML too. The latter is probably a blessing in disguise, for the same reasons XML/XSL is failing now - it's the wrong cure for a non-existent problem (draconian failure-handling is crazy client-side - one false move and your page is completely inaccessible).
However, I found the total rejection of XML on the web in the article too generalistic - as you said, XML is great server-side and machine-to-machine as a cross-platform solution to data sharing and transfer.
| 12:52 am on Aug 8, 2004 (gmt 0)|
|(draconian failure-handling is crazy client-side - one false move and your page is completely inaccessible) |
I disagree, strongly. Life would be so much simpler if people expected pages with illegal syntax (i.e. elements not properly closed, or closed in the wrong order) not to render. I consider the need to cope with buggy markup one of the major factors behind browsers becoming bloated, more resource-intensive, and downright buggy themselves. There should be no need to deal with ambiguity.
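For anyone who hasn't seen it in action, this is exactly how a conforming XML parser behaves today (a minimal sketch with made-up markup): one unclosed tag and the entire document is rejected, with no partial render at all.

```python
import xml.etree.ElementTree as ET

good = "<page><p>hello</p></page>"
bad = "<page><p>hello</page>"  # <p> is never closed

# Well-formed input parses normally.
print(ET.fromstring(good).tag)  # prints "page"

# One mistake and draconian handling kicks in: a fatal error,
# nothing rendered, the whole page is inaccessible.
try:
    ET.fromstring(bad)
except ET.ParseError as e:
    print("fatal:", e)
```

That all-or-nothing behavior is the whole debate in miniature: wonderful discipline if everyone writes perfect markup, brutal for users when anyone slips.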
| 7:50 pm on Aug 9, 2004 (gmt 0)|
Wouldn't the world be a happier place if everyone adhered to every standard, and standards didn't say silly things like "clients will NEVER send **** requests, but you must support them just in case they do"? I certainly think it would be better! :)
Change the old saying from "be strict in what you send and relaxed in what you receive" to "be strict in what you send and strict in what you receive"
| 8:09 pm on Aug 9, 2004 (gmt 0)|
I don't agree. XML has been wildly successful for us. However, you should see some of the garbage that people try to pass inside an XML feed - not just occasional errors, but attempted HTML formatting within the feeds, and this from some well-known sources. It's horrible!
We now automatically normalize all incoming feeds to strip out the garbage, and then re-normalize our own outgoing feeds just in case our systems miss something.
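The kind of normalization described above can be sketched like this (a hypothetical example, not AmeriClicks' actual pipeline): the most common garbage is raw HTML and bare ampersands dropped straight into a feed element, and escaping the payload before re-serializing keeps the surrounding XML well-formed.

```python
from xml.sax.saxutils import escape
import xml.etree.ElementTree as ET

# A hypothetical incoming item: HTML markup and a bare "&" dumped
# into a description would break any strict parser downstream.
raw_description = 'Cheap <b>flights</b> & hotels'

# Normalize: escape the payload so the feed element stays well-formed.
item = "<item><description>{}</description></item>".format(escape(raw_description))

parsed = ET.fromstring(item)
# The original text round-trips safely through the escaped form.
print(parsed.find("description").text)
```

Real normalization obviously has to do much more (tag stripping, encoding repair), but escaping untrusted payloads is the core move that keeps a re-published feed parseable.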
| 9:19 pm on Aug 9, 2004 (gmt 0)|
|There should be no need to deal with ambiguity. |
I recall an interview with an IE developer that covered this issue. From the browser developer's point of view, you have to deal with ambiguity if you want your browser to gain or maintain significant market share.
Your "average joe" knows absolutely nothing about what goes on behind the scenes when they type in a website address. If a web page does not display correctly, they blame the browser, in exactly the same way that they blame their television set when they can't get a picture.
| 12:30 pm on Aug 10, 2004 (gmt 0)|
AmeriClicks, you are right about news feeds, but your example is precisely why the article's author considers that XML has failed. According to the XML specification, a malformed feed should simply not display at all: once a fatal (well-formedness) error is detected, the processor must stop passing normal content to the application and report the error instead.
You are using XML, but not treating it as such - a true case of ideals versus the real world. In a technical sense, you are wrong to parse invalid feeds (because it is against the specification), but in reality, you are working around the errors and displaying the results anyway (you've got a business to run, after all).
This ties in with dmorison's comment about the IE developer: XML was supposed to enforce well-formedness everywhere, because HTML was seen as encouraging invalid markup - but in reality, "tag soup" XML is thriving.
The IE developer is right - if client-side XML were to take off, a browser which broke the specification and displayed malformed XML anyway would take the lion's share of the market, because in the real world coding errors happen even when you are trying to keep things valid, and users just want the information. Those who believe 100% validity is achievable all the time tend not to have any experience running a site with third-party markup (from banner ads, for example), user comments (a source of character-encoding errors), or any kind of interactivity.
| 7:23 pm on Aug 10, 2004 (gmt 0)|
I really don't see a future for client-side XML. As you said, you have to deal with third-party junk, which really is bad. The future I see for XML on the web is in affiliate programs and feeds for content sites. At least those are pretty solid.
As affiliates grow larger and technically smarter, they are going to demand such technologies to escape the cookie-cutter sites that plague many search engines. (They either learn how to do it themselves - bad for me - or hire someone who can build it for them.)
Now, I do know of a site (before it was bought out...) that was great. It actually used XSL for the whole site, client-side. I don't know how they accomplished it, but it sure worked.
| 3:46 am on Aug 15, 2004 (gmt 0)|
|However, I found the total rejection of XML on the web in the article too generalistic - as you said, XML is great server-side and machine-to-machine as a cross-platform solution to data sharing and transfer. |
I heartily agree with this statement. Machine-to-machine communications can and should be expected to follow standards and validate properly. XML is an excellent format for this type of communication. I don't see it being dropped for another format anytime soon.