I've just started looking into this. I get how to create an XML document, schema, stylesheet, etc. What I don't understand is what happens next, i.e. how the content would then be delivered to a variety of other websites.
* Would I have to use a special xml server?
* Do outside hosting companies even have such things?
* Is there an xml server for dummies tutorial somewhere?
* How do other websites know how to display the content?
I've tried googling this and can only find sites that are WAY beyond my comprehension. I really can't run my own server. I use a hosting company now. Any insight is greatly appreciated.
You don't need a special kind of server!
You've created an HTML website, right... and you sent your HTML files to the server, probably via FTP?
Put your XML files on the server just like that. Then let your syndicates know where to find them. There's no magic, no extra software, nothing complicated. They're just files that end with *.xml
The sites that want to use your content will request those XML files, and your server will deliver them, just like it does with the HTMLs and GIFs and CSSs etc.
Now, XML content delivery *can* be much more complicated than that - many sites these days use a Content Management System (CMS) which uses server-side scripting to generate HTML, and many will also deliver the same content in XML. And then there are XML Web Services, which offer specially-packaged data in XML format; then you've got your "feeds", where the same URL will offer XML that changes frequently. RSS is an example of a feed.
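A feed is just such an XML document, republished at a fixed URL. A stripped-down RSS 2.0 channel looks roughly like this (the titles and URLs are invented for illustration):

```xml
<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>XYZ.com Stories</title>
    <link>http://XYZ.com/</link>
    <description>Latest editorial stories</description>
    <item>
      <title>Our newest story</title>
      <link>http://XYZ.com/stories/newest.html</link>
    </item>
  </channel>
</rss>
```

Each time you publish, you add a new `<item>` and re-upload the file; subscribers re-request the same URL and see the change.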
But at its most basic, you can syndicate your content just by creating some XML files in Notepad, and uploading them to the server alongside your HTML files.
You might consider putting the XML files beside the HTML ones, in the same folders. Then for each article in HTML, you'll have one in XML as well:
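Something like this, say (the filenames are just an illustration):

```
/articles/top-10-products.html
/articles/top-10-products.xml
/articles/holiday-gift-guide.html
/articles/holiday-gift-guide.xml
```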
see what I mean?
But I guess where I'm confused is - how would another server display the files?
To use your example, if there's a file called top-10-products.xml on my server, how would someone else's server create a page that had that data in it? (In googling, I've come across terms like parse and query, but I'm clueless as to what that even means)
parse = "pick it apart and use bits" - it's like "dissect" and "interpret". If you parse a page of text, you are identifying words on the page you recognize. If you parse your laundry, you're finding the socks. When you parse XML, it's reading the document, in order to make use of parts of it.
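To make that concrete, here's a tiny Python sketch, sticking with the laundry metaphor (the document and tag names are made up):

```python
import xml.etree.ElementTree as ET

# A toy XML document: a pile of laundry.
doc = ET.fromstring(
    "<laundry>"
    "<sock colour='red'/><shirt/><sock colour='blue'/>"
    "</laundry>"
)

# Parsing = walking the document tree and keeping the bits you recognize.
# Here: finding the socks.
socks = [sock.get("colour") for sock in doc.findall("sock")]
print(socks)  # ['red', 'blue']
```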
here's a hypothetical example
You're a journalist for XYZ.com, creating a new editorial story every few days.
You tell your syndicates that the latest story will be published at XYZ.com/feed/xyz.xml
Let's say you have a syndicate, named ABC.com
They want to republish the latest XYZ story on their site, and they want it as an XML feed.
Your job is easy - you just have to put XML up on your site. ABC has the hard part - they would need a server-side script that knows how to query and parse XML. That means:
1) someone visits the ABC.com home page,
2) their browser sends an HTTP request to server ABC
3) the request at ABC triggers the execution of some software
4) ABC server makes an HTTP request to XYZ.com
5) XYZ responds with XML
6) ABC's server parses the XML, to get the article title, subtitle, body, footnotes, etc
7) ABC inserts content into an HTML page template
8) ABC sends HTML as an HTTP response back to the user, where the browser renders it on their screen
So. That way if you change the content of your XML, it will automatically show up on their site. No maintenance required. That's a very typical scenario, illustrating the beauty of XML content syndication.
Thankfully with modern programming languages like PHP, Perl, Python, C#, etc this whole scenario can be written in about 3 lines of code, no one needs to tinker with TCP/IP, and accidental electrocutions are rare.
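For instance, here's a minimal Python sketch of steps 4 through 8 (the feed URL and the `<story>`/`<title>`/`<body>` tag names are made up for the example):

```python
import urllib.request
import xml.etree.ElementTree as ET

def render_story(xml_text):
    # Step 6: parse the XML to pull out the pieces we want.
    story = ET.fromstring(xml_text)
    title = story.findtext("title", default="")
    body = story.findtext("body", default="")
    # Step 7: pour the content into an HTML page template.
    return "<html><body><h1>%s</h1><p>%s</p></body></html>" % (title, body)

def fetch_and_render(url):
    # Steps 4-5: ABC's server requests the XML, and XYZ responds with it.
    with urllib.request.urlopen(url) as response:
        return render_story(response.read())

# Trying it on a canned story instead of a live URL:
sample = "<story><title>Hello</title><body>Lorem ipsum.</body></story>"
print(render_story(sample))
```

Step 8 is just the server sending that HTML string back to the visitor's browser.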
Here's one of my favourite examples:
an XML file with XSLT and CSS:
an XML file with no XSLT or CSS:
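The idea behind those two examples: an XML file can name an XSLT stylesheet in a processing instruction, and the browser will transform it for display; without that instruction, you just see the raw tag tree. A minimal made-up pairing looks like this:

```xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="story.xsl"?>
<story>
  <title>Hello</title>
  <body>Lorem ipsum.</body>
</story>
```

Delete the second line and the same file renders as plain, unstyled XML.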
and question two:
I would give the syndicate a list. Or you can even create an HTML index page with links to each of them.
it's really hard to give good advice knowing so little about your particular site, but these are tactics that are generally good in practice
2) your only concern should be: don't put any secret or sensitive data in the XML. True story: I once consulted for a company that was delivering an XML feed. Sadly, some smart developer left the script in "debug mode", which exposed procedural tracing messages, and more data than they should have -- including environment variables, admin messages, and even a database connection string containing their root DBA password. Not only did it make the XML unusable, but it presented a real security concern. Luckily the problem was discovered before someone exploited it, and no harm was done. But it could have been disastrous.