| 8:10 pm on Jul 14, 2008 (gmt 0)|
|mysql with remote connections enabled. |
Don't do that. You'll be sorry (You'll find out why MySQL always likes using "localhost" as the server).
Personally, I don't like SOAP, but that may have more to do with my not encountering any decent implementations.
I like plain old XML. In fact, I'm designing a system right now that will emit not only XML but XHTML, so it can be embedded directly into pages. It might be a bit "wordier" than custom XML (which I have used in the past), but it will be a thousand times more useful. That's important, because it will be an API.
| 8:46 pm on Jul 14, 2008 (gmt 0)|
It depends. XML-RPC shouldn't really come up as an option...
Which language do you plan to build the webservice in?
SOAP is not really worth talking about in PHP, and providing a service can mean some hard work in Perl.
REST is fairly easy to do in any language, and you can define the output format however you want.
I also second cmarshall's advice: never give direct access to your database.
| 8:55 pm on Jul 14, 2008 (gmt 0)|
I am biased.
I am a very big fan of REST.
So even when SOAP is a feasible option, I rarely choose it. Ever.
The only time I use SOAP is when a web service provider forces me to, and that's with quite a lot of grumbling, arm-twisting and complaining.
(ducks to avoid flying bars of SOAP)
I have yet to find a web service that can't be done with REST, and it's the easiest to accomplish in PHP; REST also plays nicely with AJAX, cURL, and other URL-requesting mechanisms.
If this is a vote, I'm putting in 1 vote for REST
| 9:06 pm on Jul 14, 2008 (gmt 0)|
I am most comfortable with PHP, but will learn anything! Priority here is ease of use on the client side.
At the most basic level I want to execute queries on my MySQL tables and return the results so that clients can list them in their static or dynamic pages.
I also want to let users insert data along with a keyword. After the inserted data is processed, the result can be retrieved using that keyword.
Can I accomplish this all with REST? Is it fast? Or would I need to implement some sort of cache?
| 9:08 pm on Jul 14, 2008 (gmt 0)|
Oops, I meant: would clients requesting info need some sort of cache?
| 2:16 pm on Jul 15, 2008 (gmt 0)|
Well, I read up on REST last night; it looks good for sharing results from a single query, but not for multiple queries.
The best idea I could come up with is something similar to PayPal's Instant Payment Notification, where my partner websites submit the information they want processed and my server then sends the processed data to their PHP page for database insertion.
I guess I'll start coding this soon... Unless somebody knows of an already-made script I can modify that does this?
| 2:45 pm on Jul 15, 2008 (gmt 0)|
remember, REST uses plain HTTP, so you can make requests using the standard methods: GET, POST, PUT, etc.
When your client has a simple request with minimal parameters, your REST service should accept it via GET.
When your client is sending piles of information for processing, your REST service should accept it via POST.
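That GET/POST split can be sketched in a few lines of PHP. This is a minimal illustration, not a full service: the function name, the "keyword" parameter, and the return strings are all made up here.

```php
<?php
// Minimal sketch of the GET/POST split described above. The function
// name and the "keyword" parameter are invented for illustration.

function handle_request($method, $params) {
    if ($method === 'GET') {
        // simple lookup with minimal parameters
        $keyword = isset($params['keyword']) ? $params['keyword'] : '';
        return "result-for-" . $keyword;
    }
    if ($method === 'POST') {
        // bigger payloads for processing arrive via POST
        return "accepted " . count($params) . " fields";
    }
    return "unsupported method";
}

// in a live script you would dispatch on the real request:
// echo handle_request($_SERVER['REQUEST_METHOD'], $_REQUEST);
```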
| 4:47 pm on Jul 15, 2008 (gmt 0)|
Yeah, but all of those HTTP requests take a while to process. I don't think my partner sites would want their pages to load slowly while these requests execute. I think a better alternative would be to let my partner sites have database tables of their own that I can write to. So basically I want to make sure that the records in their tables are synchronized with the information in my table.
Is this what they call duplication?
| 11:47 pm on Jul 15, 2008 (gmt 0)|
duplication it is, though the usual term is replication, at least if you really want to share your database and you can trust those sites.
on the other hand, depending on how often the content changes, you could still do it with REST.
e.g. your partner would send an HTTP request and you'd simply reply with plain text, giving the URL. there's virtually no protocol overhead like there is in SOAP, they don't have to parse XML to get the info, and you don't have to hassle with synchronizing databases, especially if you don't host in the same rack, or at least the same datacenter, and your database changes often.
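A hypothetical version of that exchange, on the partner's side: the endpoint address and the "keyword" parameter below are invented for illustration, not taken from the thread.

```php
<?php
// Hypothetical partner-side request. The endpoint URL and the "keyword"
// parameter are made-up examples.

function build_request_url($base, $keyword) {
    return $base . '?keyword=' . urlencode($keyword);
}

$url = build_request_url('http://example.com/api.php', 'blue widgets');
// the partner would then fetch it, e.g. with file_get_contents($url)
// or cURL, and the reply body would just be a plain-text URL to use.
```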
remember, amazon's affiliate system works this way (only with much more overhead) and I've built quite a few sites with it. If done correctly, you'll hardly notice that the content is not coming from a local database.
| 12:57 am on Jul 16, 2008 (gmt 0)|
worried about performance? you could employ caching [en.wikipedia.org]...
But if you really want to make your clients do all the work, then by all means export your tables as CSV and put them somewhere for download, refreshing those copies on a schedule via cron. Some affiliate programs expose huge product catalogs that way, usually in a secure folder that requires authentication.
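The CSV-export idea is only a few lines in PHP. A sketch, where the `$rows` array stands in for a real MySQL result set and a cron job would write the output to the protected download folder:

```php
<?php
// Sketch of the CSV-export idea: dump query results to CSV text that a
// cron job could write to a protected download folder. The row data is
// a stand-in for a real MySQL result set.

function export_csv($rows) {
    $fh = fopen('php://temp', 'r+');
    foreach ($rows as $row) {
        fputcsv($fh, $row);   // handles quoting and escaping for you
    }
    rewind($fh);
    $csv = stream_get_contents($fh);
    fclose($fh);
    return $csv;
}
```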
| 2:56 pm on Jul 16, 2008 (gmt 0)|
So, it's between these two approaches:
Whenever I update my tables, I also POST the info to an affiliate site. Their script will update their tables and report back to me. Sync accomplished.
Affiliate sends a GET message to my server asking for all the different URLs they need. I POST back the information. No sync needed.
Which method is easier for my affiliate to implement? Is it difficult to retrieve this REST info and parse it?
| 3:11 pm on Jul 16, 2008 (gmt 0)|
I'd say they're both easy to build, but REST is more stable and easier to maintain. let's say your database grows and you have more changes than you do now, and let's add a few more affiliates. if you want them all to be up to date, you'd have a steady stream of updates going to each and every one of them, even though they might not need them. in the REST case, they'll only request the queries they need, and with short caching on their side to speed things up, it's gonna work just fine.
and btw, just to be clear, you don't POST back the information; you send it in the response to the GET request, so there's just one request, not two.
the body of the REST message is up to you. you can send plain text, CSV data, XML, JSON or whatever you can think of. I'd choose a format that strikes a good balance between flexibility and easy parsing. XML is very flexible, but parsing it can be a painful mess for someone who's not that fluent in PHP. plain text, for example one URL per line, is very easy to parse, but you're locked into that format once you deploy, or you might break your affiliates' scripts.
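To show just how easy the one-URL-per-line option is to consume, here's a sketch of the affiliate-side parsing; `parse_url_feed()` is a made-up helper name.

```php
<?php
// Sketch: parsing a plain-text feed with one URL per line.
// parse_url_feed() is a made-up helper name.

function parse_url_feed($body) {
    $lines = array_map('trim', explode("\n", $body));
    // drop blank lines and reindex the array
    return array_values(array_filter($lines, 'strlen'));
}
```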
| 3:26 pm on Jul 16, 2008 (gmt 0)|
Guess I'm gonna have to start reading this book on REST tonight. You say I should implement this using SOAP as well?
| 5:30 pm on Jul 16, 2008 (gmt 0)|
Don't bother with the SOAP!
And I don't like the POST idea either. Too much maintenance, on both ends of the transaction. Not only do you have to maintain a list of which URLs to POST the data to, but they also have to have a handler on their end that receives the request and does something with the POST. And what if their server is down when your POST arrives? They'll reboot, then have to wait until you POST it again. Bah. Though I'll admit it's brilliant that you're examining every possible angle here, that's not a road worth following.
It is worthwhile to crack a book on REST to get thoroughly familiar with the philosophy, tautology, phrenology and dendrology of REST, but REST is deceptively simple and can be explained in a pamphlet. It's based on normal, everyday HTTP.
A REST service can be written in just a few lines.
$param = $_GET['incomingparameter'];
$output = get_the_data_they_need($param);
Of course that second line there is overly simplified...
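To make the flow of those two lines visible end to end, here's a sketch that stubs out the lookup: `get_the_data_they_need()` is the placeholder name from the post, and the lookup table here stands in for a real database.

```php
<?php
// Filling in that second line: get_the_data_they_need() is the
// placeholder from the post; the array stands in for a real database.

function get_the_data_they_need($param) {
    $fake_db = array('widgets' => 'http://example.com/widgets.html');
    return isset($fake_db[$param]) ? $fake_db[$param] : '';
}

// live version:
// header('Content-Type: text/plain');
// echo get_the_data_they_need($_GET['incomingparameter']);
```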
| 9:48 pm on Jul 16, 2008 (gmt 0)|
hmmm... It seems the biggest quandary I'll have is deciding how to output the data. If I'm just serving URLs, maybe just use key=value pairs?
I don't know much about XML, but it seems a lot more flexible, expandable, and descriptive. I guess I can figure it out and code up a script for affiliates that parses it. Is that common practice?
| 9:54 pm on Jul 16, 2008 (gmt 0)|
not common, but good. most sites that offer affiliate programs won't provide a working script, just documentation. you'll make your affiliates happy if you give them something they can start expanding. watch out for code quality, though; it would be very bad for their trust in you if your script contained any security issues.
| 12:32 am on Jul 17, 2008 (gmt 0)|
If you offer it as XML, it is easily digestible by every imaginable platform. It's queryable (with XPath), compressible (with gzip), transformable (with XSLT).
If it were me, I'd consider offering the feed in XML, JSON, and CSV. Once you have the mechanism for getting the data, rendering it in each format is really simple.
All it takes is an extra parameter on the query:
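A sketch of what that format parameter could look like: the parameter name, the row shape (arrays with a "url" key), and the exact markup are all assumptions for illustration.

```php
<?php
// Sketch: one dataset rendered as XML, JSON, or CSV based on a "format"
// parameter. The row shape (a "url" key) is assumed for illustration.

function render_feed($rows, $format) {
    switch ($format) {
        case 'json':
            return json_encode($rows);
        case 'csv':
            $out = '';
            foreach ($rows as $row) {
                $out .= implode(',', $row) . "\n";
            }
            return $out;
        default: // xml
            $xml = "<items>";
            foreach ($rows as $row) {
                $xml .= '<item url="' . htmlspecialchars($row['url']) . '"/>';
            }
            return $xml . "</items>";
    }
}

// e.g. echo render_feed($rows, isset($_GET['format']) ? $_GET['format'] : 'xml');
```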
| 1:39 am on Jul 18, 2008 (gmt 0)|
You can use PHP or C# to create a web service; it is simple.
[edited by: tedster at 5:34 am (utc) on July 18, 2008]
| 1:00 pm on Jul 18, 2008 (gmt 0)|
That's what everybody tells me. I wonder why all these web services books are so thick!
| 1:59 am on Jul 20, 2008 (gmt 0)|
I can't believe everyone is telling me how simple this is. I am reading a book on using xml for webservices. It is overwhelming and uninteresting...all this crap about DTDs, etc...
| 2:15 am on Jul 20, 2008 (gmt 0)|
|It is overwhelming and uninteresting...all this crap about DTDs, etc... |
Welcome to the world of XML. Compared to a great many languages, including HTML and CSS, it is not so bad.
In general, it's a lot easier to create XML than to consume it. However, there is a lot of support at the language level for it. Most languages have built-in XML parsers, and that helps a lot.
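As an example of that language-level support, PHP ships with the SimpleXML extension, which makes consuming a small XML feed painless. The feed contents here are invented for illustration.

```php
<?php
// Consuming XML with PHP's built-in SimpleXML extension. The feed
// contents are a made-up example.

$xml_string = '<items><item url="http://example.com/a.html"/></items>';
$doc = simplexml_load_string($xml_string);

$urls = array();
foreach ($doc->item as $item) {
    $urls[] = (string) $item['url'];   // attribute access, cast to string
}
// $urls now holds each item's url attribute
```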
If you are serious about doing webservices, then I'd highly recommend getting familiar with XML.
The biggest problem with XML is that there is both too much information about it, and not enough. I am constantly running into this.
If you google "Learn XML", you will get a lot of hits. However, upon closer examination, you'll find that 90% of them are the same few articles, rebranded and reposted. In some cases, they are slightly rewritten, but the essentials remain the same.
The people who deal with XML all the time are standards geeks. They have a difficult time returning to their roots, and all their writing starts at the halfway point. This is what makes technical writers who can present information at its most fundamental level so valuable.
| 8:43 pm on Jul 20, 2008 (gmt 0)|
Until you need XML validation within an application (a rarity), you can skip the entire chapter on DTDs. I've been making web services and XML-based apps for many years, and I've only authored a DTD twice. The first time was because I wasn't using the technology right (I needed to redefine the ID attribute used by the getElementById() method); the second time was really just to be anally thorough on a hobby project, where I decided to offer my XML with a DTD, an XSLT stylesheet with CSS, a Schema, and all the other optional trappings.
DTDs do come in handy sometimes, and eventually you might need to learn them. But on a simple web service, and your first one? Skip it.
| 2:43 am on Jul 21, 2008 (gmt 0)|
cmarshall, I couldn't disagree with you more! I have never started to learn a language and been so turned off or found it so difficult. It may be simple, but it's completely uninteresting. I could spend days on PHP, just because it's so much fun to learn, no matter the challenge.
I don't really like to use a technology without some level of expertise. I guess I'm a real geek that way. But, as you say, httpwebwitch, for what I'm doing as a first-timer it makes sense to take the easy way out.
One thing I am concerned about, though, is errors. What if my server is unavailable and can't deliver, or sends back an error message where my content should be? Is that something the client checks for when parsing the XML I deliver?
| 7:14 am on Jul 21, 2008 (gmt 0)|
yes, error checking has to be done on the client side. of course, you should include some status element if you can have multiple error sources (such as "sorry, you're not allowed", "general error", "empty keyword"), but checking whether anything came back at all is definitely the client's job; if there's a network problem, your server couldn't know, since it wouldn't even get the request in the first place.
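A client-side sketch of both checks, the "nothing came back" case and the "server reported an error" case. The `<status>` element name and the return strings are assumptions for illustration.

```php
<?php
// Client-side check sketch. The <status> element name and the returned
// messages are made-up assumptions; the point is checking both "nothing
// came back" and "server-side error".

function check_response($body) {
    if ($body === false || trim((string) $body) === '') {
        return 'no response';          // network failure or empty reply
    }
    $doc = @simplexml_load_string($body);
    if ($doc !== false && isset($doc->status) && (string) $doc->status !== 'ok') {
        return 'server error: ' . (string) $doc->status;
    }
    return 'ok';
}
```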
| 2:19 pm on Jul 23, 2008 (gmt 0)|
yeah, I'll let the client deal with parsing and error checking for now. maybe when I get some time, I'll write some example scripts they can use.