Forum Moderators: httpwebwitch
This would be a simple service that returns a relevant URL to a client's query. If no URL is found, the query is sent for further processing.
Thx for any advice!
MySQL with remote connections enabled.
Don't do that. You'll be sorry (You'll find out why MySQL always likes using "localhost" as the server).
Personally, I don't like SOAP, but that may have more to do with my not encountering any decent implementations.
I like plain old XML. In fact, I'm designing a system right now that will not only emit XML, but XHTML, so it can be embedded directly into pages. Might be a bit "wordier" than custom XML (Which I have used in the past), but it will be a thousand times more useful. That's important, because it will be an API.
I also second cmarshall's advice: never give direct access to your database.
(ducks to avoid flying bars of SOAP)
I have yet to find a web service that can't be done with REST, and it's the easiest to accomplish in PHP; REST also plays nicely with AJAX, cURL, and other URL-requesting mechanisms.
If this is a vote, I'm putting in 1 vote for REST
At the most basic level I want to execute queries on my MySQL tables and return the results so that clients can list them in their static or dynamic pages.
I also want to let users insert data along with a keyword. After the inserted data is processed, the result can be retrieved using that keyword.
Can I accomplish this all with REST? Is it fast? Or would I need to implement some sort of cache?
The best idea I could come up with is something similar to PayPal's Instant Payment Notification, where my partner websites submit the information they want processed, and then my server sends the processed data to their PHP page for database insertion.
I guess, I'll start coding this soon... Unless somebody knows of an already made script I can modify that does this?
When your client has a simple request with minimal parameters, your REST service should accept it with GET.
When your client is sending piles of information for processing, your REST service should accept it as POST.
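That GET/POST split could be sketched in a single PHP endpoint along these lines; the function and parameter names here are made up for illustration, not part of any existing API:

```php
<?php
// Hypothetical one-file REST endpoint: small lookups come in as GET,
// bulk submissions as POST. All names are illustrative.

function handle_request(string $method, array $params): string
{
    if ($method === 'GET') {
        // Minimal-parameter request: look up by keyword, reply in plain text.
        $keyword = $params['keyword'] ?? '';
        return 'result-for:' . $keyword;
    }
    if ($method === 'POST') {
        // Pile-of-information request: accept the payload for processing.
        $payload = $params['data'] ?? '';
        return 'accepted:' . strlen($payload) . ' bytes';
    }
    return 'unsupported method';
}

// In a live script you'd wire it to the superglobals:
// echo handle_request($_SERVER['REQUEST_METHOD'], $_GET + $_POST);
```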
Is this what they call duplication?
on the other hand, depending on how often the content changes, you could still do it with REST.
e.g. your partner would send an HTTP request like
and you'd simply reply with plain text, giving the URL. There's virtually no protocol overhead as there would be in SOAP, he doesn't have to parse XML to get the info, etc., and you don't have to hassle with synchronizing databases, especially if you don't host in the same rack (or at least the same datacenter) and your database changes often.
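For instance, the partner's side of such a plain-text exchange could be as small as this; the host name and the "q" parameter are hypothetical:

```php
<?php
// Build the lookup URL for a plain-text REST query. The host and the
// "q" parameter name are invented for this sketch.

function build_lookup_url(string $query): string
{
    return 'http://api.example.com/lookup?q=' . urlencode($query);
}

// The partner would then fetch it and use the body directly:
// $url = trim(file_get_contents(build_lookup_url('blue widgets')));
```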
Remember, Amazon's affiliate system works this way (only with much more overhead), and I've built quite a few sites with it. If done correctly, you'll hardly notice that the content is not coming from a local database.
But if you really want to make your clients do all the work, then by all means export your tables as CSV and put them somewhere for download, and refresh those copies on a schedule or CRON. Some affiliate programs expose huge product catalogs that way, usually in a secure folder that requires authentication.
Whenever I update my tables, I also POST the info to an affiliate site. Their script will update their tables and report back to me. Sync accomplished.
Affiliate sends a GET message to my server asking for all the different URLs they need. I POST back the information. No sync needed.
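The push variant described first could look roughly like this; the affiliate URL, field names, and sync.php handler are invented for illustration:

```php
<?php
// Sketch of the push approach: whenever a row changes, POST it to each
// affiliate's handler URL. Field names and URLs are made up.

function build_sync_payload(array $row): string
{
    // Encode the updated row as a standard form body.
    return http_build_query($row);
}

// Sending it would look roughly like this with cURL:
// $ch = curl_init('http://affiliate.example.com/sync.php');
// curl_setopt($ch, CURLOPT_POST, true);
// curl_setopt($ch, CURLOPT_POSTFIELDS, build_sync_payload($row));
// curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// $report = curl_exec($ch); // affiliate "reports back" in the response body
```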
Which method is easier for my affiliate to implement? Is it difficult to retrieve this REST info and parse it?
The body of the REST message is up to you. You can send plain text, CSV data, XML, JSON, or whatever you can think of. I'd choose a format that strikes a good balance between flexibility and easy parsing. XML is very flexible, but parsing it can be a painful mess for someone who's not that fluent in PHP. Plain text, for example one URL per line, is very easy to parse, but you're locked into that format once you deploy; otherwise you might break your affiliates' scripts.
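As a concrete example, the one-URL-per-line plain-text format is about as easy to parse as it gets:

```php
<?php
// Parse a plain-text reply with one URL per line, skipping blank lines.

function parse_url_list(string $body): array
{
    $urls = [];
    foreach (preg_split('/\r?\n/', $body) as $line) {
        $line = trim($line);
        if ($line !== '') {
            $urls[] = $line;
        }
    }
    return $urls;
}
```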
And I don't like the POST idea either. Too much maintenance, on both ends of the transaction. Not only do you have to maintain a list of which URLs to POST the data to, but they also have to have a handler on their end that receives the request and does something with the POST. And what if their server is down when your POST arrives? They'll reboot, then have to wait until you POST it again. Bah. Though I'll admit it's brilliant that you're examining every possible angle here, that's not a road worth following.
Though it is worthwhile to crack a book on REST to get thoroughly familiar with the philosophy, tautology, phrenology, and dendrology of REST, REST is deceptively simple and can be explained in a pamphlet. It's based on normal, everyday HTTP.
A REST service can be written in just a few lines:
$param = $_GET['incomingparameter'];
$output = get_the_data_they_need($param);
echo $output;
Of course that second line there is overly simplified...
I don't know much about XML, but it seems a lot more flexible, expandable, and descriptive. I guess I can figure it out and code up a script for affiliates which parses it. Is that common practice?
If it was me I'd consider offering the feed in XML, JSON, and CSV. Once you have the mechanism for getting the data, rendering it as each is really simple.
All it takes is an extra parameter on the query:
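A minimal sketch of that dispatch, assuming a hypothetical "format" query parameter (e.g. ?q=widgets&format=json) and rows that each carry a url field:

```php
<?php
// Render the same result rows as XML, JSON, or CSV depending on a
// "format" parameter. All names here are illustrative.

function render(array $rows, string $format): string
{
    switch ($format) {
        case 'json':
            return json_encode($rows);
        case 'csv':
            $out = '';
            foreach ($rows as $row) {
                $out .= implode(',', $row) . "\n";
            }
            return $out;
        default: // fall back to XML
            $out = "<results>\n";
            foreach ($rows as $row) {
                $out .= '  <result url="' . htmlspecialchars($row['url']) . '"/>' . "\n";
            }
            return $out . "</results>\n";
    }
}
```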
It is overwhelming and uninteresting...all this crap about DTDs, etc...
Welcome to the world of XML. Compared to a great many programming languages, including HTML and CSS, it is not so bad.
In general, it's a lot easier to create XML than to consume it. However, there is a lot of support at the language level for it. Most languages have built-in XML parsers, and that helps a lot.
If you are serious about doing webservices, then I'd highly recommend getting familiar with XML.
The biggest problem with XML is that there is both too much information about it, and not enough. I am constantly running into this.
If you google "Learn XML", you will get a lot of hits. However, upon closer examination, you'll find that 90% of them are the same few articles, rebranded and reposted. In some cases, they are slightly rewritten, but the essentials remain the same.
The people who deal with XML all the time are standards geeks. They have a difficult time returning to their roots, and all their writing starts at the halfway point. This is what makes technical writers who can present information at its most fundamental level so valuable.
DTDs do come in handy, sometimes. Eventually you might need to learn them. But on a simple web service, and your first one? Skip them.
I don't really like to use a technology without some level of expertise. I guess I'm a real geek that way. But, as you say, httpwebwitch, for what I'm doing as a first-timer it makes sense to take the easy way out.
One thing I am concerned about, though, is errors. What if my server is unavailable and can't deliver, or sends back an error message where my content should be? Is that something the client checks for when parsing the XML I deliver?
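For what it's worth, one common way for the consuming side to guard against that is to check that the reply actually parses as XML before using it. A sketch, assuming a made-up `<results>`/`<result url="...">` reply format:

```php
<?php
// Defensive parse: if the body isn't valid XML (error page, empty reply,
// timeout stub), return null so the caller can fall back gracefully.

function extract_urls_or_null(string $body): ?array
{
    libxml_use_internal_errors(true); // collect parse errors instead of warnings
    $doc = simplexml_load_string($body);
    if ($doc === false) {
        return null;
    }
    $urls = [];
    foreach ($doc->result as $result) {
        $urls[] = (string) $result['url'];
    }
    return $urls;
}
```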