|How to Build A Database|
Whilst not copying and pasting too much
| 7:46 pm on Mar 8, 2008 (gmt 0)|
I'm building a shopping service and plan to link to many different online merchants across typical categories such as apparel, activewear, jewelry, and so forth. I started collecting the links in list form, and rapidly saw I would need a spreadsheet. Now I realize that I'm just going to have to put these HTML links into a web page or SQL database. Is there some way for me to skip the spreadsheet and enter the links straight into the website? The site will have something of a CMS, but does a CMS allow a person to work with the links in a database sort of way? I will need to manage and update the links and also tag them with various attributes, so I don't know whether CMSes do this.
Does anybody have any suggestions for the easiest way to pull this off? I'm obviously not a very technical person :-)
| 5:13 pm on Mar 10, 2008 (gmt 0)|
By CMS I'm assuming you mean content management system. The answer to your question depends on how you have the system set up, that is, how elaborately it is designed.
You could go through the backend and import your data into your tables fairly easily, depending on what kind of database you are using. For a better answer, I'll need some more information (e.g., what's your experience with databases, how does your CMS operate, is it a MySQL or SQL Server db, etc.).
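To show what "import your data into your tables" can look like, here is a minimal sketch in Python using an in-memory SQLite database as a stand-in for MySQL. The table name, columns, and sample rows are all assumptions for illustration, not anything from your actual CMS:

```python
import sqlite3

# Hypothetical schema: table and column names are assumptions for this sketch.
conn = sqlite3.connect(":memory:")  # swap for a real MySQL connection in practice
conn.execute("""
    CREATE TABLE links (
        id       INTEGER PRIMARY KEY,
        url      TEXT NOT NULL,
        merchant TEXT,
        category TEXT,
        tags     TEXT          -- comma-separated attributes, e.g. 'sale,shoes'
    )
""")

# Rows as they might come out of a spreadsheet export.
rows = [
    ("http://example.com/shoes", "ExampleStore", "apparel", "sale,shoes"),
    ("http://example.com/rings", "ExampleStore", "jewelry", "gold"),
]
conn.executemany(
    "INSERT INTO links (url, merchant, category, tags) VALUES (?, ?, ?, ?)", rows
)
conn.commit()

# Manage the links "database-style": pull everything in one category.
apparel = conn.execute(
    "SELECT url FROM links WHERE category = ?", ("apparel",)
).fetchall()
print(apparel)
```

Once the links live in a table like this, tagging, updating, and filtering become plain SQL rather than spreadsheet work.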
| 6:05 pm on Mar 10, 2008 (gmt 0)|
Thanks for the response. My site will be PHP and MySQL powered, but I frankly don't know what the CMS will look like. Perhaps I didn't think to get that specified by the developer. I think he'll be cobbling together some preexisting open source pieces for it. I suppose I'm just trying to understand how other people do this. I know there are sites out there with many links, and I imagine managing and updating all those links takes precious time.
I should mention that I'm talking about affiliate links for linking to e-retailers, which come from affiliate networks. These networks have web services where you can use something like .NET to work with their APIs to extract the links. Ultimately, I hope to have a seamless, end-to-end system where I suck the links in through the web services, manage them (somehow), and spit them right into the website. That's where my question is going, in terms of how others do this. Is there off-the-shelf software, for example?
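The "suck the links in through the web services" step usually boils down to translating each network's response format into your own link records. A toy sketch in Python; the JSON shape below is invented for illustration, since every affiliate network's web service has its own format:

```python
import json

# Invented payload standing in for one affiliate network's API response.
api_response = json.loads("""
{
  "products": [
    {"name": "Running Shoes",
     "link": "http://merchant.example/shoes?aff=123",
     "category": "activewear",
     "price": 59.99}
  ]
}
""")

def normalize(response):
    """Turn one network's payload into link records for our own database."""
    return [
        {"url": p["link"], "category": p["category"], "price": p["price"]}
        for p in response["products"]
    ]

records = normalize(api_response)
print(records[0]["url"])
```

With one `normalize`-style function per network, the rest of the pipeline (storing, tagging, publishing) never has to care which network a link came from.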
Thanks, guys!
| 5:12 am on Mar 11, 2008 (gmt 0)|
There are a number of ways to obtain links. One is by hand, but a web crawler/spider with some selection logic is an easier way to gather a large number. It works like a search engine, but only collects from sites that match certain keywords, domain names, etc.
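A toy version of that selection logic, using only Python's standard library: it parses a hard-coded page and keeps only links whose URL or anchor text matches a keyword list. The page and keywords are made up for the sketch; a real crawler would fetch pages over HTTP first:

```python
from html.parser import HTMLParser

# Hypothetical keyword filter for the sketch.
KEYWORDS = {"jewelry", "apparel"}

class SelectiveLinkParser(HTMLParser):
    """Collect only links whose href or anchor text matches a keyword."""

    def __init__(self):
        super().__init__()
        self.found = []
        self._href = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")

    def handle_data(self, data):
        if self._href is not None:
            text = data.lower()
            if any(k in self._href or k in text for k in KEYWORDS):
                self.found.append(self._href)
            self._href = None  # only inspect the first text node per link

page = '<a href="/jewelry/rings">Gold rings</a> <a href="/about">About us</a>'
parser = SelectiveLinkParser()
parser.feed(page)
print(parser.found)  # only the jewelry link survives the filter
```

The same idea scales up by swapping the hard-coded page for fetched pages and the keyword set for whatever categories or domains you care about.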
| 7:09 pm on Mar 12, 2008 (gmt 0)|
I'm only familiar with Amazon. Amazon has an API which you can query for product information. One of the fields is the URL for the product page.
Other large affiliate programs have similar APIs.
Of course, even that's not really necessary, as Amazon has a predictable scheme for constructing a URL. If you know the ASIN (essentially the ISBN, at least for books), you can construct an Amazon URL.
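That predictable scheme can be wrapped in a tiny helper. The `/dp/` path and the Associates `tag` parameter below are the commonly used forms, worth confirming against Amazon's current documentation before relying on them:

```python
def amazon_url(asin, associate_tag=None):
    """Build an Amazon product URL from an ASIN, optionally adding an
    Associates tag so the link is credited to your affiliate account."""
    url = "http://www.amazon.com/dp/%s" % asin
    if associate_tag:
        url += "?tag=%s" % associate_tag
    return url

# The ASIN and tag here are made-up examples.
print(amazon_url("0596005431", "mysite-20"))
```

This is why a full API round-trip isn't strictly needed when you already know the ASIN; the API matters more when you need prices, categories, or search.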
It's useful, though, to have the merchant's database (or to be able to search it), or at least the parts of it you are interested in. You probably want to select products in certain categories, within certain price ranges, and so on.