Welcome to WebmasterWorld
Forum Moderators: httpwebwitch
I use Dbtools pro to manage my databases.
The trouble I am having is that the XML file contains multiple nested elements, and the import wizard in DBTools won't import the sub-elements.
What is the easiest way to get the XML data into a MySQL database?
I will have a look at XQuery to see whether I can avoid the MySQL route altogether, but I'm worried that working with a 20MB XML file on a live website will simply be too slow.
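One possible route, sketched below: PHP's SimpleXML reaches nested sub-elements directly with `->`, so the wizard's limitation doesn't apply. The `<product>`/`<details>` element names and the `products` table are placeholders I've invented, not taken from the actual feed.

```php
<?php
// Minimal sketch: flatten nested XML records into rows ready for MySQL.
// Assumes a feed shaped like
//   <products><product><name>..</name><details><price>..</price></details></product>...</products>
function xml_to_rows(string $xmlString): array
{
    $rows = [];
    $xml = simplexml_load_string($xmlString);
    foreach ($xml->product as $product) {
        // Nested sub-elements are reached with -> just like top-level ones.
        $rows[] = [
            'name'  => (string) $product->name,
            'price' => (float)  $product->details->price,
        ];
    }
    return $rows;
}

// Inserting the flattened rows with PDO would look roughly like this
// (credentials and table are placeholders):
// $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
// $stmt = $pdo->prepare('INSERT INTO products (name, price) VALUES (?, ?)');
// foreach (xml_to_rows(file_get_contents('feed.xml')) as $row) {
//     $stmt->execute([$row['name'], $row['price']]);
// }
```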
My experience is that modern XSLT processors are pretty fast, but 20MB is a lot of data. You may want to consider "tweening" with PHP or something.
Which reminds me: PHP (like most server-side languages) has some pretty robust XML parsers. You may not need XSLT at all.
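On the 20MB worry: PHP's XMLReader streams the document rather than loading it all into memory the way SimpleXML does, which is one way to keep a large weekly import cheap. A minimal sketch, assuming the same invented `<product>`/`<name>` element names as above:

```php
<?php
// Sketch: stream a large XML file with XMLReader, handing each <product>
// subtree to SimpleXML one at a time for convenient access.
function stream_product_names(string $file): array
{
    $names = [];
    $reader = new XMLReader();
    $reader->open($file);
    while ($reader->read()) {
        if ($reader->nodeType === XMLReader::ELEMENT
            && $reader->localName === 'product') {
            // Only this one subtree is materialised in memory,
            // not the whole 20MB document.
            $node = new SimpleXMLElement($reader->readOuterXml());
            $names[] = (string) $node->name;
        }
    }
    $reader->close();
    return $names;
}
```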
My [mis]understanding of XQuery is that it is meant to REPLACE SQL. But maybe it is just a layer on top of SQL.
I will certainly be looking at XQuery for some of my smaller XML duties, though; it looks pretty simple.
I have had some other advice in the last hour that using PEAR to unserialize the XML file will help get it into a format I can import into a database, but I'm not very proficient with PHP, and the site is pretty unfriendly to noobs :)
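The PEAR advice presumably means the XML_Unserializer class, which turns the document into nested PHP arrays. A dependency-free sketch of the same idea uses the common SimpleXML + JSON round-trip idiom (with the caveat that attributes and repeated elements come out slightly differently than PEAR's output):

```php
<?php
// Sketch: convert an XML string into plain nested PHP arrays.
function xml_to_array(string $xmlString): array
{
    $xml = simplexml_load_string($xmlString);
    // Encoding to JSON and back converts the SimpleXML object tree
    // into ordinary nested arrays. Note: attributes land under
    // "@attributes", and element text values come back as strings.
    return json_decode(json_encode($xml), true);
}
```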
You may want to consider "tweening" with PHP or something.
I had thought of that too, but I need to update the XML data weekly, so I was hoping to set up a cron job to import the data into my database automatically once a week.
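For reference, the weekly schedule is a one-line crontab entry; the paths below are placeholders, not real ones from this setup:

```
# Run the import script every Monday at 03:00 (paths are placeholders)
0 3 * * 1 /usr/bin/php /path/to/import_feed.php >> /path/to/import.log 2>&1
```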
Looks like some kind of PHP script will be the way to go. Better get my learning cap on, I think :)