Msg#: 3361521 posted 9:25 pm on Jun 7, 2007 (gmt 0)
I am working with a large XML file (20 MB), and displaying it on a website via XSLT transformations takes the server a very long time to process, so I have decided to put the content of the feed into a MySQL database.
I use Dbtools pro to manage my databases.
The trouble I am having is that the XML file contains multiple nested elements, and the import wizard in dbtools won't import the sub-elements.
What is the easiest way to get the XML data into a MySQL database?
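[Editor's note: one way to handle nested elements that an import wizard rejects is to flatten each record yourself and generate plain INSERT statements. A minimal Python sketch of that idea, using the stdlib XML parser; the tag names (`feed`, `item`, `details`, etc.) are hypothetical stand-ins for whatever the real feed uses.]

```python
import xml.etree.ElementTree as ET

# Hypothetical feed structure -- adjust tag names to match the real file.
SAMPLE = """<feed>
  <item>
    <title>First</title>
    <details><price>9.99</price><stock>12</stock></details>
  </item>
  <item>
    <title>Second</title>
    <details><price>4.50</price><stock>3</stock></details>
  </item>
</feed>"""

def flatten(elem, prefix=""):
    """Collapse nested elements into a flat {column: value} dict,
    joining nesting levels with underscores (details/price -> details_price)."""
    row = {}
    for child in elem:
        key = f"{prefix}{child.tag}"
        if len(child):                      # has sub-elements: recurse
            row.update(flatten(child, key + "_"))
        else:
            row[key] = (child.text or "").strip()
    return row

def to_inserts(xml_text, table):
    """Emit one INSERT statement per top-level record."""
    root = ET.fromstring(xml_text)
    inserts = []
    for item in root:
        row = flatten(item)
        cols = ", ".join(row)
        vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row.values())
        inserts.append(f"INSERT INTO {table} ({cols}) VALUES ({vals});")
    return inserts

for stmt in to_inserts(SAMPLE, "items"):
    print(stmt)
```

The resulting SQL can be piped straight into the mysql client, so no wizard support for nested elements is needed.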
Msg#: 3361521 posted 10:59 pm on Jun 7, 2007 (gmt 0)
My understanding of XQuery is that it is used to query XML files, much the way that SQL queries databases. So yes, it replaces SQL, but only insofar as the XML file replaces the database (which isn't what I want to do).
I will certainly be looking at XQuery for some of my smaller XML duties though, it looks pretty simple.
I have had some other advice in the last hour that using PEAR to unserialize the XML file will help get it into a format I can import into a database, but I'm not very proficient with PHP, and the site is pretty unfriendly to noobs :)
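[Editor's note: the "unserialize" advice refers to turning the XML tree into native arrays/dicts so it can be walked and written to a database. PEAR's XML_Unserializer does this in PHP; the sketch below shows the same idea in Python with the stdlib parser, so the concept is visible without PHP. The sample tags are hypothetical.]

```python
import xml.etree.ElementTree as ET

def unserialize(elem):
    """Turn an element tree into plain dicts and strings, roughly the
    structure PEAR's XML_Unserializer produces in PHP."""
    children = list(elem)
    if not children:
        return (elem.text or "").strip()
    result = {}
    for child in children:
        value = unserialize(child)
        if child.tag in result:             # repeated tag -> collect into a list
            if not isinstance(result[child.tag], list):
                result[child.tag] = [result[child.tag]]
            result[child.tag].append(value)
        else:
            result[child.tag] = value
    return result

doc = ET.fromstring("<feed><item><title>A</title></item><item><title>B</title></item></feed>")
print(unserialize(doc))
```

Once the data is an ordinary nested dict/array, looping over it and issuing database inserts is straightforward in any language.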
Msg#: 3361521 posted 12:23 am on Jun 8, 2007 (gmt 0)
I've found a technique that uses an XSLT transformation to restructure the XML data into a new XML file. That might get the data into a streamlined enough format to import into a MySQL database. I will post my results when I try it tomorrow.