tec4 - 3:24 pm on Jul 18, 2012 (gmt 0)
Lots of data... each source has between 1 and 5 tables that I am trying to import into my local "master table" to hold all the data. The sources are product feeds coming from multiple vendors. The products themselves are all similar, but the table column names differ from feed to feed, and sometimes the content within a given data field differs as well. This is strictly a server-side utility pulling normalized data from each vendor. What I pull is never changed on my end by users - it just needs to be updated from the vendors' feeds through an update script.
Each table I am pulling from has over 100 columns. I am not pulling all of these, however; instead I am trying to work out which external column holds the same content as each of my local columns, and map it across that way.
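For what it's worth, once a vendor's column mapping is worked out, each import can boil down to a single mapped INSERT ... SELECT in MySQL. A minimal sketch, assuming a hypothetical vendor table vendor_a_products (columns prod_num, item_desc, unit_price) feeding a local master_products table (sku, description, price, vendor_id) - all names here are placeholders, not from the actual feeds:

-- Map the vendor's column names onto the master table's columns.
-- Table and column names are hypothetical placeholders.
INSERT INTO master_products (sku, description, price, vendor_id)
SELECT
    prod_num   AS sku,
    item_desc  AS description,
    unit_price AS price,
    'vendor_a' AS vendor_id
FROM vendor_a_products
ON DUPLICATE KEY UPDATE
    description = VALUES(description),
    price       = VALUES(price);

This assumes the master table has a unique key on something like (vendor_id, sku), so re-running the update script refreshes existing rows instead of duplicating them.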
The other solution I thought of yesterday:
I was thinking it might be easier to work with the data if I stored the entire external tables locally in MySQL (a simple script would keep each local copy an exact match of the external table at certain points in time, keeping their naming conventions and such) and then used a separate script to update from those copies into my Master Table. Do you think that would make the data easier to handle?
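If it helps to picture it, that staging approach might look roughly like this (same hypothetical names as above): keep an exact local copy per vendor, reload it from the feed on a schedule, then run the mapping step from the staging copy into the master table.

-- Step 1: refresh the local mirror of the vendor's table
-- (structure and column names kept identical to the vendor's feed).
TRUNCATE TABLE staging_vendor_a_products;
-- ...load the feed into staging_vendor_a_products here, e.g. with
-- LOAD DATA INFILE or whatever the feed format allows...

-- Step 2: map from the staging copy into the master table.
INSERT INTO master_products (sku, description, price, vendor_id)
SELECT prod_num, item_desc, unit_price, 'vendor_a'
FROM staging_vendor_a_products
ON DUPLICATE KEY UPDATE
    description = VALUES(description),
    price       = VALUES(price);

The appeal of doing it this way is that the per-vendor mapping logic lives in one plain SQL statement that can be re-run and debugged locally, without touching the feed itself.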