wrightee - 2:08 am on Nov 3, 2011 (gmt 0)
I need to log about 50K rows of data per day, extracted as a CSV from AdWords. Since most of it is repetitive keyword / campaign data, it's prime for normalisation.
I can find loads of posts about designing for normalisation, but little about the best way to process the data for insertion.
If I have, say:
tbl_keywords: id,keyword - table of unique keywords
tbl_campaigns: id,campaign - table of unique campaigns
.. and I need to process a new batch of 50K rows, what's the best way to create new keywords and campaigns in tbl_keywords and tbl_campaigns where appropriate, and insert the relevant keys into tbl_performance?
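To make the question concrete, here's a minimal sketch of one common approach, using Python's sqlite3 for illustration (the same idea works in MySQL with INSERT IGNORE or INSERT ... ON DUPLICATE KEY UPDATE): put a UNIQUE constraint on the keyword and campaign columns, do an insert-or-ignore for each value, then look up the ids when writing the performance row. The clicks column and the sample data are made up for the example; the post only names the three tables.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# UNIQUE constraints are what make insert-or-ignore deduplication work.
cur.executescript("""
    CREATE TABLE tbl_keywords  (id INTEGER PRIMARY KEY, keyword  TEXT UNIQUE);
    CREATE TABLE tbl_campaigns (id INTEGER PRIMARY KEY, campaign TEXT UNIQUE);
    CREATE TABLE tbl_performance (
        keyword_id  INTEGER REFERENCES tbl_keywords(id),
        campaign_id INTEGER REFERENCES tbl_campaigns(id),
        clicks      INTEGER  -- placeholder metric; real CSV has more columns
    );
""")

# Stand-in for rows parsed out of the daily CSV export.
rows = [("blue widgets", "Campaign A", 12),
        ("red widgets",  "Campaign A", 7),
        ("blue widgets", "Campaign B", 3)]

for keyword, campaign, clicks in rows:
    # Existing keywords/campaigns are skipped, new ones are created.
    cur.execute("INSERT OR IGNORE INTO tbl_keywords (keyword) VALUES (?)",
                (keyword,))
    cur.execute("INSERT OR IGNORE INTO tbl_campaigns (campaign) VALUES (?)",
                (campaign,))
    # Resolve the ids and write the fact row in one statement.
    cur.execute("""
        INSERT INTO tbl_performance (keyword_id, campaign_id, clicks)
        SELECT k.id, c.id, ?
        FROM tbl_keywords k, tbl_campaigns c
        WHERE k.keyword = ? AND c.campaign = ?
    """, (clicks, keyword, campaign))
conn.commit()
```

For 50K rows a day you'd batch this (executemany, or load the CSV into a staging table and do the inserts as set-based joins) rather than looping per row, but the lookup-or-insert shape is the same.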
Thanks in advance for your advice.