Forum Moderators: coopster
I'm importing a data file with several thousand rows. To verify that the data is good, I'm matching a few columns against lookup tables in MySQL, and that's an awful lot of queries in a short time. If the lookup tables were hash tables that'd be fine, but they're MyISAM and aren't going to change. It would be cool to run these lookup tables through a function that loads them into memory, then compare each column to the array in memory instead of hitting the db with all these queries.
So for example, all the lookup tables are pretty much like this and none of them are huge...max 30 records:
Table books
bookid¦bookname
===============
1¦hello world
2¦goodbye world
3¦wassup?
Table author
authorid¦author
===============
1¦Hesse
2¦Nietzsche
3¦Vonnegut
Then the data coming in would be for example:
junk¦junk¦junk¦hello world¦Vonnegut¦junk¦junk
junk¦junk¦junk¦wassup¦Nietzsche¦junk¦junk
junk¦junk¦junk¦goodbye world¦Hesse¦junk¦junk
And I just want to make sure the 4th and 5th fields have a match in the lookup tables...
Any idea how to do this?
Running a query per row would work, but if you're on shared hosting you might not want to hit the database that many times. You can use arrays for the lookup instead. For example:
$books['hello world'] = 1;
$authors['Vonnegut'] = 1;
Then to do a search:
if (array_key_exists("hello world", $books)) {
    print "Book exists.\n";
}
if (array_key_exists("Vonnegut", $authors)) {
    print "Author exists.\n";
}
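Putting the pieces together, here's a minimal sketch of the whole check. The arrays are hard-coded from the example tables so the snippet runs standalone; in practice you'd populate them with one SELECT per lookup table (the table and column names below come from the examples above, and the sample row is made up):

```php
<?php
// In practice you'd fill these with one query per lookup table, e.g.
//   SELECT bookname FROM books;   SELECT author FROM author;
// Hard-coded here so the sketch runs without a database.
$books   = ['hello world' => 1, 'goodbye world' => 1, 'wassup?' => 1];
$authors = ['Hesse' => 1, 'Nietzsche' => 1, 'Vonnegut' => 1];

// One incoming row, fields separated by the broken-bar delimiter.
$line   = 'junk¦junk¦junk¦hello world¦Vonnegut¦junk¦junk';
$fields = explode('¦', $line);

// Fields 4 and 5 (indexes 3 and 4) must exist in the lookup arrays.
$bookOk   = array_key_exists($fields[3], $books);
$authorOk = array_key_exists($fields[4], $authors);

print ($bookOk && $authorOk) ? "row ok\n" : "row rejected\n";
```

Populating the arrays from the real tables costs one query each; after that, every one of the several thousand import rows is checked entirely in memory.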
My understanding is that any data the server reads is kept in a cache ordered by last access, most recent first. When the cache is full, entries that haven't been accessed in a while are replaced by newer data.