


Searching Large Table Count with PHP & MySQL

     
12:00 am on Apr 29, 2011 (gmt 0)

Junior Member

5+ Year Member

joined:Mar 20, 2008
posts: 172
votes: 0


Hi, I have a database with about 200 tables that all need to be searched at once. Is there a best practice for searching all of these tables together, other than writing an enormous UNION-type query in MySQL?
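Just to illustrate what I mean, the only approach I can think of right now is generating that giant query from PHP, something like this (table and column names here are just placeholders):

<?php
// Hypothetical: assume every table has `id` and `title` columns to search.
$tables = array('articles', 'products', 'reviews'); // ...roughly 200 of these

$parts = array();
foreach ($tables as $t) {
    $parts[] = "SELECT '$t' AS source_table, id, title FROM `$t` WHERE title LIKE ?";
}
$sql = implode("\nUNION ALL\n", $parts);

$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare($sql);
$stmt->execute(array_fill(0, count($tables), '%widget%')); // same search term for every part
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
?>

It works, but with 200 tables it gets ugly fast, which is why I'm asking.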

PS: This is not a poor database design, just a large part of a very large whole, which cannot be separated.

Thanks in advance
1:50 pm on Apr 29, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member eelixduppy is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 12, 2005
posts:5966
votes: 0


>> This is not a poor database design

You sure? ;)

Seems like you are going to need to set up some sort of aggregate, read-only view/table that combines all of these tables into a data warehouse. I'm not sure MySQL would be the best DBMS to use for this much data, though, or whether it has tools that would make this simpler.

But once you have the DW set up, you would just query against that.
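Rough sketch of the idea (table and column names are only examples; yours will differ):

<?php
// Flatten the searchable columns of every table into one read-only
// "warehouse" table, rebuilt periodically (e.g. from cron), then run
// all searches against that single table.
$tables = array('articles', 'products', 'reviews'); // ...your ~200 tables
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$pdo->exec("CREATE TABLE IF NOT EXISTS search_warehouse (
    source_table VARCHAR(64) NOT NULL,
    source_id    INT NOT NULL,
    content      TEXT,
    KEY (source_table, source_id)
)");

$pdo->exec("TRUNCATE search_warehouse");
foreach ($tables as $t) {
    $pdo->exec("INSERT INTO search_warehouse (source_table, source_id, content)
                SELECT '$t', id, CONCAT_WS(' ', title, body) FROM `$t`");
}
?>

Whether MySQL copes with that comfortably depends entirely on how much data those 200 tables actually hold.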
2:14 pm on Apr 29, 2011 (gmt 0)

Moderator from GB 

WebmasterWorld Administrator brotherhood_of_lan is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 30, 2002
posts:4842
votes: 1


If the tables are MyISAM the query should be reasonably fast anyway, as there is a row count in the metadata for each table. Otherwise... it'll probably be slow done any way other than what eelixduppy suggests.
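If a straight row count is all you're after, you can pull what MySQL already stores from information_schema in one query (exact for MyISAM, only an estimate for InnoDB), e.g.:

<?php
// Sum the per-table row counts MySQL keeps in its metadata.
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->query(
    "SELECT TABLE_NAME, TABLE_ROWS
       FROM information_schema.TABLES
      WHERE TABLE_SCHEMA = 'mydb'"   // your database name here
);
$total = 0;
foreach ($stmt as $row) {
    $total += (int) $row['TABLE_ROWS'];
}
echo "Total rows across all tables: $total\n";
?>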
9:10 pm on Apr 29, 2011 (gmt 0)

Preferred Member

5+ Year Member

joined:Nov 20, 2007
posts:585
votes: 0


This sounds like something I would create a cache table for. Dump all the bits from the other tables that you want to search into it, then run your queries against the cache table.
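Once the cache table is populated, a search becomes a single query against it, e.g. (assuming a table named search_cache with a FULLTEXT index on content; the names are just placeholders):

<?php
// Query only the cache table; source_table/source_id tell you where
// each hit originally came from.
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare(
    "SELECT source_table, source_id
       FROM search_cache
      WHERE MATCH(content) AGAINST (? IN BOOLEAN MODE)"
);
$stmt->execute(array('widget'));
$hits = $stmt->fetchAll(PDO::FETCH_ASSOC);
?>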
1:54 pm on Apr 30, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member eelixduppy is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 12, 2005
posts:5966
votes: 0


Perhaps the MERGE storage engine could be useful? Although all of your tables would have to be identical in structure.

[dev.mysql.com...]
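Something along these lines, if (and only if) all the underlying tables are MyISAM and identical in structure (names here are placeholders):

<?php
// A MERGE table presents many identical MyISAM tables as one table.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$pdo->exec("CREATE TABLE all_items (
    id    INT NOT NULL,
    title VARCHAR(255)
) ENGINE=MERGE UNION=(items_1, items_2, items_3) INSERT_METHOD=NO");

// From then on you search it like any single table:
$stmt = $pdo->prepare("SELECT id, title FROM all_items WHERE title LIKE ?");
$stmt->execute(array('%widget%'));
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
?>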
3:16 pm on May 2, 2011 (gmt 0)

Administrator

WebmasterWorld Administrator coopster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:July 31, 2003
posts:12541
votes: 1


Have a look at Sphinx search. You can index your tables in the background with a cron job, and from then on you only have to search the index. Lightning fast.
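For example, something like this (the index name and source definitions are whatever you set up in sphinx.conf):

<?php
// Crontab entry to rebuild the Sphinx indexes in the background, e.g.:
//   */15 * * * * /usr/local/bin/indexer --all --rotate
// Then search from PHP with the sphinxapi.php client that ships with Sphinx:
require_once 'sphinxapi.php';

$cl = new SphinxClient();
$cl->SetServer('localhost', 9312);            // searchd host and port
$cl->SetMatchMode(SPH_MATCH_EXTENDED2);
$result = $cl->Query('widget', 'my_index');   // 'my_index' as defined in sphinx.conf

if ($result !== false) {
    foreach ($result['matches'] as $docId => $match) {
        // $docId is the document id you indexed; fetch the full row from MySQL by it
    }
}
?>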
3:26 pm on May 2, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member eelixduppy is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 12, 2005
posts:5966
votes: 0


>> index

Depends on the type of queries. If a query returns the majority of the data, then using an index may actually slow performance.
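You can see what the optimizer decides with EXPLAIN, e.g. (table and column names here are hypothetical):

<?php
// Hypothetical table `search_cache` with an index on `created_at`.
// If the condition matches most of the rows, the plan will usually show
// type ALL (full table scan) because the optimizer judges the index
// not worth using; that's the situation described above.
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$plan = $pdo->query(
    "EXPLAIN SELECT * FROM search_cache WHERE created_at > '2001-01-01'"
)->fetchAll(PDO::FETCH_ASSOC);
print_r($plan);
?>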
5:17 pm on May 2, 2011 (gmt 0)

Administrator

WebmasterWorld Administrator coopster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:July 31, 2003
posts:12541
votes: 1


Data is always returned with a corresponding query when using Sphinx, but I was referring to the context of the OP:

about 200 tables in it, which need to be searched at once.
[...]
search all these tables at once


Once the index is found (assuming this is relational data), an exact query will be substantially faster.

Hopefully that makes sense now.