
php optimize query and JOINS


hozyali

7:14 pm on Aug 6, 2010 (gmt 0)

10+ Year Member



Hello,

I'd like to know the best way to optimize the query in my case.

I have a page which lists events. The events table has several INT columns holding IDs that reference other tables.

Here is the example:

The events listing loop selects from TblEvent WHERE categoryID = 25.
The loop runs and gives me the list of events.
Each row in this recordset has an ArtistID, which I look up in the TblArtist table to get the artist's full info.

The events listing also has a VenueID, which goes to the TblVenues table to get the venue details.

Likewise, TblVenues provides the venue name plus IDs for city, state and country, which are looked up in their respective tables.

So as you can see, many tables are involved in getting all the data for the event listing.

Currently I just run simple queries inside the loop, i.e. I query each table separately on every iteration to get the details. But I'd like to know whether an optimized query can be written instead, using LEFT JOINs or something else.

Kindly advise.

Thanks a lot

enigma1

7:02 pm on Aug 8, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It depends on how the script uses the queries, and what options the user has.

So if you have categories with various events in each, and you want to list the events of a specific category, run the primary query against the events table only, filtered by the category id. Then, for each row, run secondary queries to fetch the other details that the event references in other tables. This can be very efficient even with a huge number of records, because with pagination on the listing only a limited number of rows is pulled by the primary query each time, just enough to populate the page the user sees.

Joining multiple tables with a huge number of rows in the primary query will introduce latency and may bring the server down, even if you use pagination. Keep the queries simple, unless the tables contain very few records.

(assuming the default MySQL)
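The approach above could look roughly like this in PHP. This is only a sketch using mysqli prepared statements; the table and column names are taken from the thread, the connection details and the extra columns (start_datetime, categoryID, etc.) are assumptions, and it needs a live MySQL server to run:

```php
<?php
// Sketch of: paginated primary query on TblEvent, then secondary
// lookups per row. Adjust credentials and schema to your setup.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$per_page = 25;
$page     = isset($_GET['page']) ? max(0, (int)$_GET['page']) : 0;
$offset   = $page * $per_page;

// Primary query: events of one category only, one page of rows.
$stmt = $db->prepare(
    'SELECT EventID, name, ArtistID, VenueID
       FROM TblEvent
      WHERE categoryID = ?
      ORDER BY start_datetime
      LIMIT ?, ?'
);
$category = 25;
$stmt->bind_param('iii', $category, $offset, $per_page);
$stmt->execute();
$events = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);

// Secondary queries: one lookup per row for the referenced details.
$artist = $db->prepare('SELECT name FROM TblArtist WHERE ArtistID = ?');
$venue  = $db->prepare('SELECT name, CityID FROM TblVenues WHERE VenueID = ?');
foreach ($events as $event) {
    $artist->bind_param('i', $event['ArtistID']);
    $artist->execute();
    $event['artist'] = $artist->get_result()->fetch_assoc();

    $venue->bind_param('i', $event['VenueID']);
    $venue->execute();
    $event['venue'] = $venue->get_result()->fetch_assoc();
    // ... render the event row here ...
}
```

Preparing the two secondary statements once, outside the loop, keeps the per-row cost down to a bind and execute.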

rocknbil

1:53 am on Aug 9, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Left joins return rows whether there are matching records or not (NULL when there isn't one), which is often useful. But in your case it sounds like the events will always have matching records in the joined tables (an event will always have a city, etc.), so you can do this.

Try **not** to use select *; be specific and select only what you need. Let's say it's not a specific id but a printout of events within a specific date range . . .

$start_page = isset($_GET['start']) ? (int)$_GET['start'] : 0; // EXAMPLE ONLY, cleanse that input . .
$per_page = 25; // for example

select TblEvent.EventID, TblEvent.name as event_name,
TblEvent.start_datetime, TblEvent.end_datetime,
TblArtist.name as artist_name, TblArtist.whatever,
TblVenues.address, TblVenues.city, TblVenues.state,
TblVenues.country, TblVenues.description
from TblEvent, TblArtist, TblVenues
where TblEvent.ArtistID = TblArtist.ArtistID
and TblEvent.VenueID = TblVenues.VenueID
and TblEvent.start_datetime >= '2010-08-08 00:00:00'
and TblEvent.end_datetime <= '2010-08-31 23:59:59'
order by TblEvent.start_datetime asc, event_name asc, artist_name asc
limit $start_page, $per_page;

May contain errors, typed on the fly, but that's the idea. (The two name columns are aliased so they don't clobber each other in an associative fetch.) As long as your joined tables have a unique join id matching the main query, this will (almost) always work. The limit would of course be dynamic based on pagination, and this assumes the start and end columns are DATETIME types. For specific entries it's the same thing; just change the date parts of the where clause to TblEvent.EventID = $event_id.

More efficient? Probably; it's one select. Heavier on your server? At large scale, maybe, maybe not; at small scale (under 1 million records, not a heavily visited server) it should be fine. Benchmark both methods.
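For the city/state/country lookups mentioned in the original post, the same idea extends with explicit JOIN syntax (equivalent to the comma join above, but easier to read, and LEFT JOIN can be swapped in wherever a match isn't guaranteed). The TblCity/TblState/TblCountry table and column names here are guesses based on the thread, so substitute your actual schema:

```sql
select TblEvent.EventID, TblEvent.name as event_name,
       TblEvent.start_datetime, TblEvent.end_datetime,
       TblArtist.name as artist_name,
       TblVenues.name as venue_name, TblVenues.address,
       TblCity.name as city, TblState.name as state, TblCountry.name as country
  from TblEvent
  join TblArtist  on TblEvent.ArtistID   = TblArtist.ArtistID
  join TblVenues  on TblEvent.VenueID    = TblVenues.VenueID
  join TblCity    on TblVenues.CityID    = TblCity.CityID
  join TblState   on TblVenues.StateID   = TblState.StateID
  join TblCountry on TblVenues.CountryID = TblCountry.CountryID
 where TblEvent.categoryID = 25
 order by TblEvent.start_datetime
 limit 0, 25;
```

With indexes on each of the joined id columns this is a handful of indexed lookups per row, which is what makes the single-select approach competitive with the query-in-a-loop version.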