|Webhost says my MySQL is bringing it to its knees.|
mysql select join
I have a home-made e-commerce system and a backend system I've been using for years. Recently my host shut me down, saying the following query was putting a drag on their shared servers.
customer_details: name, address, city, ordernumber, etc.
order_details: item#, description, price, ordernumber, etc.
inventory: item, description, onhand, vendorid
And here's the command they say is wreaking havoc:
SELECT c.name, c.address, c.city, o.item, o.description, v.vendorname, i.onhand
FROM customer_details AS c, order_details AS o, vendor AS v, inventory AS i
WHERE c.ordernumber = o.ordernumber
  AND v.vendorid = i.vendorid
  AND i.item = o.item;
Note that there is a little more info being pulled out, but this is the basics.
1) Is it not written properly?
2) How can I pull out all this information in one shot in a more optimized way?
Have you indexed the tables? Even on a SQL server, if the columns you join on aren't indexed, every query has to scan the full tables, which consumes a lot of processing time.
Look up how to create indexes in the MySQL documentation if needed.
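As a sketch of what that could look like, using the column names from the posted query (the index names here are made up, and some of these indexes may already exist on your tables):

```sql
-- Hypothetical index names; columns taken from the posted SELECT.
-- Indexing the columns used in the join conditions lets MySQL look up
-- matching rows instead of scanning whole tables on every join.
CREATE INDEX idx_customer_ordernumber ON customer_details (ordernumber);
CREATE INDEX idx_order_ordernumber    ON order_details (ordernumber);
CREATE INDEX idx_order_item           ON order_details (item);
CREATE INDEX idx_inventory_item       ON inventory (item);
CREATE INDEX idx_inventory_vendorid   ON inventory (vendorid);
```

You can check which indexes already exist with `SHOW INDEX FROM order_details;` before adding any.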
Indexing is a start, but we need more info; diagnoses like this from tech support are sometimes misguided.
How many records? Shared or VPS? (Presuming one of those, as a host will seldom jump you for eating too much of your own resources.)
You have to consider that this may be more likely the cause:
|home-made e-commerce system, |
Seriously . . . I've seen a lot of horrible approaches. Most of the time it's related to how often and hard you're hammering the database, not the hammer itself.
A three table join shouldn't cause serious hiccups.
It's a four-table join that joins every single row in all four tables. When the ecommerce system was first built and tested, that was maybe joining 3 customers x 3 orders x 3 vendors x 3 inventory rows? No big deal at that size. ;)
But after running for this long, you have significantly more rows being joined together. Since you are joining everything to everything, the cost grows with the product of the table sizes, so the problem compounds over time.
Find a way to limit the query, either by date or by a flag (completed) and you'll see it consume far less resources (and you'll get faster results). Make sure you put those limits first in your WHERE clause.
Well thanks to everyone.
1) It's a shared server (not VPS).
2) I know some of the fields I'm linking are indexed, but not all, so I should probably fix that issue.
3) The customer table has about 200 records, the order table about 800, the inventory 1500, and the vendor about 15.
4) There is a limit on that select statement that I left out above for simplicity's sake. This is a backend order-filling routine. We run it once a day: it lists the open orders, then we select each order to fill one by one, close it out, and the SELECT runs again, about once every 5 minutes. It always comes up virtually instantly.
So here's the command above with the limit (o.balance > 0):
SELECT c.name, c.address, c.city, o.item, o.description, v.vendorname, i.onhand
FROM customer_details AS c, order_details AS o, vendor AS v, inventory AS i
WHERE c.ordernumber = o.ordernumber
  AND v.vendorid = i.vendorid
  AND i.item = o.item
  AND o.balance > 0;
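If you want to see what the server is actually doing with that query, MySQL's EXPLAIN will show the execution plan; a sketch, using the same query:

```sql
-- EXPLAIN shows the plan MySQL chooses: which index (if any) is used
-- for each table and roughly how many rows are examined per table.
EXPLAIN
SELECT c.name, c.address, c.city, o.item, o.description, v.vendorname, i.onhand
FROM customer_details AS c, order_details AS o, vendor AS v, inventory AS i
WHERE c.ordernumber = o.ordernumber
  AND v.vendorid = i.vendorid
  AND i.item = o.item
  AND o.balance > 0;
-- In the output, a NULL in the "key" column means no index was used for
-- that table, and a large "rows" value means a table scan.
```

Running this before and after adding indexes is a quick way to confirm whether they helped.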
Let us know what the performance looks like once you've indexed the tables, and/or if you try breaking the select statement up into individual selects.
Sounds like you are somehow getting a Cartesian product. You might try using the JOIN syntax:
FROM customer_details AS c
JOIN order_details AS o ON c.ordernumber = o.ordernumber
JOIN inventory AS i ON i.item = o.item
JOIN vendor AS v ON v.vendorid = i.vendorid
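Putting that together with the o.balance filter from the earlier post, the full query might look like this (a sketch, using the same tables and columns as the original; the result should match the comma-join version, but the join conditions are harder to accidentally drop):

```sql
-- Explicit JOIN ... ON syntax: each join condition is attached to the
-- join itself, so a missing condition (and the resulting Cartesian
-- product) is much easier to spot.
SELECT c.name, c.address, c.city,
       o.item, o.description, v.vendorname, i.onhand
FROM customer_details AS c
JOIN order_details AS o ON c.ordernumber = o.ordernumber
JOIN inventory AS i ON i.item = o.item
JOIN vendor AS v ON v.vendorid = i.vendorid
WHERE o.balance > 0;
```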
Are you sure that the relationship between customer_details and order_details is on ordernumber? Normally you would see something like customernumber be the link.
|Are you sure that the relationship between customer_details and order_details is on ordernumber? Normally you would see something like customernumber be the link. |
Yes, on our site we decided early on that we didn't want to force a user to "sign up" to place an order, so it was designed for buyers to enter their name and shipping info for every order. The customer info is tied to the order. We did a survey years ago for repeat customers, and the ability to have their info saved was lowest on the list. Maybe someday that will be changed.
By the way, our hosting company announced yesterday that the server I'm on was going down for memory replacement due to some bad memory that "has been causing problems". Since our site has been running like this as-is for 4-5 years without any problems, I'm thinking they wrongly blamed me for my code "bringing the server to its knees" when in fact there were problems with the server all along. Although obviously I still want to make sure the code is optimized. And I thank everyone for their help.
Good luck with your code modifications. Should you need any further help, chat us up again soon =)
Those are small tables. Even a horrid job of querying shouldn't do much harm to the CPU.
It probably was their memory issue...