| 9:36 am on Feb 8, 2012 (gmt 0)|
A site is behaving quite strangely. Sometimes it's fast, sometimes it's slow as hell.
It's on a shared server. The homepage has music that plays through jPlayer; on profile page load it counts votes for the voting system, counts plays of specific songs, sums the plays for a total-plays figure, and loads a different player than jPlayer.
I've been looking at a few things, and nothing is coming up that explains why it loads slowly one time and fast the next.
Does anyone have any useful suggestions of what to be looking at for speed concerns? Is it because it's a shared server? Is it the music players streaming songs?
| 10:27 am on Feb 8, 2012 (gmt 0)|
|Is it because it's a shared server? |
This is quite possible. Some hosts put thousands of sites on one server in a shared environment, and everyone's competing for the same resources. Any one site could potentially bring down the entire server.
| 2:23 pm on Feb 8, 2012 (gmt 0)|
Are you renting a VPS (virtual private server) or are you on a meager packaged single-domain hosting plan? If the former then you ought to be able to SSH into the server (Linux command-line terminal) and snoop around. In particular, you can do the "top" command to see CPU and memory resources at any given moment.
If the latter then sometimes all you get is FTP access to a "public_html" folder, and some kind of control panel to manage your database. And you're at the mercy of shared resources.
Keep in mind, it could also be your own connection to the internet that's dodgy. Mine at home is highly variable: sometimes I get throttled, and my speed depends a lot on whether someone is gobbling bandwidth watching cartoons on Netflix while I'm working.
On those shared servers, you're usually not even getting your own MySQL instance. Instead you get one database (with all your tables in it) on the same IP:port as thousands of other users, each with their own database. That means a long unoptimized query (or the notorious mysqldump) on someone else's site will consume CPU & memory, your queries get queued and delayed, and it starts a domino effect that ultimately means your page takes a long time to render.
I switched from microhosting packages to renting my first VPS in the early 00's and it was the best decision ever. Take a look at the prices for Cloud Servers these days; they're as cheap as a managed single-domain shared plan was 5 years ago.
Streaming audio will hog bandwidth, CPU & memory, especially if the files are large. Definitely consider moving that content to a CDN instead of storing it on your web server.
| 4:25 pm on Feb 8, 2012 (gmt 0)|
I've found that in shared environments it's usually the connection to the (almost always) remote MySQL server. Try loading some static pages and see if it's still slow. Either way it's going to be a limitation of shared hosting, so the point is moot.
I will say though, if it's a Windows shared server you will see improvements moving to Linux shared. Not enough, but some.
| 6:31 am on Feb 13, 2012 (gmt 0)|
Thanks for the replies.
I've found what it is. It's a ranking system based on votes for an item. Basically, every time the page loads, it has to scan an 8 or 9 MB table, count the votes by ID, then order them to get a rank, and that's causing the lag.
So I have a couple of ideas.
1) Create a stored procedure for it
2) Scrap the current method and create a table with one row per item, where every time a vote is received for an item a column, say item_votes, is incremented by 1; then just read that table ordered by item_votes, with rank = $key + 1
3) Create a table with a cron job that runs once a day in the wee hours, store the results and spit those out for the day.
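For reference, option 2 is basically a denormalized counter table. A minimal sketch (using SQLite in place of MySQL just for illustration; the table and column names here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE item_rankings (
        item_id    INTEGER PRIMARY KEY,
        item_votes INTEGER NOT NULL DEFAULT 0
    );
    INSERT INTO item_rankings (item_id) VALUES (1), (2), (3);
""")

def record_vote(item_id):
    # O(1) write per vote, instead of re-counting the whole votes table
    conn.execute(
        "UPDATE item_rankings SET item_votes = item_votes + 1 WHERE item_id = ?",
        (item_id,),
    )

def ranked_items():
    # Cheap read on page load: the counts are already aggregated, just sort
    return conn.execute(
        "SELECT item_id, item_votes FROM item_rankings ORDER BY item_votes DESC"
    ).fetchall()

record_vote(2)
record_vote(2)
record_vote(3)
print(ranked_items())  # item 2 ranks first with 2 votes
```

The trade-off is that you're maintaining a derived value on every write, but reads (which happen far more often) become trivially cheap.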
Any comments/suggestions would be great!
| 8:10 am on Feb 13, 2012 (gmt 0)|
|Sometimes it's fast, sometimes it's slow as hell. |
|...everytime the page loads, it has to go through a table of 8 or 9MB,... |
Doesn't this imply that it is always slow? Or maybe only when the site is getting more traffic...?
| 8:29 pm on Feb 13, 2012 (gmt 0)|
and if that doesn't fix it...
how often does the ranking change? Does the result have to be immediate & fresh?
I had a similar problem last year, involving a query that took about 30 seconds to complete. It was horrible, a big knot of JOINs and full table scans and WHERE clauses with functions in them. Indexes were useless. The tables were enormous. It seemed hopeless.
But thankfully, the data didn't need to be fresh every second. It did change, but not second-to-second. Some latency was tolerable.
First, I optimized the query as much as I could... there wasn't a lot I could do, but a couple of indexes and changes to the query trimmed the time from 30 to under 15 seconds. That was an educational exercise just in itself. Still not good enough, but, better.
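One common example of why "indexes were useless": wrapping an indexed column in a function in the WHERE clause prevents the index from being used for the lookup, while an equivalent range predicate allows it. A small demonstration (SQLite here, but the same principle applies to MySQL; schema is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plays (id INTEGER PRIMARY KEY, played_at TEXT)")
conn.execute("CREATE INDEX idx_plays_played_at ON plays (played_at)")

# A function around the indexed column forces a scan of every row...
plan_scan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM plays "
    "WHERE strftime('%Y', played_at) = '2011'"
).fetchall()

# ...while the equivalent range predicate can seek into the index.
plan_search = conn.execute(
    "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM plays "
    "WHERE played_at >= '2011-01-01' AND played_at < '2012-01-01'"
).fetchall()

print(plan_scan)    # plan says SCAN
print(plan_search)  # plan says SEARCH ... USING INDEX
```

Rewrites like this (plus adding the right indexes) are often where that 30-to-15-second kind of improvement comes from.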
Then I wrote a PHP script that would run that heavy SQL query and cache the result into a separate table. Just to be extra keen, the result was also knitted into HTML output, and that was put into memcache. Then I put that "data cache refresh" script on a 5-minute cron.
On page load, the content was grabbed from memcache first. If for some reason it wasn't there, it would grab the pre-processed data from the temporary table. Either way, any request would re-populate memcache if the cache was missing. If for some reason the cron wasn't running, then the temp data wouldn't change, the timestamp in the data would get old, and if it was too old... an email would be fired off to a panic address (mine), indicating that I needed to intervene. That never actually happened.
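The fallback logic described above could be sketched like this (a minimal sketch, not the original PHP: memcached is stood in for by a plain dict, SQLite stands in for MySQL, and names like ranking_cache and alert_admin are invented for illustration):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ranking_cache (html TEXT, refreshed_at REAL)")
memcache = {}   # stand-in for a real memcached client
MAX_AGE = 600   # seconds before the cached data counts as stale

def alert_admin():
    pass  # fire off the panic email here

def refresh_cache():
    # The cron job: run the heavy query once, store the rendered result.
    html = "<ol><li>...</li></ol>"  # imagine the 15-second query + templating here
    conn.execute("DELETE FROM ranking_cache")
    conn.execute("INSERT INTO ranking_cache VALUES (?, ?)", (html, time.time()))
    memcache["ranking"] = html

def get_ranking():
    # Page load: memcache first, then the pre-processed table as a fallback.
    if "ranking" in memcache:
        return memcache["ranking"]
    row = conn.execute("SELECT html, refreshed_at FROM ranking_cache").fetchone()
    if row is None:
        return None
    html, refreshed_at = row
    if time.time() - refreshed_at > MAX_AGE:
        alert_admin()  # cron has apparently stopped running
    memcache["ranking"] = html  # re-populate the cache on a miss
    return html

refresh_cache()
memcache.clear()      # simulate a memcache flush
print(get_ranking())  # still served, from the fallback table
```

The key property is that the slow query never runs on a page load; the worst case a visitor ever sees is one indexed read from a tiny cache table.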
and it was ULTRAfast, like, millisecond-fast.
The technique was so intoxicating I started doing it with lots of other things.
Be careful running ANY query that takes a long time to execute. Does it INSERT or UPDATE, or is it only a SELECT? Queries that write can lock up entire MyISAM tables, and queries that COUNT() are brutal on InnoDB tables.
Optimizing SQL queries is a sport. People spend entire careers mastering it.
BTW, a 9MB table isn't that big... the ones I was working with were > 800MB with tens of millions of rows
| 8:32 pm on Feb 13, 2012 (gmt 0)|
|3) Create a table with a cron job that runs once a day in the wee hours, store the results and spit those out for the day. |
If you only need to refresh the data once a day, then this is definitely your best option.
I'd do that even if site performance wasn't noticeably bad.