Forum Moderators: phranque


Website performance issues

         

glimbeek

1:06 pm on Mar 2, 2011 (gmt 0)

10+ Year Member



Hi,

I'm running a Joomla! website and I'm experiencing some performance issues.
I ran into this problem a while ago, and after extensive testing the host figured out it was an issue with the hard drive, combined with a CPU that couldn't cope and too little memory. So they gave me a new server with a new hard drive, twice the CPU power and twice the memory. It's a VDS in a cluster setup, so it should be able to "run" my website.

My website performed OK then; not great, but it's a "big" site, sort of... 6 languages/different front pages, around 3500 pages. So I figured it was as good as it was going to get back then.

Fast forward a year and my website is slow again. And by slow I mean: if I use host-tracker.com to check, I get an average load time of around 4 seconds.

A simple test that I did last year, and with which I proved to my host that it was the server and not the website, was to install a "clean" Joomla! (same version, with example data) and test it the same way I test my main page. Back then the clean Joomla! installation performed just as badly as my main site. This time around, however, the clean Joomla! installation has an average load time of 1.35 seconds according to host-tracker.com. So that leaves me with figuring out how to solve this for my main site.

What I did so far:
I turned on gzip/mod_deflate, so I compress .html, .php, .css and .js.
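For anyone following along, the sort of rule I mean looks roughly like this in .htaccess (a sketch; assumes Apache 2.x with mod_deflate loaded, and the MIME-type list is just the common text types):

```apache
# Compress text-ish responses on the fly (sketch; needs mod_deflate).
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript application/x-javascript
</IfModule>
```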

What I could do is reduce the number of requests; currently the slowest page (the one I tested the most) has 40 requests, most of which are images. Reducing the number of requests should make a difference, even though the page itself is "only" 223 KB.

I use something called Clicktale, which slows down the page a little. I also use Google Analytics, which of course slows things down a little as well. And I load in 2 images from an external source using javascript, which of course also slows things down a little. I'm perfectly aware that I won't get below 1000 ms for a request, but getting below 2000 ms, or around 1500 ms, should be achievable.

I tested with:
host-tracker.com
webpagetest.org
Yslow (for Firebug)
Pagespeed (for Firebug)
and with Firebug itself.

Here come the questions:
The thing that strikes me as most curious, if I look at the Firebug -> Net results, is that there's on average a 1000 ms "wait" before anything happens. So the first GET request takes on average 1000 ms. On the clean Joomla! installation this is "only" 500 ms. That still seems a bit much to me, but I could be wrong. Is this normal?

My host told me NOT to use absolute URLs for images, because that would lead to more requests. This seemed odd to me, but I tested it anyway and, as far as I could tell, it made no difference. Should it make a difference if I use relative URLs for images instead of absolute URLs? Should this result in fewer requests, AKA faster loading?

To stick with the images... is there a way to set caching for images? I read about setting a "cache" time for all images, but can this be done for only certain images? If so, how?
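(A sketch of how this can be done in .htaccess, assuming Apache with mod_expires and mod_headers enabled; the file names in the FilesMatch are hypothetical:)

```apache
# Cache all images for a week (needs mod_expires).
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png "access plus 1 week"
    ExpiresByType image/jpeg "access plus 1 week"
    ExpiresByType image/gif "access plus 1 week"
</IfModule>
# ...or give only certain images a longer lifetime (needs mod_headers).
<FilesMatch "^(logo|sprite)\.(png|jpg)$">
    Header set Cache-Control "max-age=2592000, public"
</FilesMatch>
```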

Is there a way to cache a whole page? AKA make a shadow copy and serve that to the visitor, instead of forcing the server to "build" the entire page every time someone visits my website. I could then renew/update the shadow copy every 5 minutes to reduce the load on the server.
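(Apache can do roughly this with mod_cache; the sketch below uses the Apache 2.2 directive names and a hypothetical cache directory, with a 300-second default expiry to match the 5-minute refresh idea:)

```apache
# Whole-page shadow copies on disk (sketch; needs mod_cache + mod_disk_cache).
<IfModule mod_disk_cache.c>
    CacheEnable disk /
    CacheRoot /var/cache/apache2/disk_cache
    # Serve a stored copy for up to 5 minutes before rebuilding.
    CacheDefaultExpire 300
    CacheIgnoreNoLastMod On
</IfModule>
```

Note that mod_cache respects the caching headers your pages send, so dynamic pages that emit no-cache headers won't be cached until those headers are changed.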

Because it's a Joomla! website with a few extras, there are 74 tables. The biggest table has 35,000+ rows. That isn't that much, is it? I emptied 2 of the tables with the most rows (2 tables which I could empty) and that made no difference to the page speed.

Next to that, there are a few tables that need to be repaired according to my SQLyog tool. Should this make much difference? I'm not too keen on repairing them, as I do not know what will happen exactly.

I could put the images in a sprite, but how does this affect SEO for the images?
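(For context, the sprite technique boils down to something like this; the file and class names are hypothetical:)

```css
/* Two 16x16 icons packed into one image: one HTTP request instead of two. */
.icon-home, .icon-mail {
    background-image: url(/images/sprite.png);
    width: 16px;
    height: 16px;
}
.icon-home { background-position: 0 0; }
.icon-mail { background-position: -16px 0; } /* second icon sits 16px to the right */
```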

I'm no Unix/Apache wiz, so I was wondering: are there any logs/backup/cache/tmp files or folders that could slow down my server?

Last but certainly not least... is there an easy way to properly check server load or bottlenecks? Last time my host checked with a tool called Munin, but that didn't tell them much. They ended up asking someone from cPanel to take a look at their server setup, and he was the one who figured out that the server, as it was set up, simply couldn't cope.

Any hint/tip or help is greatly appreciated.

SEOtop10

8:07 am on Mar 26, 2011 (gmt 0)

10+ Year Member



Looking at the point of "first load takes a lot of time", I am wondering if the DNS system is resolving your domain quickly enough. What nameserver are you using?

phranque

11:18 am on Mar 26, 2011 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



i'm with SEOtop10 on this.
IanKelley suggested some ping tests.
first thing i would try is comparing a ping of the web hostname vs the web server's IP address.
then check your zone file config and make sure the times are sufficient to allow dns caching.
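a sketch of the kind of record to look at (hypothetical name and IP; the leading TTL, in seconds, is what tells resolvers how long they may cache the answer):

```
; zone file fragment (sketch)
$TTL 86400
www   IN   A   203.0.113.10
```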

freejung

3:25 pm on Mar 26, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Is there a way to cache a whole page?

Unfortunately I don't know Joomla, but I've done a fair amount of work to optimize the performance of a CMS-driven site recently and IMO this is by far the most promising idea you could be looking at. Investigate this and find a way to cache your pages if the content is static. If the content is truly dynamic, look into whether it is possible to do SQL query caching and cache the result sets of common queries to reduce database load. I have no idea how that would work in Joomla. I use MODx, which does very well in this respect.

Set that up first, then start looking at other things. I bet the speed gain to be had by caching is an order of magnitude greater than any other wins you can get at this point.

Edit: a quick search reveals that there are extensions for both page cache and query cache for Joomla.

philooo

5:47 pm on Mar 26, 2011 (gmt 0)

10+ Year Member



Simple answer: use a Content Delivery Network (CDN) provider; there are plenty out there at decent prices.

First, serve all your images from there. Then, if you want to be more ambitious, you can try to serve your html pages from it as well, but for that you will need a more robust provider offering a 'processing rule engine' that lets you bypass the CDN for special requests.

It can get expensive, but use a cheap one for images (± $0.05/GB) and a better one for more complex requests (± $0.25/GB).

And with a CDN you also don't have to worry about geographic location optimization: your site will load fast everywhere in the world... or almost everywhere ;)

IanKelley

9:08 pm on Mar 26, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Looking at the point of "first load takes a lot of time"


I must have skimmed right over that, good call. Another possibility would be that something on the page or in the scripts is contacting a slow external site, but only once for each visitor.

JAB Creations

11:24 pm on Mar 26, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



1.) Don't depend on or use frameworks (client- or server-side); they often make poor use of the language, enable many things you won't utilize, and create excessive bulk. I would know: I've seen many framework-driven sites on the same physical server as my own site's account. When I had to switch to the new host my response times TRIPLED because of other people's ignorance or neglect. If you have a 3,500-page site, you're very vulnerable to reinforcing the thought that you can't move away from the frameworks you may use. If you care about performance, this would be the second place to start, unless it affects your ability to cache.

2.) Cache. If your site is database driven, then YOU are entirely responsible for making caching work on any and every dynamic file. Start with clearly static content (e.g. images) and work your way through the most common script files.

My site, for example, has two script files for everything. The first, index.js, contains all the static JavaScript functions. The second, onload.js, contains global variables that may change, plus the anonymous onload event function. I've even been able to cache the onload file by using a PHP session variable to append a ? and the Unix timestamp inside the script element, forcing the browser to reload onload.js when the user changes their preferences, even if the cache has not expired. It really doesn't get any better than that.

Whether you can cache dynamic pages depends on what appears on the page. Do you have headers and footers that change frequently? If so, load them via AJAX and cache the page if the rest of the content doesn't change. Store content in a database? Store the last-modified Unix timestamp in the row with the page content and use that to generate a Last-Modified header.

If your site is free of frameworks and your coding practices are exceptionally strict, getting the cache to work can DRAMATICALLY reduce the number of file requests. The fewer the file requests, the fewer instances in which the server's resources have to be utilized. I haven't fully implemented caching on my site, though I've got it down to a single URL load per page request, whereas it used to load several files per page request.

3.) Use the (X)HTML base element. If you can't run your site locally on your work computer's HTTP server (e.g. Apache) while disconnected from the internet, then WHOA! I test everything locally, and if it's broken on my computer it's not going to be uploaded to my live site until it's fixed and working. To do this effectively, all you have to do is take advantage of your scripting language's ability to know your domain name.

I use two $cms class variables, $cms->base1 and $cms->base2. The reason?

Local...
$cms->base1 = 'http://localhost/my_site/version_2.9';
$cms->base2 = '/';

Live...
$cms->base1 = 'http://www.example.com';
$cms->base2 = '/';

The base element works in conjunction with image, link and script elements. The base element is applied first, so the content elements need only carry the second part of the URL; together they form an absolute URL.

So, for example...
$cms->base1 = 'http://localhost/my_site/version_2.9';
$cms->base2 = '/';

Translates into...
<base href="http://localhost/my_site/version_2.9/" />

Added with...
<img src="images/happy_widgets.jpg" />

Becomes...
http://localhost/my_site/version_2.9/images/happy_widgets.jpg

This makes testing locally exactly like testing on the live server, minus any server-specific issues (e.g. not having a specific PHP module installed). It eradicates most other problems and really makes development much smoother.

4.) Debug PHP. I HIGHLY recommend Xdebug used in conjunction with WinCacheGrind. You can capture the time in milliseconds it takes the server to compile a page and explore where the most time is being spent. My site compiles locally in under 60 milliseconds; the live servers are loaded, however, and there it takes, I'd guesstimate, about 200 milliseconds because of other people's sites (shared server). Your goal, regardless of how many pages you have, is to knock your compile time to under 100 milliseconds locally, presuming you have a decent machine (and, if it's Windows, a well-optimized one). In fact I'd aim for 80 ms or less if possible. Humans can notice a delay of about 200 milliseconds, and I clearly noticed the difference when I changed hosts. You'll likely find that the time shown in the debugger is either sucked up by a few lines of code or, if nothing specific is heavy, by the framework's mere presence.

Those are the things I would check personally if I were dealing with a site acting slowly. Trust me, it's usually the software and not so much the hardware, especially these days, unless someone really borks the hardware configuration.

- John

shri

3:14 pm on Mar 27, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We deliver several million Joomla pageviews every month, over dozens of sites, and all our sites are far more complicated than pretty much any stock Joomla install. Pages are timed at 1-second load times; if we cannot achieve that, it's back to the drawing board for us.

Reason I mention this: every time a CMS is brought up, there will be critics, hand-coders, dreamweavers and a truckload of purists who will find ways to blame the CMS and not the stack.

Ian Kelley has the right approach in my opinion.

1) Create a static 50K html page, no images. How long does that take to load?

2) Create a PHP page which calls a few dozen PHP functions, loops a few times and spits out about 50K worth of data. How long does that take?

3) Create a PHP page which calls mysql_connect and mysql_select_db a few dozen times. How long does that take?

Those might give you some answers.
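One way to time those test pages, assuming you have shell access and curl installed, is curl's --write-out timers; a sketch (the file:// target is just a stand-in so the command runs anywhere — point it at your real test page's URL instead):

```shell
# Create a stand-in page; replace TARGET with e.g. http://www.example.com/test.html
echo '<html>hello</html>' > /tmp/testpage.html
TARGET="file:///tmp/testpage.html"

# dns = name lookup, connect = TCP handshake, ttfb = time to first byte
curl -s -o /dev/null \
  -w 'dns:%{time_namelookup}s connect:%{time_connect}s ttfb:%{time_starttransfer}s total:%{time_total}s\n' \
  "$TARGET"
```

Run it a few times and watch the ttfb figure: that's the one-second "wait" Firebug's Net panel is showing.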

Next look at your stack.

1) Your drives: Are you getting reasonable read and write speeds?

(I might recommend some TCP tuning, but that would be overkill as step 2.)

2) Your web server: Does it have enough processes to serve your busy site? Could your requests be queued?

3) Your PHP stack: Are you running an opcode cache like xcache? Are you able to log slow-running PHP requests (if you're on a php-fpm install you can...)? This is without doubt one of the best points to start your code review.

4) Your Joomla install: Does a barebones, out-of-the-package Joomla install on the same server give you better response times? Have you disabled modules and components and gotten a better or worse response? Some modules are horribly coded and cause a lot of lag.

5) Your MySQL stack: Have you got the slow query log turned on? Does your MySQL have query caching turned on? Enough memory for the various indexes, etc.? In general, I prefer to run MySQL out of memory (i.e. enough system RAM for as much of the data as possible...).

Don't start hating your CMS until you've proven that it is really the problem.

Most of the time, when your first response in Firebug's Net console is a second late, you've got system issues like a slow-responding DB or PHP mucking up.

After you've verified you don't have server-side problems, go looking for client-side issues: compressed output, CSS, JS, etc. Here you'll need to look at CDNs, minification, gzip compression and so on. Another day, another topic.

On the Joomla end, you should turn on its xcache or memcache caches if you do have the ability to.

jskrewson

4:12 pm on Mar 27, 2011 (gmt 0)

10+ Year Member



I think you are running in a VMware session on a Virtual Dedicated Server, so you are sharing disk and CPU resources with other virtual instances on the same computer.

Some of the hardware specs you printed appear to be for the entire server and not just your virtual instance. Has your host overloaded this server?

I would never run under virtualization if my website was making decent money.

robzilla

7:54 pm on Mar 27, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Consider installing a network monitoring tool such as Munin to see (roughly) how the resources allocated to your virtual server are used throughout the day (memory, cpu usage, bandwidth, etc). On one of my virtual servers with burstable CPU, it even shows me the percentage of the physical server's total CPU power that's being used by the other virtual servers on that machine.

SlickSolutionsInc

3:34 am on Mar 29, 2011 (gmt 0)

10+ Year Member



I saw some hardware stats that seemed "normal" given the number of hits. In my experience working with clients, a database with 74 tables is probably where you need to start, given the advice above and the other things you've attempted.
You are committed to the current design, so the biggest, quickest bang for the buck is to look at indexes vs. queries (WHERE clauses, JOIN ON, foreign keys, ORDER BY clauses). Of course all tables need a primary key, natural or surrogate. If you use surrogates, you definitely need indexes that satisfy ALL the queries. Most databases have a way to monitor index statistics, so you should be monitoring these daily/weekly to understand the trend and adjust accordingly.
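As a sketch of what matching indexes to queries looks like in practice (the table and column names here are Joomla-style but hypothetical; check your own slow queries):

```sql
-- Ask MySQL how it will execute a typical listing query.
EXPLAIN SELECT id, title
FROM jos_content
WHERE catid = 12
ORDER BY created DESC;

-- If the plan shows a full scan plus "Using filesort", a composite index
-- covering the WHERE column and the ORDER BY column usually fixes both:
CREATE INDEX idx_catid_created ON jos_content (catid, created);
```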

Nice thread, good question. I need to study up on relative vs absolute URLs while I've got my DBA hat on. :)

glimbeek

5:45 am on Mar 29, 2011 (gmt 0)

10+ Year Member



WOW!

Thanks for the massive replies! A lot of things to look into, which I will be doing today :)

I'll keep you updated.

glimbeek

8:25 am on Mar 29, 2011 (gmt 0)

10+ Year Member



Update:

Trace, if you could send me that "best practice" document that would be great! Or send me a PM or something, so I can give you my email address.

All the images I use have the correct size, AKA they don't get scaled by HTML, and are "Saved for Web" using Photoshop at 60% quality. So they are optimized.

All my JavaScript is in one file, which gets cached now (on my test domain), as is my single CSS file. See previous posts about expires headers, which I'm using now on my test domain. I also optimized those files by removing whitespace and comments (reduced size).

Have you tried a reverse proxy? mod_cache has done miracles for me.

I'm using expires headers now. Is that enough? Am I using mod_cache now? I'm completely new to all this.

I tested a "clean" Joomla! install, same version but with default content. That loads on average in 1.5 seconds.
I also tested the slow website on a dedicated server I have with the same host, and there my slow website loads on average in 1.8 seconds. Does this mean the current server can't cope?

Create a very basic HTML page. No PHP, no ads, just a static HTML file. Put some images and javascript in if you like but nothing that contacts an external site (i.e. feeds, ads).

You can rule out Joomla by creating a copy of a slow page as a static HTML file and testing it on its own. In other words, the page would have the same HTML/Javascript/CSS/Images that your CMS is outputting, but you would be serving it as an HTML file instead of through PHP.

I tested the frontpage of my slow website as static content, and that loads on average in 0.54 seconds (on the same server).

Run some ping tests from multiple locations and see what kind of latency there is. If you can determine that there is dramatically higher latency from certain locations then that would point to a network issue at the NOC or one of their providers.

As I am unable to test from different locations, I used host-tracker.com for this as well; it seems logical to stick to the same testing tools. I did go to [<server-ip>...] instead of just the IP, which is a default Apache page.
With this test I got an average of 0.325 seconds. This is basically a static content request, right? Just as I did with the static version of my "slow" website frontpage.

Looking at the point of "first load takes a lot of time", I am wondering if the DNS system is resolving your domain quickly enough. What nameserver are you using?

I'm using eukdns.com nameservers.

Is there a way to cache a whole page?

I looked into Joomla! solutions for this; they don't really suit my needs. I was looking more for a way to do this with Apache, if that's possible.

I must have skimmed right over that, good call. Another possibility would be that something on the page or in the scripts is contacting a slow external site, but only once for each visitor.

Yeah, there's Google Analytics, Clicktale, and 2 external javascripts that load in 2 different images. I'm looking into making the 2 javascript files load locally. The images can't be done; they change dynamically.

Debug PHP. I HIGHLY recommend Xdebug used in conjunction with WinCacheGrind.

As I know close to nothing about Linux/Unix, I doubt I can get this to work. Next to that, I'd have to install and test it on a server that's running a live website...

1) Create a static 50K html page, no images. How long does that take to load.

Tested (sort of), see above.

2) Create a PHP page which calls a few dozen PHP functions loops a few times and spits out about 50K worth of data. How long does that take?

I'll have to look into that, although I doubt this will take long, as a "default" Joomla! install runs "fast" enough.

1) Your drives: Are you getting reasonable read and write speeds?

How/where can I test this? I'm on a WHM VPS.
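(For anyone with the same question: assuming SSH access and GNU dd, a crude sequential write test looks like this. It only writes a scratch file in /tmp, and the MB/s figure it reports is a rough indicator, not a real benchmark:)

```shell
# Write 64 MB and let dd report throughput; fdatasync forces a real flush
# to disk so the number isn't just the OS write cache.
dd if=/dev/zero of=/tmp/dd_speed_test bs=1M count=64 conv=fdatasync
rm -f /tmp/dd_speed_test
```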


2) Your web server: Does it have enough processes to serve your busy site? Could your requests be queued?

How/where can I test this? I'm on a WHM VPS.

3) Your PHP stack: Are you running a binary cache like xcache? Are you able to log slow running php queries ( if you're on a php-fpm install you can .. ) - this is without doubt one of the best points to start your code review.

I'll look into this, but like I said. I know next to nothing about (l)unix and I'm hesitant to "try this out" on a server which is running my live website.

4) Your joomla install. Does a barebones out of the package Joomla install on the same server give you better response times? Have you disabled modules and components and gotten a better or worse response? Some modules are horribly coded and result in lots of lags.

Tested above. What I could try is installing the most "heavy" extensions on it one at a time and testing after each install.

5) Your mysql stack: Have you got slow query logs turned on? Does your mysql have query caching turned on? Enough memory for various indexes etc? In general, I prefer to run mysql out of memory ( i.e. enough system ram for as much of the data as possible... )

As far as I can tell the MySQL query cache is on, but it could be increased. phpMyAdmin is showing some things in red; I still need to look into this. Things like the query cache, and queries that take too long because of bad/damaged indexes.
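For reference, the knobs mentioned above live in my.cnf; a sketch (variable names from MySQL 5.1+, and the sizes are guesses to tune against your own data, not recommendations):

```ini
[mysqld]
# log any query taking longer than 1 second
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/slow.log
long_query_time     = 1

# cache result sets of identical SELECTs
query_cache_type    = 1
query_cache_size    = 32M
```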

Don't start hating your CMS until you've proven that it is really the problem.

I don't! I use Joomla! on most of my websites and 99% of those run as smooth as silk. This slowness is limited to this one site. And like I said, a default Joomla! install with default data seems to run just fine. So Joomla! itself surely isn't the problem.

On the Joomla end, you should turn on its xcache or memcache caches if you do have the ability to.

I'll look into this.

I think you are running in a VM Ware session on a Virtual Dedicated Server. So you are sharing disk and cpu resources with other virtual instances on the same computer.

Yeah, it's a cPanel-on-WHM install on a VPS, which is in a cluster. And my host assures me it isn't the server, but that's what they said last time as well...

Consider installing a network monitoring tool such as Munin...

Same issue as above: I don't have the knowledge.
Clearly this is the way to go, though... I'll look into getting someone who can help me out with this.

You are committed to the current design so the biggest quickest bang for the buck is to look at indexes vs queries...

I checked my DB and it does have some issues. Some tables need to be repaired; it seems some of the indexing is wrong. For instance, the row counts in some of the indexes are wrong.
If I "repair" this (by using the REPAIR command), could it go horribly wrong? None of the data in the tables seems to be damaged.

Cheers for all the suggestions and tips. They raised a lot of new questions from my side and they gave me a lot of new things to look into.

One last thing (so far)...
What strikes me as odd: using host-tracker.com you can see which "host" takes a long time. For instance, one host from SPb, RU takes 12.64 seconds, but a different host from SPb, RU takes 4.18 seconds. Or a host from Moscow, Russia takes 6.48 seconds, while a different host from Moscow, Russia takes 1.04 seconds. Going to the States: in Dallas, TX, one takes 10.1 seconds while another takes 3.44 seconds. Amsterdam, Netherlands takes 8.04 seconds, and my server is located in Amsterdam, Netherlands!

Roubaix, France: one takes 0.75 seconds, the other 1.43.
Bangkok, Thailand only took 0.38 seconds.

This is all from the same test. How can one take 12.64 seconds while another takes 0.38 seconds? All the hosts check the website in one go, one after the other.

mryon_z

10:34 am on Mar 29, 2011 (gmt 0)

10+ Year Member



Yup, I like Joomla; up until now I've been using Joomla for my blog.

Maurice

3:52 pm on Mar 29, 2011 (gmt 0)

10+ Year Member



From memory, this is what I did in the past:

Turn on caching in Site -> Global Config.

Turn off stats, again in the Site -> Global Config menu.

For all of your modules, enable caching (if the module supports it).

ergophobe

8:49 pm on Mar 29, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Debug PHP. I HIGHLY recommend Xdebug used in conjunction with WinCacheGrind.

As I know close to nothing about (l)unix, I doubt I can get this to work.



You don't want to do this on a live site anyway. It will slow the site to a crawl, because you'll be writing cachegrind logs to disk with every function call (and there will be thousands for a Joomla page).

What you want to do is get the site running on a testbed server and install Xdebug plus KCachegrind or WinCacheGrind (depending on OS). There are some PHP functions that are slow on Windows but fast on *nix, because they run natively on the one but not the other; you can still profile on Windows, though, and look for bottlenecks.

You want to look for:
1. a function call that takes a huge amount of time to complete, which likely means it is calling a slow query;

2. a custom function call (i.e. not native PHP) that isn't that slow, but is being called hundreds or thousands of times.

Figure out how to optimize it, or do without the feature that is resulting in so many calls. It's a firehose of data, but it can be a real eye-opener.
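For reference, a php.ini sketch for the testbed (Xdebug 2.x settings, which were current at the time; the paths are hypothetical). The trigger setting means nothing is profiled unless you ask, so ordinary page loads stay usable:

```ini
; load the extension (adjust the path to your build)
zend_extension = /usr/lib/php5/xdebug.so

; profile only when the request carries XDEBUG_PROFILE
xdebug.profiler_enable         = 0
xdebug.profiler_enable_trigger = 1
xdebug.profiler_output_dir     = /tmp/cachegrind
```

Then request a page with ?XDEBUG_PROFILE=1 appended and open the cachegrind.out.* file it drops in that directory with (Win)CacheGrind.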

phranque

5:31 pm on Mar 30, 2011 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



have you tried comparing a ping of the hostname vs a ping of the web server's IP address?

Demaestro

6:58 pm on Mar 30, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Do you have any third party components installed?

Virtuemart I know can sometimes double page load times. It does some weird stuff writing to TEMP tables.

Lorel

11:49 pm on Mar 30, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Have you checked your code for code bloat? CMS programs are notorious for it.

phranque

1:15 am on Mar 31, 2011 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



you're getting lots of great suggestions for optimizing your server and site, but if in fact the problem is a long wait before the first HTTP GET request is answered, then it doesn't matter how fast your CMS, db, page rendering, etc. are.

Seb7

11:10 pm on Apr 8, 2011 (gmt 0)

10+ Year Member



I'm with phranque on this. Everyone seems to be ignoring the 1-second request time. One second is far too long for a server to respond; it should be milliseconds. A server of this spec should not be this slow. Something is seriously wrong with the server.

Your server does seem to be shared with other websites, as it's running VMware. You can check this, as there are websites out there that will tell you how many websites are running from a single IP address.

I would also like to mention that you should put any analytics code at the bottom of the page.

phranque

5:59 am on Apr 9, 2011 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I would also like to mention that you should put any analytics code at the bottom of the page.


if you are using the newer(ish) asynchronous analytics code, it goes in the head.

http://code.google.com/apis/analytics/docs/tracking/asyncMigrationExamples.html#migrationInstructions [code.google.com]:
Insert the asynchronous snippet at the bottom of the <head> section of your pages, after any other scripts your page or template might use. One of the main advantages of the asynchronous snippet is that you can position it at the top of the HTML document. This increases the likelihood that the tracking beacon will be sent before the user leaves the page. We've determined that on most pages, the optimal location for the asynchronous snippet is at the bottom of the <head> section, just before the closing </head> tag.

IanKelley

9:49 pm on Apr 10, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Everyone seems to be ignoring the 1 second request time.


He was able to reduce the request time by using a static HTML page instead of his CMS.

It was still high, though... so it's a combination: his CMS, or something he's doing with his CMS, is inefficient. But that's not the whole issue. Something could be wrong with the server, but it could also be a network issue.

Try this... stick the site on a friend's server and see what happens. If it runs significantly faster there, switch to a new host and forget about trying to track this issue down.