Website performance issues
glimbeek - msg:4275230 - 1:06 pm on Mar 2, 2011 (gmt 0)

Hi,

I'm running a Joomla! website and I'm experiencing some performance issues.
I ran into this problem a while ago, and after extensive testing the host figured out it was an issue with the hard-drive combined with a CPU that couldn't cope and too little memory. So they gave me a new server with a new hard-drive, twice the CPU power and twice the memory. It's a VDS in a cluster setup, so it should be able to "run" my website.

My website performed OK then, not great, but it's a "big" site, sort of... 6 languages/different front-pages, around 3500 pages. So I figured it was as good as it was going to get back then.

Fast forward a year and my website is slow again. And by slow I mean: if I use host-tracker.com to check the average load times, I get an average load time of around 4 seconds.

A simple test that I did last year, and with which I proved to my host that it was the server and not the website, was to install a "clean" Joomla! with example data (the same version of Joomla! as my main site) and test it the same way I test my main page. Back then the clean Joomla! installation performed just as badly as my main site. This time around, however, the clean Joomla! installation has an average load time of 1.35 seconds according to host-tracker.com. So that leaves me with figuring out how to solve this for my main site.

What I did so far:
I turned on gzip/mod_deflate so I compress .html, .php, .css and .js files.
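
(For reference, a typical .htaccess mod_deflate setup is sketched below; it assumes mod_deflate is loaded and that the MIME types match how the server labels these files:)

<IfModule mod_deflate.c>
  # Compress generated HTML/PHP output plus plain-text assets
  AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/x-javascript
</IfModule>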

What I could do is reduce the number of requests; currently the slowest page (the one I tested the most) has 40 requests, most of which are images. Reducing the number of requests should make a difference even though the page itself is "only" 223 KB.

I use something called ClickTale, which slows down the page a little. I also use Google Analytics, which of course slows things down a little as well. And I load in 2 images from an external source by using javascript, which of course slows things down a little too. I'm perfectly aware that I won't get below 1000ms for a request, but getting below 2000ms or around 1500ms should be achievable.

I tested with:
host-tracker.com
webpagetest.org
Yslow (for Firebug)
Pagespeed (for Firebug)
and with Firebug itself.

Here come the questions:
The thing that strikes me as most curious, if I look at the Firebug -> Net results, is that there's an average 1000ms "wait" before anything happens. So the first GET request takes on average about 1000ms. On the clean Joomla! installation this is "only" 500ms. Still seems a bit much to me, but I could be wrong. Is this normal?

My host told me NOT to use absolute URLs for images, because that would lead to more requests. This seemed odd to me, but I tested it anyway and, as far as I could tell, it made no difference. Should it make a difference if I use relative URLs for images instead of absolute URLs? Should this result in fewer requests, AKA faster loading?

To stick with the images... is there a way to set caching for images? I read about setting a "cache" time for all images but can this be done for only certain images? If so, how?

Is there a way to cache a whole page? AKA make a shadow copy and serve that to the visitor instead of forcing the server to "build" the entire page every time someone visits my website. Then I could renew/update the shadow copy every 5 minutes to reduce the load on the server.

Because it's a Joomla! website with a few extras, there are 74 tables. The biggest table has 35000+ rows. This isn't that much, is it? I emptied the 2 tables with the most rows (2 tables which I could empty) and that made no difference to the page speed.

Next to that, there are a few tables that need to be repaired according to my SQLyog tool. Should this make much difference? I'm not too keen on repairing them, as I do not know what will happen exactly.

I could put the images in a sprite, but how does this affect SEO for the images?

I'm no Unix/Apache wiz. So I was wondering if there are any logs/backup/cache/tmp files or folders that could slow down my server?

Last but certainly not least... Is there an easy way to properly check server load or bottlenecks? Last time my host checked with a tool called Munin, but that didn't tell them much. They ended up asking someone from cPanel to take a look at their server setup, and he was the one who figured out that the server simply couldn't cope as it was set up.

Any hint/tip or help is greatly appreciated.

 

glimbeek - msg:4275233 - 1:11 pm on Mar 2, 2011 (gmt 0)

Some additional tech info:
Database Version: 5.1.54
Database Collation: utf8_general_ci
PHP Version: 5.2.16
Web Server: Apache/2.2.17 (Unix) mod_ssl/2.2.17 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635
Joomla! Version: Joomla! 1.5.11 Production/Stable [ Vea ] 03-June-2009 03:30 GMT

ergophobe - msg:4275304 - 3:06 pm on Mar 2, 2011 (gmt 0)

Absolute URLs are going to force the request to go through http, rather than through the much faster file system.

You might have a look at these for speeding up rewrites:
[webmasterworld.com...]
[webmasterworld.com...]

You could look into the slow query log and things like that, but it sounds like your performance issues are happening before the first database query is even sent. Figuring out that sort of bottleneck is beyond my experience.

jdMorgan - msg:4278503 - 9:20 pm on Mar 8, 2011 (gmt 0)

Let's clarify one point here: It makes little difference whether you use page- or server-relative or absolute (canonical) URLs in links on your pages to include objects such as images, CSS files, and external JavaScript files -- These things are included on the page as it is rendered by the browser, and so by definition, must be requested by the browser using HTTP. It is the browser (HTTP client) that resolves page- and server-relative links to the canonical URLs that it must use to make a request to your server.

A short session with the Live HTTP Headers add-on for Firefox will show the above to be true.

What makes a big difference is the method used for "script includes" -- the server-side inclusion of "pieces of common script code" shared among several page-generation scripts. These includes are done before the HTML page is generated and sent to the client. These includes should never be done using HTTP URLs, as the result will be that the server sends HTTP requests to itself while generating an HTML page for output -- which is likely awesomely slow.
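
(A minimal PHP sketch of the difference; the paths and file names here are hypothetical:)

<?php
// Good: server-side include via the filesystem - no extra HTTP request
include '/home/site/public_html/includes/header.php';

// Bad: include via an HTTP URL - the server issues a request to itself
// on every page view (and needs allow_url_include enabled to work at all)
include 'http://www.example.com/includes/header.php';
?>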

Performance problems can be caused by dozens of factors. But I agree with ergophobe in that the threads on Joomla's known .htaccess code efficiency problems and g1smd's Joomla contributions to fix those problems would be a good place to start.

Jim

glimbeek - msg:4278690 - 6:47 am on Mar 9, 2011 (gmt 0)

Thank you for the replies. I will take a look at those links.

ergophobe - msg:4279109 - 8:59 pm on Mar 9, 2011 (gmt 0)

These things are included on the page as it is rendered by the browser,


doh! Obviously... I get so used to thinking of that in terms of server side scripting. My bad!

jdMorgan - msg:4279271 - 1:27 am on Mar 10, 2011 (gmt 0)

No "Doh!" at all. I just thought it important to distinguish "client-side-include image/CSS/JS link URLs" from "server-side-include script filepaths" to clarify what you said above...

The "Doh!" only comes in when you look up a "pest" that keeps appearing in your access logs requesting your scripts over and over again all day long, and discover that it's . . . Oh my! . . . your own server's IP address! Major "Doh!," that one...

Jim

ergophobe - msg:4279872 - 8:41 pm on Mar 10, 2011 (gmt 0)

:-) Funny.

"Doh" because it was obviously a client-side request (images on the page), but I think I had the wrong hat on... or something like that.

Of course, that leaves open the question of why his hosting service would imagine that absolute vs relative URLs for images on the page would make any difference at all.

glimbeek - msg:4280102 - 7:18 am on Mar 11, 2011 (gmt 0)

I haven't had the time to look into this. Working on a multilingual comment system. When that is done I'll get back to this.

Yes, well, I guess the host isn't that great... even though they have some major clients. Thanks for keeping the discussion going, some very useful information indeed.

glimbeek - msg:4282388 - 12:27 pm on Mar 16, 2011 (gmt 0)

Getting back to this now...

The replies in this topic were all based on the relative versus absolute URLs question.

What about my other "questions"?

For instance: I'm no Unix/Apache wiz. So I was wondering if there are any logs/backup/cache/tmp files or folders that could slow down my server?

I hope that someone can help me.

ergophobe - msg:4282553 - 4:42 pm on Mar 16, 2011 (gmt 0)

To stick with the images... is there a way to set caching for images?


Use the ExpiresByType directive
[httpd.apache.org...]
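
(A minimal sketch, assuming mod_expires is available; the FilesMatch block shows one way to limit the long expiry to certain images only, and the file names are hypothetical:)

<IfModule mod_expires.c>
  ExpiresActive On
  # Cache all images for a month by default
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/gif  "access plus 1 month"

  # Or restrict a longer expiry to specific files only
  <FilesMatch "^(logo|sprite)\.png$">
    ExpiresDefault "access plus 1 year"
  </FilesMatch>
</IfModule>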

Is there a way to cache a whole page?


Many. Most CMSes have caching plugins, but I don't know Joomla. You might simply be able to add a plugin that would improve page caching.

You can also use brute force if you don't have authenticated users. Authenticated users with customized pages pose a problem for any full-page caching, but on one site where the front page took tons of processing power, I just check for a cached version; if one isn't available, I turn on output buffering, generate the page as normal, display the page, then capture the buffer contents, write them to a file with a name based on the URL (so I can check for it on the next request) and then close the buffer.

[php.net...]
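
(A rough sketch of that approach, assuming a cache directory the web server can write to and the 5-minute freshness window mentioned earlier; the paths are hypothetical and the CMS hand-off is elided:)

<?php
$cacheFile = '/home/site/cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
$maxAge    = 300; // 5 minutes

// Serve the cached copy if it exists and is still fresh
if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
    readfile($cacheFile);
    exit;
}

// Otherwise build the page as normal while capturing the output
ob_start();
// ... generate the page here (e.g. hand off to the CMS) ...
$html = ob_get_contents();
ob_end_flush();                        // send the page to the visitor
file_put_contents($cacheFile, $html);  // save the shadow copy for next time
?>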

Since writes to logs are appends, I don't think large log files would noticeably slow things down, but you could always delete your log files and see.

Also, see if Joomla has any developer tool plugins. Since you're on shared hosting, you won't have access to the MySQL slow query log, but if Joomla uses a database abstraction layer, it's possible that info could be available in some form.

When a CMS-driven site is slow, I always suspect the number of queries and perhaps one slow query to be the culprit.

You never know, though. The site I mentioned above with the slow front page was grinding to a halt because I was actually using the PHP GetImageSize function to generate proper HTML for images with height and width attributes, and it was just burning CPU time doing the image processing for no significant gain in rendering time - an example where it's worth offloading the work to the front end.

ergophobe - msg:4282559 - 4:47 pm on Mar 16, 2011 (gmt 0)

PS - if you determine that your site is slow because of Joomla's rendering times, then Xdebug and Cachegrind are your friends (for profiling). Don't do profiling on a live site, though - it will grind things to a halt.

glimbeek - msg:4282851 - 7:27 am on Mar 17, 2011 (gmt 0)

Hi ergophobe,

Thank you for the replies.

To give some more info, here are the VPS specifications as we get to use them. This should be "our" slice of the server, even though it's a VPS.

Processor Information

Total processors: 2

Processor #1: GenuineIntel Intel(R) Xeon(R) CPU E5504 @ 2.00GHz, 1995.000 MHz, 4096 KB cache
Processor #2: GenuineIntel Intel(R) Xeon(R) CPU E5504 @ 2.00GHz, 1995.000 MHz, 4096 KB cache


Memory Information

Memory for crash kernel (0x0 to 0x0) notwithin permissible range
Memory: 3892488k/3932160k available (2161k kernel code, 38380k reserved, 900k data, 228k init, 3014592k highmem)

Physical Disks

hda: VMware Virtual IDE CDROM Drive, ATAPI CD/DVD-ROM drive
SCSI device sda: 41943040 512-byte hdwr sectors (21475 MB)
sda: test WP failed, assume Write Enabled
sda: cache data unavailable
sda: assuming drive cache: write through
SCSI device sda: 41943040 512-byte hdwr sectors (21475 MB)
sda: test WP failed, assume Write Enabled
sda: cache data unavailable
sda: assuming drive cache: write through
sd 0:0:0:0: Attached scsi disk sda
hda: ATAPI 1X CD-ROM drive, 32kB Cache, UDMA(33)
sd 0:0:0:0: Attached scsi generic sg0 type 0



Current Memory Usage

                      total       used       free     shared    buffers     cached
Mem:                3896096    2404464    1491632          0     265096    1824312
-/+ buffers/cache:              315056    3581040
Swap:               1429776         68    1429708
Total:              5325872    2404532    2921340


This should be able to cope with a Joomla website, shouldn't it? I "only" get around 300 visits and 600 page views a day. That's not much, is it? The above hardware should be able to cope with those numbers. Keep in mind I don't use heavy apps that require a lot of javascript.
There's only 1 javascript file I load. It's for the Superfish menu and it's 32.2 KB after compression. I load in the Google Analytics stuff as well...
The biggest image I have is 39.5 KB. The webpage in total after compression is 215.2 KB.

"Next to that ,there are a few tables that need to be repaired according to my SQLyog tool. Should this make much difference? I'm not to keen on repairing them, as I do not know what will happen exactly."

glimbeek - msg:4282971 - 2:48 pm on Mar 17, 2011 (gmt 0)

I "figured out" how to use expire headers:

# Requires mod_headers for the Header directive
# 1 YEAR (note: 29030400 seconds is actually about 48 weeks)
<FilesMatch "\.(flv|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav)$">
Header set Cache-Control "max-age=29030400, public"
</FilesMatch>

# 1 WEEK
<FilesMatch "\.(jpg|jpeg|png|gif|swf)$">
Header set Cache-Control "max-age=604800, public"
</FilesMatch>

# 3 HOURS
<FilesMatch "\.(txt|xml|js|css)$">
Header set Cache-Control "max-age=10800, public"
</FilesMatch>

# NEVER CACHE
<FilesMatch "\.(html|htm|php|cgi|pl)$">
Header set Cache-Control "max-age=0, private, no-store, no-cache, must-revalidate"
</FilesMatch>

But is this really needed? Don't browsers nowadays cache images etc. by default anyway?

And is there another way to do this? I know that you can add this to the Apache .conf file, but can it be done by using PHP for instance? And what would be the best approach?
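
(For what it's worth, script-generated responses can send the same header from PHP before any output; a minimal sketch, not Joomla-specific:)

<?php
// Must be called before any HTML is echoed
header('Cache-Control: max-age=604800, public'); // cache for 1 week
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 604800) . ' GMT');
?>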

ergophobe - msg:4283138 - 5:33 pm on Mar 17, 2011 (gmt 0)

All I can say is that if you're having performance problems, you'll need to look at it holistically:

- watch the server. What does "top" return?
- look at your raw server logs. Perhaps robots are hammering your site.

These are unlikely explanations, but since they take 2 minutes, why not start there?

Then it's time to get real.


- start profiling. This is usually where I start, because this is where you get huge results, at least in the first iteration. You might have a small feature on your site ("Who's online" is often an offender, depending on how it's implemented) that results in hundreds of function calls, or possibly a very long query that is taking a long time.

One problematic query can account for 90% of page generation time even if there are 50 queries and 100 file includes to generate a page. Sometimes it's a simple matter of joining on a column that is not indexed.

This goes to your damaged tables question. If a table has text or varchar columns (i.e. variable-length columns) and is not indexed, or the index is damaged, and you're joining on a couple of these columns, you could be dragging things down a lot. On the other hand, if it's a table that only gets queried when you're in the admin view and isn't even invoked for regular users, it's unlikely to be the source of your problem.


- stress test it with static file requests (i.e. take Joomla out of the equation). Maybe your host has a node that's getting hammered and your site and server have nothing to do with it. Check out my post here: [webmasterworld.com...]

- if the global measurements are good with static files, then it's not likely a network problem or fundamental server problem, but a site problem. Benchmark again on a couple of different page types.

g1smd - msg:4283250 - 8:11 pm on Mar 17, 2011 (gmt 0)

Slightly off-topic, but Joomla related, I'd appreciate someone testing the ".4" version of the new .htaccess files available for 1.5 [joomlacode.org...] and 1.6 [joomlacode.org...] as these should noticeably increase performance.

In addition, there's a set of "security" options that can be added on. These are at [docs.joomla.org...] and I have made more than 100 changes to that code in the last few weeks: [docs.joomla.org...]

jdMorgan - msg:4283430 - 1:40 am on Mar 18, 2011 (gmt 0)

Not off-topic at all. In fact, the changes g1smd has made to the Joomla .htaccess code may shave a second off your page-load times -- or maybe even fix your problem completely.

The original Joomla code was very inefficient. It was "correct" as far as it goes, but the author apparently didn't think in depth about what that code would actually do on a live server -- just dashed off the code and went to work on something "more important". As a result, every single HTTP request to the server results in at least two filesystem reads to check for "file exists". On a server that is suffering thrashing, this may mean at least two and up to three physical disk reads per HTTP request.

The WP code and the code for several other CMSes suffer from the same basic design flaw.
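
(For illustration, the front-controller rewrite pattern these CMSes commonly ship looks roughly like this; each RewriteCond here is a filesystem check performed on every request:)

RewriteEngine On
# "If the request is not an existing file..."
RewriteCond %{REQUEST_FILENAME} !-f
# "...and not an existing directory..."
RewriteCond %{REQUEST_FILENAME} !-d
# "...send it to the CMS front controller."
RewriteRule . index.php [L]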

So, no. Not off-topic at all...

Jim

glimbeek - msg:4283563 - 8:57 am on Mar 18, 2011 (gmt 0)

First off..

Thank you everybody for the extensive replies!

To get back to my image caching question: does this still work? Is it useful? Or can I just ignore this?

To reduce the number of requests, I made a sprite. By doing this I went from 40 requests to 16, which is a vast improvement. Although the sprite itself is 118 KB, which is about twice the combined size of the individual images. Is there a way around this?

top / processes
I checked the Process Manager in WHM:
<my-user> ranges from 0% up to 44% CPU usage,
and mysql has a consistent 8.7% CPU usage.

Daily process log for 17-03-11:
You have 2 CPUs; therefore, these CPU percentages are divided by 2 to indicate the true percentage of all CPU power used.

User/Domain       %CPU   %MEM   MySQL Processes
mysql             4.47   0.40   0.0
<my-user>         3.26   1.03   1.7
nobody            0.05   0.23   0.0
root              0.03   2.60   1.9
mailman           0.01   0.00   0.0
<my-user>         0.00   0.52   0.0
named             0.00   0.10   0.0
68                0.00   0.00   0.0
avahi             0.00   0.00   0.0
dbus              0.00   0.00   0.0
mailnull          0.00   0.00   0.0
eximstats         0.00   0.00   0.0
unauthenticated   0.00   0.00   0.2
leechprotect      0.00   0.00   1.0

Top Processes
UserDomain% CPU &#9652;Process
<my-user> 60.0% /usr/bin/php /home/<my-user>/public_html/index.php
<my-user> 55.0% [php]
<my-user> 52.0% [php]
mysql 9.2% /usr/sbin/mysqld --basedir/ --datadir/var/lib/mysql --usermysql --log-error/var/lib/mysql/server.mysite.com.err --pid-file/var/lib/mysql/server.mysite.com.pid
mysql 9.1% /usr/sbin/mysqld --basedir/ --datadir/var/lib/mysql --usermysql --log-error/var/lib/mysql/server.mysite.com.err --pid-file/var/lib/mysql/server.mysite.com.pid
mysql 9.0% /usr/sbin/mysqld --basedir/ --datadir/var/lib/mysql --usermysql --log-error/var/lib/mysql/server.mysite.com.err --pid-file/var/lib/mysql/server.mysite.com.pid
root 8.6% cpanellogd - updating bandwidth
mailman 3.0% /usr/local/cpanel/3rdparty/bin/python -S /usr/local/cpanel/3rdparty/mailman/cron/disabled
root 3.0% /usr/bin/perl /scripts/restartsrv_tailwatchd --check
root 1.8% /usr/bin/perl /usr/local/cpanel/bin/leechprotect
nobody 1.0% /usr/local/apache/bin/httpd -k start -DSSL
nobody 0.5% /usr/local/apache/bin/httpd -k start -DSSL
nobody 0.3% /usr/local/apache/bin/httpd -k start -DSSL

PHP seems to be having a field day...

Static content
A subpage with no images in the content other than the "default" images needed for the menu/site background, etc., and less "special" content (like latest news) is on average a full second faster, but still 3.9 seconds.

Logs
I checked the (access) logs and as far as I can tell, there's nothing weird in them.

Static content
I did a test with the content from my frontpage, but with only static content, AKA a basic .html file: 0.59 seconds average load time. So the server seems to be fast enough.

Database
I run a copy of the site on a subdomain, which I use for testing, and it performs in the same manner as the "live" site. On the subdomain there are no database errors, so I doubt a database issue is causing all of this.

.htaccess
I will give the new .htaccess a try without any of my custom .htaccess code and see if that makes a difference.

**EDIT**
g1smd, just to make sure which version should I use?
[downloads.joomlacode.org...] right?

glimbeek - msg:4283593 - 10:06 am on Mar 18, 2011 (gmt 0)

I used [webmasterworld.com...] and I tested it on my subdomain.

On average it loaded in 4.49 seconds. With the "old" version of the .htaccess file combined with 700+ lines of custom .htaccess code it loads in 4.60 seconds.

If I load the page and check what happens with Firebug -> Net -> Everything, it takes around 1 second just to process the first GET. Of which 725ms is "waiting" and 256ms is "receiving".

At times it even takes 4~5 seconds for the first GET.

glimbeek - msg:4283606 - 10:52 am on Mar 18, 2011 (gmt 0)

To answer one of my own questions about image caching:

"If a browser receives an image with the cache control headers that say the image can be considered fresh for 2 weeks, then for 2 weeks the image can be pulled directly from the browser's (or proxy's) cache on subsequent requests.This is noticeably faster than even a conditional GET and a 304 response from the server since there is no round trip. After two weeks, a conditional GET would be sent to the server to check the Last-Modified date, then again, no requests would be made for the duration of the specified freshness period."

In short it's useful and I should use it.

Trace - msg:4287549 - 7:27 pm on Mar 25, 2011 (gmt 0)

I created a "best practices" type of document for the client side of things that I will gladly share if you think it can help.

It's pretty basic stuff and mostly links to other articles on the matter (so I'll have to sticky it to you). It basically covers anything to do with HTTP requests, like image sprites, CSS and JavaScript.

sun818 - msg:4287550 - 7:35 pm on Mar 25, 2011 (gmt 0)

How about resizing images so they require fewer platter reads? If the filesystem is partitioned into 4k clusters, an image that is 5k requires two cluster reads.

I imagine this is the same for php code?

Probably not a big savings but cumulatively it has impact.

explorador - msg:4287603 - 9:37 pm on Mar 25, 2011 (gmt 0)

When a CMS-driven site is slow, I always suspect the number of queries and perhaps one slow query to be the culprit.

I would also look into that.

What about my other "questions"?

I manage one site of 6,800+ pages; it's not Joomla, but I can comment on those "other questions".
Pictures.

How about resizing images so they require fewer platter reads? If a cluster is partitioned at 4k, an image that is 5k requires two cluster reads. I imagine this is the same for php code?

Probably not a big savings but cumulatively it has impact.


Sprites? Yes. I went for it and it made a lot of difference. Just one request and no waiting for several images; it just loads. Of course this works only for design/layout images. The rest (pictures, illustrations for the articles), no change. Then I copied all the images to my hard drive and used different software to reduce the size of them all while keeping acceptable visual quality. I reduced the size of many folders this way, the whole page loaded faster, with less time spent retrieving data. A lot of the images were "resized" via HTML and, while being big, appeared very small.

Then I converted all the JS I could into a few files, and used a src attribute to include them as external files. It also helped a lot in my case, as those scripts stay in the cache (just like the images).

Is there a way to cache a whole page?

After a search I see there is a way to manage this under Joomla. There are some posts about that on their forums.

What about plugins? Perhaps one you are using is causing too much load and work? I'd turn them off one by one to look for any difference.

I created a "best practices" type of document for the client side of things that I will gladly share if you think it can help.

Trace, I would be more than glad to read your document.

TheMadScientist - msg:4287611 - 10:21 pm on Mar 25, 2011 (gmt 0)

Then I converted all the JS I could into a few files...

That's actually a good thing to note probably, because when a browser hits a script file it loads the entire file before starting on any other files, even for a different hostname...

From the ySlow info: [developer.yahoo.com...]
The problem caused by scripts is that they block parallel downloads. The HTTP/1.1 specification suggests that browsers download no more than two components in parallel per hostname. If you serve your images from multiple hostnames, you can get more than two downloads to occur in parallel. While a script is downloading, however, the browser won't start any other downloads, even on different hostnames.

sun818 - msg:4287641 - 11:31 pm on Mar 25, 2011 (gmt 0)

What about asynchronous loading of JavaScript and other elements? Matt Cutts mentioned in a YouTube video that their analytics service is being retooled to load asynchronously.

Staggering the loading of various elements will give the perception of faster loading...

johnnie - msg:4287656 - 11:51 pm on Mar 25, 2011 (gmt 0)

Have you tried a reverse proxy? mod_cache has done miracles for me.
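
(A minimal sketch of a disk-backed mod_cache setup for Apache 2.2, assuming mod_cache and mod_disk_cache are loaded and the cache directory is writable; this belongs in the main server or vhost config rather than .htaccess:)

<IfModule mod_disk_cache.c>
  # Cache everything under / on disk
  CacheEnable disk /
  CacheRoot /var/cache/apache2/mod_disk_cache
  # Treat responses without explicit expiry info as fresh for 5 minutes
  CacheDefaultExpire 300
  # Also cache dynamic pages that lack a Last-Modified header
  CacheIgnoreNoLastMod On
</IfModule>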

explorador - msg:4287666 - 12:52 am on Mar 26, 2011 (gmt 0)

Interesting info about JS (parallel loading and hostnames). So, now that you have a more powerful server you must be paying more money, but it is still a single server. What about serving the content (queries, Joomla) from one server and serving the pictures, CSS and JS from another server?

I forgot to add something. I know this is questionable, but on the site I mentioned I made a list of the most important things for SEO and another list of the not-so-important stuff that is repeated on every page (layout, I mean). So, as my front page had all the links to the important internal stuff, I converted 1. the very long menu, 2. the boring static footer and 3. some other random lists into external JS, compressed and with all the blanks removed.

In my case it made a lot of difference for the better. The marketing guys wanted all the menu items there (very, very unnecessary in our case). Due to the link structure there was no negative SEO impact, and those JS files only loaded once; the load time was visible only on the first page, then all the other pages went faster.

This could also be done via AJAX, keeping in mind the separation of the important SEO stuff and the rest.

freedata - msg:4287702 - 3:03 am on Mar 26, 2011 (gmt 0)

ASP.NET supports OutputCache, which is awesome. Research and see if PHP/Joomla/Apache support a similar directive that can cache the output HTML (from Joomla/PHP) at the server level and re-send the cached HTML without making calls to PHP/Joomla/the database.

Key_Master - msg:4287708 - 3:39 am on Mar 26, 2011 (gmt 0)

In addition to the suggestions above, I recommend checking the various log files in /var/log for malicious connection attempts, and closing or limiting access to non-public ports and services to authorized users only.

I'd also work on shrinking that .htaccess file.

IanKelley - msg:4287709 - 3:49 am on Mar 26, 2011 (gmt 0)

Things like this are always hard to figure out without hands on info, but I have some suggestions for simple ways to rule out possible problems...

My first guess after reading your initial post was something badly inefficient in the code or the file system.

However the process and load information you posted does not seem to support this. In order for the server to take as long as you're saying it's taking to serve requests it would have to be getting bogged down. That doesn't appear to be the case.

Load on the server IS higher than it should be for the number of requests you're serving, but it's nowhere near high enough to cause requests to be queued, which is invariably what happens when a site is slow as a result of server side issues.

So I would say that Joomla, or your particular implementation of it, isn't very efficient, but that's probably not the primary problem.

The number of requests you're serving is tiny, as are the sizes of your page elements. You should be able to serve thousands of times as many requests, even via an inefficient CMS, on that server.

In fact you probably didn't need the server upgrade at all.

I would suggest trying this...

Create a very basic HTML page. No PHP, no ads, just a static HTML file. Put some images and javascript in if you like but nothing that contacts an external site (i.e. feeds, ads).

Now test load times using the methods you tried before on this page alone. Is it still slow?

If so, the problem could be on your host's end: either their network or a hardware problem on the server (such as the NIC).

Run some ping tests from multiple locations and see what kind of latency there is. If you can determine that there is dramatically higher latency from certain locations then that would point to a network issue at the NOC or one of their providers.

If the test page is not slow, and pings are fast, then the problem is either in Joomla or your content.

You can rule out Joomla by creating a copy of a slow page as a static HTML file and testing it on its own. In other words, the page would have the same HTML/JavaScript/CSS/images that your CMS is outputting, but you would be serving it as an HTML file instead of through PHP.
