Home / Forums Index / Code, Content, and Presentation / Content Management
Forum Library, Charter, Moderators: ergophobe

Content Management Forum

How large of a Joomla site before problems?

 12:01 am on Mar 20, 2013 (gmt 0)

I was wondering if anyone had experience with very large sites in Joomla 2.5.

In general, at what point, in terms of the number of articles and database size, will you start to have problems?

I realize that there are a lot of variables, such as the server, server configuration, Joomla configuration, etc.

I would appreciate any thoughts on this.





 2:04 am on Mar 20, 2013 (gmt 0)

:: peering into crystal ball ::

By and by, g1smd or someone like him is going to come along and say that it all depends on which version of joomla's htaccess you're using and how much you've modified it. (This applies to any CMS that relies heavily on rewrites.) Things that worked fine when you had 100 people a day reading a total of 300 articles may no longer work so well when you've got 10000 people an hour searching through half a million files.
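For reference, the rewrite setup under discussion boils down to just a few lines. This is a sketch of the catch-all front-controller pattern that Joomla-style SEF URLs typically use on Apache, not the exact shipped htaccess.txt:

```apache
# Sketch of a typical CMS catch-all rewrite (not Joomla's exact shipped file)
RewriteEngine On

# Skip real files and directories so static assets bypass PHP entirely
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d

# Everything else is routed through index.php. The ".*" pattern is evaluated
# on every request, which is why heavy modification (or many extra rules
# stacked above it) starts to matter under real load.
RewriteRule .* index.php [L]
```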

travelin cat

 2:44 pm on Mar 20, 2013 (gmt 0)

We had a travel site using Joomla 1.5 years ago that had over 11,000 pages, including a forum. We never had a problem with an average of about 2k visitors a day.


 3:10 pm on Mar 20, 2013 (gmt 0)

>>which version of joomla's htaccess

Let's not get carried away here. I've had this discussion with Jim Morgan and I would say this is typically a minor effect unless you have a very large number of resources on the page and your site is otherwise I/O constrained. It is not typically the issue that will bring your site to a crawl.

In my experience with Wordpress and Drupal (and Joomla should be no different), what is going to really kill you will be

1. Very expensive queries
2. Fairly expensive queries that are repeated over and over
3. Expensive function calls that get repeated over and over
4. Front-end load - heavy JavaScript in particular.

Using the default inefficient rewrites will typically be a fairly small hit and a late-stage optimization. That isn't to say a rewrite that starts with .* can't create havoc in some situations.

Generally, how much adding more "pages" to a site affects performance is less a function of the number of pages than of how many tables need to be queried and how complex the joins are.

If you're querying a few tables - a URL lookup, a content lookup, and a user/permissions lookup - this will be super fast. Those queries will be on indexed columns, and in the case of the user and content lookups, they will typically match on an integer. Lightning fast even with many thousands of users and pages.
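As an illustration of why those core lookups stay fast, here's a small sketch in Python, with SQLite standing in for Joomla's MySQL tables and made-up table/column names mirroring a CMS-style content lookup, showing the query planner resolving a URL-alias lookup through an index instead of scanning the table:

```python
import sqlite3

# SQLite stands in for MySQL here; the schema is illustrative, not Joomla's actual one.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE content (id INTEGER PRIMARY KEY, alias TEXT, body TEXT)")
cur.execute("CREATE INDEX idx_alias ON content (alias)")
cur.executemany(
    "INSERT INTO content (alias, body) VALUES (?, ?)",
    ((f"article-{i}", "lorem ipsum") for i in range(10_000)),
)

# The URL lookup hits the index: a handful of B-tree hops, not 10,000 row reads
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM content WHERE alias = ?",
    ("article-4242",),
).fetchall()
print(plan[0][3])  # mentions idx_alias, e.g. "SEARCH ... USING ... INDEX idx_alias"
```

The same lookup against an unindexed column would show a full table scan in the plan, which is exactly the kind of query a careless plugin adds.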

If you add one plugin to your site, though, that adds another table of data and dramatically complicates the query, performance will take a significant hit.

And in terms of function calls - this can be a real eye-opener if you do some profiling with Xdebug or similar. I had a site where I was using PHP calls to getimagesize to inject proper height and width attributes into the HTML IMG tags. The pages were loading slowly (these were gallery pages that might have 50 images) and it turned out that 80% of the processing time was being spent just getting image sizes.

So in that case, grabbing the size data on data entry and storing it in the database with the file path (which I was storing anyway, so it was just a matter of adding two columns to an existing table), rather than generating it on the fly at output, gave a dramatic increase in server efficiency on the PHP end at the cost of a tiny increase in work on the MySQL end.
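The same store-it-once pattern, sketched in Python rather than the original PHP. The parsing assumes PNG files (width/height sit at fixed offsets in the IHDR chunk), and the `gallery` dict is a stand-in for the two extra database columns:

```python
import struct

def png_size(data: bytes) -> tuple[int, int]:
    """Read width/height straight from the PNG IHDR chunk (bytes 16-24)."""
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG")
    return struct.unpack(">II", data[16:24])

# Stand-in for the extra DB columns: dimensions keyed by file path
gallery: dict[str, tuple[int, int]] = {}

def add_image(path: str, data: bytes) -> None:
    # Computed once, at data entry -- not on every page view
    gallery[path] = png_size(data)

def img_tag(path: str) -> str:
    # Render time is now a cheap lookup instead of a file read per image
    w, h = gallery[path]
    return f'<img src="{path}" width="{w}" height="{h}">'
```

On a 50-image gallery page, that turns 50 file reads per view into 50 dictionary (or database-row) lookups.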

Okay, none of that really addresses your question. The main point I'm trying to make is that you can't take a number of pages and say "At this size database, things will slow down." The "other variables" you mention tend to dramatically outweigh just the size of the database.

I'm not sure if you can do this with Joomla, but with Drupal you can use the Devel module to generate fake content for your site, then load test it with zillions of pages under concurrent load and see when it fails. I would imagine you could write a simple script that would do the same for Joomla - autopopulate your DB. Keep adding 10,000 pages and 1,000 users until things grind to a halt.
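A throwaway autopopulate script along those lines might look like this. SQLite and the table layout here are illustrative stand-ins; a real version would point at Joomla's MySQL content table instead:

```python
import random
import sqlite3

WORDS = ["joomla", "article", "content", "lorem", "ipsum", "testing", "load"]

def fake_article(i: int) -> tuple[str, str]:
    # Nonsense title/body pairs, just enough to give queries realistic row sizes
    return (f"Generated article {i}", " ".join(random.choices(WORDS, k=200)))

def populate(conn: sqlite3.Connection, n: int) -> None:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS content (id INTEGER PRIMARY KEY, title TEXT, body TEXT)"
    )
    conn.executemany(
        "INSERT INTO content (title, body) VALUES (?, ?)",
        (fake_article(i) for i in range(n)),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
populate(conn, 10_000)  # bump this (and re-run your load test) until things degrade
print(conn.execute("SELECT COUNT(*) FROM content").fetchone()[0])
```

Re-running `populate` between load-test passes gives you the "keep adding pages until it grinds to a halt" loop described above.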


 3:14 pm on Mar 20, 2013 (gmt 0)

One last thought - if you put a Varnish or Squid reverse proxy in front of your site, you will also dramatically change the picture.
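For the curious, the minimal shape of that setup in Varnish is just a backend definition plus a cache TTL. A sketch in VCL 4.0 syntax, with the backend host/port assumed (your app server would move to an alternate port behind the proxy):

```vcl
vcl 4.0;

# Apache/PHP app server moved behind Varnish on an alternate port (assumed here)
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_backend_response {
    # Cache anonymous page views briefly; even a short TTL absorbs most of the load
    set beresp.ttl = 5m;
}
```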


 12:23 pm on Mar 21, 2013 (gmt 0)

Thanks everyone for the replies. That makes things much clearer. I'll take a close look at the plugins and modules and do some more benchmarking for future reference.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved