|AJAX Calls and PHP/MySQL Performance|
Can periodic database calls via AJAX harm my database?
| 8:35 pm on Sep 26, 2008 (gmt 0)|
Hello guys, I am brand new to the wonderful world of AJAX and I have a simple (yet long =[) question about calling AJAX functions periodically.
Let's say a user visits my site and logs in, and I want a portion of the page to update every second so the user knows if they have new messages. If they have 3 inbox messages, "You have 3 new messages" appears at the top of the page, but if one second later a friend sends them a message, the header instantly changes to "You have 4 new messages".
All I need to know is whether this will be bad for my site or not. I am trying to think about what is happening behind the scenes on my own web server: every second there is a database call to check for new messages. Imagine if I have 10,000 users logged in: that means 10,000 requests per second! Should I be worried about this, and if so, how could I optimize it and make it less "database-painful"?
If you're wondering why I am asking this, it's because calling AJAX functions periodically is very important not just for me, but for any other rich web application out there: it's the best way to create a user-friendly, powerful application. Think of Facebook, for instance. They have the gray bar at the bottom of every page that tells you when you have new Notifications and when your friends send you a message through the Chat program. How can I provide my users with a similar experience without making my server and database suffer under the load of DB calls?
I use PHP/MySQL and I plan on making the periodic AJAX calls with Prototype's PeriodicalExecuter object.
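For reference, the client side of this is simple. Here is a minimal, framework-free sketch of the polling callback (PeriodicalExecuter wraps the same idea; `makePoller`, `requestFn`, and `onUpdate` are invented names for illustration, not Prototype's API):

```javascript
// Minimal polling sketch. Prototype's PeriodicalExecuter(callback, seconds)
// handles the scheduling; this is the logic the callback would run.
// requestFn stands in for an Ajax.Request that calls back with the unread
// count; onUpdate rewrites the header text. Both names are illustrative.
function makePoller(requestFn, onUpdate) {
  var busy = false;                // skip a tick if the last request is still in flight
  return function tick() {
    if (busy) return false;        // don't stack up overlapping requests
    busy = true;
    requestFn(function (count) {   // e.g. a count parsed from a JSON response
      busy = false;
      if (count > 0) onUpdate("You have " + count + " new messages");
    });
    return true;
  };
}

// With Prototype, scheduling would look something like:
//   new PeriodicalExecuter(makePoller(fetchCount, updateHeader), 1);
```

The in-flight guard matters with a one-second interval: if the server is slow, you otherwise end up queuing requests faster than they complete.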
So, any ideas?
Hopefully we can have a good intelligent discussion on this matter and exchange some ideas.
| 8:38 pm on Sep 26, 2008 (gmt 0)|
I don't think it's bad.
The only time this would be a problem is if you used this method to put content into the page that you want indexed by search engines, since content loaded via AJAX doesn't get crawled.
Facebook does this in about 5 separate places on your home page and I haven't seen it crash or use up a bunch of memory on me yet.
I haven't used Prototype's PeriodicalExecuter specifically, but if it's implemented properly then it should be fine.
[edited by: Demaestro at 8:40 pm (utc) on Sep. 26, 2008]
| 8:48 pm on Sep 26, 2008 (gmt 0)|
Demaestro, thanks for the quick answer. I haven't used Prototype's PeriodicalExecuter yet either, but I will be doing some tests tonight.
My major concern is whether having 1 million users on the page, each triggering a database call every second, could harm my server, especially since I still depend on my host's servers.
And the thing you mentioned about search engines is not a big problem.
| 8:52 pm on Sep 26, 2008 (gmt 0)|
I would like to stress an interesting point here:
I googled a bit and found out that: "if you monitor Facebook.com with the console Firebug, you'll notice their ajax periodical requests run continually (as long as the AJAX timeout, default is 300sec) until an update happens, at which point it returns the JSON and starts over on another continual request."
My point is, this is amazing! For me that means I don't have to run a mysql_query() every single second for every single user: the query is only executed when an update takes place, right?
So the question is, is there a way to do this in PHP/MySQL?
| 9:13 pm on Sep 26, 2008 (gmt 0)|
Not quite, cosmo. When they say "ajax periodical requests run continually", what they mean is an AJAX request that is held open and still triggers some sort of server-side action, something like a SQL select statement run repeatedly until there's an update. The database doesn't notify the page by itself; something on the server still has to do the checking.
As your traffic grows, you will run into scalability problems quickly, especially if you have multiple such one-second checks running. All scalability problems can be solved; the catch is that solving them can be expensive.
If you are smart about this today, you can make it easier tomorrow. Consider having your AJAX function check a message queue rather than the main DB. Also, really think about that one-second interval: does your application truly need to be refreshed that often?
Also, technologies like memcached can really help your DB once you do start to grow.
| 9:15 pm on Sep 26, 2008 (gmt 0)|
cos.... yes you can and even better with MySQL it is easy.
Have a look at MySQL "query caching".
What happens is MySQL will execute a query and cache the result. Whenever any table involved in the query is modified, MySQL invalidates the cached result and re-executes the query on the next request; it tracks table changes internally, so no triggers are involved.
So if you turn on query caching for these queries, Ajax will request the data and the query will only be re-executed when it actually needs to be.
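A minimal sketch of what that looks like, assuming a hypothetical messages table with user_id and is_read columns (the settings named in the comments are MySQL's real query-cache options):

```sql
-- my.cnf: enable the query cache
--   query_cache_type = 1
--   query_cache_size = 32M

-- The polled query; an identical query string is served from the cache
-- until any write touches the messages table, which flushes those entries.
SELECT SQL_CACHE COUNT(*) AS unread
FROM messages
WHERE user_id = 1234 AND is_read = 0;
```

One caveat: the cache keys on the exact query text, so each distinct user_id produces its own cache entry.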
| 9:16 pm on Sep 26, 2008 (gmt 0)|
I don't think that will work well for this application. Under his hypothetical he has 10,000 users logged in; that means 10,000 distinct cached queries (one per user_id), and every write to the underlying table invalidates all of them.
| 9:38 pm on Sep 26, 2008 (gmt 0)|
stajer...true I hadn't really thought that one out in this use-case.
In that case I would create a view with two fields: the user_id (or inbox_id) and the count of "unread" messages.
Then just query the view. I'm not sure whether MySQL can cache view results, but it would reduce the number of direct hits on the tables.
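Something like this, assuming the same hypothetical messages table with user_id and is_read columns:

```sql
CREATE VIEW unread_counts AS
SELECT user_id, COUNT(*) AS unread
FROM messages
WHERE is_read = 0
GROUP BY user_id;

-- The AJAX handler then runs:
--   SELECT unread FROM unread_counts WHERE user_id = 1234;
```

Note that MySQL views are not materialized: querying the view re-runs the underlying query, so this tidies up the code rather than caching anything by itself.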
Sounds like a load test is in order to test some of this out.
[edited by: Demaestro at 9:39 pm (utc) on Sep. 26, 2008]
| 10:36 am on Oct 6, 2008 (gmt 0)|
I have been working with a client on a similar issue. They have a few thousand users with an e-mail-inbox-type application, and AJAX that checks the database every 15 seconds to see if there is a new update. (The database does other things too, of course, but this is MUCH smaller than your expected requirement.)
They have dedicated hardware, and so on, but it has still required careful optimisation to reduce the impact on the database server.
They have a MostRecentMessage table with UserID + the Date/Time of the most recent message (LatestMessage). This is a very "skinny" table, "covered" by the PK index, and in practice it requires no more than two disk reads to reach any user's record. (Of course most of the table is in memory anyway, but it's still a "logical read".)
Previously they did some sort of
SELECT MAX(PostDate) FROM UserMessages WHERE UserID = 1234
where the UserMessages table contained millions of messages, and this brought their server to its knees! (even with a UserID/MessageDate index)
The MostRecentMessage row is updated by a trigger on the UserMessages table: when a new row is inserted, the corresponding LatestMessage column is updated. (They are not using MySQL, and I'm not sure about the availability of triggers in MySQL, but there are other ways to make sure MostRecentMessage is always up to date, of course.)
The client keeps a cookie, or some such, with the LatestMessage value from the previous call, and in effect asks "Is there anything newer than 06-Oct-2008 11:26?" It's a skinny question, and until the answer is "yes" the client does not attempt the larger "get all new messages" query.
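In MySQL (which has supported triggers since 5.0) the same scheme might look roughly like this; the table and column names are mine, mirroring the description above:

```sql
CREATE TABLE MostRecentMessage (
  UserID        INT UNSIGNED NOT NULL PRIMARY KEY,
  LatestMessage DATETIME     NOT NULL
);

DELIMITER //
CREATE TRIGGER trg_usermessages_insert
AFTER INSERT ON UserMessages
FOR EACH ROW
BEGIN
  -- keep the skinny table current on every new message
  INSERT INTO MostRecentMessage (UserID, LatestMessage)
  VALUES (NEW.UserID, NEW.PostDate)
  ON DUPLICATE KEY UPDATE LatestMessage = NEW.PostDate;
END//
DELIMITER ;

-- The "skinny question" the client asks on every poll:
SELECT 1
FROM MostRecentMessage
WHERE UserID = 1234
  AND LatestMessage > '2008-10-06 11:26:00';
```

The polled query touches only the one-row-per-user table; the big UserMessages table is only hit when the skinny check says there is something new.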