httpwebwitch - 2:51 am on Sep 3, 2010 (gmt 0)
People say, if this is your problem, it's the best kind of problem to have.
It all started when we launched some really awesome new features. Then a certain celebrity (who I WILL NOT name) started using our app, tweeted about it, and within days membership and usage had tripled. And it keeps growing...
Problem is, our MySQL database can't handle the load any more. And MY problem is, it's MY problem. Every night, between 8pm and 11pm, the load spikes so hard that the server sometimes (not always) crashes. By day we try to figure out better ways to optimize and scale this thing, and by night we're watching the "top" stats with itchy fingers on the "reboot/restart" button.
A brief summary: the data is flowing fast, stays in our custody for a while, then we delete it. But as this app has grown, the inserts are outnumbering the deletes, and the tables are growing out of control. Ratio of reads/writes is about 45/55. During peak hours, the db is handling in excess of 200 queries per second.
How do you scale MySQL?
- vertical partitioning. Keep different kinds of data on separate machines, each fine-tuned to handle its load
- horizontal partitioning. Keep complete schemas on each machine, but split the data into chunks on each one
- Beef up the hardware
- Switch more tables over to the "MEMORY" engine. Is this a good idea?
- give up on MySQL and move everything over to something like Cassandra. Lovin the geek factor there, though I can't afford the three months it'll take to do it
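For the horizontal-partitioning option, the core piece is a deterministic router that maps a row to a shard. Here's a minimal sketch in Python — the shard DSNs and the choice of user ID as the shard key are hypothetical, just to show the idea:

```python
import zlib

# Hypothetical shard connection strings -- placeholders, not real hosts.
SHARDS = [
    "mysql://db0.internal/app",
    "mysql://db1.internal/app",
    "mysql://db2.internal/app",
]

def shard_for(user_id):
    """Pick the shard that owns this user's rows.

    CRC32 of the key, modulo the shard count, gives a deterministic
    mapping: every app server computes the same answer, so reads and
    writes for one user always land on the same machine.
    """
    key = str(user_id).encode("utf-8")
    return SHARDS[zlib.crc32(key) % len(SHARDS)]
```

Caveat: plain modulo sharding makes adding a shard painful (most keys remap), so if the shard count is likely to grow, something like consistent hashing or a lookup table is worth considering up front.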
How does Twitter do it?