Forum Moderators: phranque
I would instead think that your problem is more of a CPU limitation, as your scripts grab from the database and process the output. If you are dynamically constructing each page for each request, the biggest performance boost you can probably get is caching some of your dynamic pages - that is, generating a static version and refreshing it every so often.
Choice of refresh time would vary depending on how complex your pages are, the number of visitors, and how often the underlying data changes.
This approach gives you a lot of flexibility - not just in refresh time but also in choosing which sections of your site to cache.
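The idea above can be sketched in a few lines. This is a minimal, hypothetical example (the cache directory, TTL, and `render_page` stand-in are all assumptions, not anything from the original posts): regenerate the static copy only when its file is older than the refresh interval, otherwise serve it straight from disk.

```python
import os
import tempfile
import time

# Hypothetical cache location and refresh interval - tune per page.
CACHE_DIR = os.path.join(tempfile.gettempdir(), "page_cache")
CACHE_TTL = 300  # seconds between regenerations

def render_page(page_id):
    # Stand-in for the expensive database query + templating step.
    return f"<html><body>content for {page_id}</body></html>"

def get_page(page_id):
    """Serve a cached static copy, regenerating it once the TTL expires."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, f"{page_id}.html")
    fresh = (os.path.exists(path)
             and time.time() - os.path.getmtime(path) < CACHE_TTL)
    if not fresh:
        # Rebuild the static version; every other request in the
        # next CACHE_TTL seconds skips the database entirely.
        with open(path, "w") as f:
            f.write(render_page(page_id))
    with open(path) as f:
        return f.read()
```

In a real deployment you'd typically let the web server deliver the cached file directly rather than reading it back through the script, but the refresh logic is the same.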
I don't know precisely why it helped, but I have two theories:
1. The server was reaching the limit of the simultaneous HTTP requests it could handle.
2. The server was reaching a limit on bandwidth used during some period of time, and the ISP was throttling it down.
The first thing you should try is adding more RAM to your server - as a general rule, you should have more RAM than the size of your database so SQL Server can load the entire DB into memory (which it will do eventually anyway - commonly mistaken for a memory leak, but it's not) and still have enough RAM available for everything else. If this is possible, it will cut down on CPU load too, because the DB will be served from memory instead of wasting CPU horsepower on swap files.
Second thing that will help - convert everything you can to stored procedures. They run more efficiently, and that means faster.
Take a look at your table structure and make sure you're using the most suitable data type for each field. Prefer numbers over text (including varchar/nvarchar) whenever possible, because text fields take up more space in the DB and take longer to search than numbers do.
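To make the data-type point concrete, here is a small sketch using Python's built-in sqlite3 as a stand-in for SQL Server (the table and column names are made up for illustration): the key column is declared as an integer rather than a varchar, so comparisons and the index on it stay cheap.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Numeric key instead of a text key: integers are smaller on disk
# and cheaper to compare and index than VARCHAR values.
conn.execute("""
    CREATE TABLE hits (
        visitor_id INTEGER NOT NULL,  -- not VARCHAR
        page       TEXT NOT NULL
    )
""")
conn.execute("CREATE INDEX idx_visitor ON hits (visitor_id)")
conn.execute("INSERT INTO hits VALUES (?, ?)", (42, "/index.html"))

# Lookups on the numeric key use the index with integer comparisons.
row = conn.execute(
    "SELECT page FROM hits WHERE visitor_id = ?", (42,)
).fetchone()
```

The same principle applies in SQL Server: pick the narrowest type that fits the data, especially for columns you join or filter on.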
Also make sure you're doing all of the little things: closing your database connections, trimming requests before they're sent to SQL Server, and running the latest ODBC drivers to connect to the database.
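Those "little things" can be sketched in one small function. This example again uses sqlite3 as a hypothetical stand-in for an ODBC connection to SQL Server, and the `users` table and `lookup_user` helper are invented for illustration: the input is trimmed before it reaches the database, the query is parameterized, and the connection is always closed.

```python
import sqlite3

def lookup_user(db_path, raw_name):
    # Trim the request before it hits the database, and use a
    # parameterized query instead of string concatenation.
    name = raw_name.strip()
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute("SELECT id FROM users WHERE name = ?", (name,))
        return cur.fetchone()
    finally:
        conn.close()  # always release the connection back promptly
```

Leaked connections are a classic slow-creep problem: everything looks fine under light load, then the pool runs dry at peak traffic.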
There are so many little things you can do to SQL to increase performance. I currently run a database that's about 8 GB in size, receives about 300,000 INSERT statements a day, and is constantly being accessed to both insert and display data. It runs just as well now with hundreds of millions of records in the DB as it did when I had 10 records, and that's on just a dual 800 MHz machine.
Serving every request grabs an Apache instance from the pool and keeps it busy until the last byte is sent to the user. Spending server instances fully loaded with PHP/mod_perl/whatnot on serving static content is a giant waste of memory that could otherwise go to a larger disk cache and database buffers. And having caches and buffers large enough is often critical for good performance of DB-driven web apps.
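One common mitigation, sketched here with made-up paths, is to make sure static assets never tie up the heavyweight children for long: map them to a directory with no script handler and let mod_expires tell clients to cache them, so repeat requests don't come back at all.

```apache
# Illustrative fragment - paths are hypothetical; requires mod_alias
# and mod_expires to be loaded.
Alias /static/ "/var/www/static/"
<Directory "/var/www/static/">
    # No PHP/mod_perl here - just plain file serving.
    SetHandler None
    ExpiresActive On
    ExpiresDefault "access plus 1 hour"
</Directory>
```

A step further is serving static files from a separate lightweight server (or today, a reverse proxy/CDN), keeping the fat application processes for dynamic requests only.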