My sites have been getting larger, and there is obviously a bigger load being put on the server. Before I go to the considerable expense of a new server (it is hosted with one of the big guys and is a rather bigger proposition than some of the £60 servers I see these days), I wanted to check the load that both the web server and the SQL server are putting on the machine, and if possible see what SQL queries are running at a particular time, to make sure all the ASP code is closing everything properly etc.
Anyone got any tips on what to do?
As for checking your database code efficiency, look into SQL Profiler's Trace capabilities:
[developer.com...]
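If you just want a quick snapshot of what is hitting the database right now, without setting up a full trace, you can also run something like this from Query Analyzer (a rough sketch; the SPID below is just an example, substitute a real one from the sp_who2 output):

-- list current connections, their status and the command each is running
EXEC sp_who2

-- show the last batch a given connection sent
DBCC INPUTBUFFER (53)

Profiler will give you durations and reads per statement, though, which is what you really want for finding the expensive queries.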
Thanks for the Profiler link, I'll check it out. I have Remote Desktop access to the server, so I can do pretty much what I want.
Couple of questions...
How are you generally making your db connections?
What type of ADO cursors are you using?
Are you passing large recordsets or arrays down the wire back to the web server?
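The reason I ask is that the low-overhead pattern in classic ASP looks something like this (a sketch only; the connection string, table and column names are placeholders):

Dim conn, rs, rows
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=myServer;Initial Catalog=myDb;Integrated Security=SSPI"

Set rs = Server.CreateObject("ADODB.Recordset")
rs.Open "SELECT id, title FROM articles", conn, 0, 1   ' 0 = adOpenForwardOnly, 1 = adLockReadOnly

If Not rs.EOF Then rows = rs.GetRows()                  ' copy everything into a local array

rs.Close : Set rs = Nothing
conn.Close : Set conn = Nothing                         ' return the connection to the pool ASAP

' ...render from the rows array here, long after the connection is gone

GetRows copies the results into an array so you can close the connection before doing any rendering, and forward-only/read-only is the cheapest cursor on both the SQL box and the wire.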
I've seen something similar with two physical machines, where the entire site was db driven: a general connection was opened via DSN inside the site template and closed inside the closing template. However, just before closing the connection, the last bit of code wrote a simple user traffic footprint into an archive. Once that footprint archive table hit 1M rows, the application started to stall and eventually lose connections.
It got really bad when a crawler came along... the connections couldn't get back to the pool fast enough, so the application just kept opening more and more. Once the connection count hit 50, boom... no more connectivity. The only recovery was bouncing IIS or the box.
To alleviate the issue I ended up running a daily package to move the footprint data to another repository.
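For anyone hitting the same thing, the move itself can be as simple as a scheduled batch along these lines (table and column names are made up for the example; the real job ran as a daily package):

-- copy everything before midnight today into the archive, then trim the live table
BEGIN TRANSACTION

INSERT INTO footprint_archive (user_id, page, hit_time)
SELECT user_id, page, hit_time
FROM footprint
WHERE hit_time < CONVERT(char(8), GETDATE(), 112)

DELETE FROM footprint
WHERE hit_time < CONVERT(char(8), GETDATE(), 112)

COMMIT TRANSACTION

Keeping the live table small meant the per-page insert stayed fast and the connections went back to the pool quickly again.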