Do a search here for "memory leakage Access" and you'll find lots of information on why Access isn't a good database for websites.
Remember to close any connection objects that you open, or the connection will stay open after you exit the page, which can slow things down for everyone on the server.
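A minimal sketch of the close-and-release pattern at the end of an ASP page (the object names are illustrative, not from the original poster's code):

```vbscript
' Close and release ADO objects as soon as you are done with them.
' If you skip this, the connection stays open until the object is
' garbage-collected, tying up a pooled connection in the meantime.
objRS.Close
Set objRS = Nothing

objConn.Close
Set objConn = Nothing
```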
It also helps to Response.Write the SQL statement so you can see exactly what query is being sent to the database.
I don't know if this is connected, but lately I've noticed that when loading ASP sites, the icon on the tab (status bar) shows the page never stops downloading (even though it's viewable).
Not my sites, btw, but I was wondering whether that is caused by an unclosed database connection or perhaps something else?
I'm using Dreamweaver, so it generates the opening/closing code for me. It seemed fine about two weeks ago but is unbearable now; I've tried changing the query around, but it makes no difference.
However, on localhost it's very fast (instant); it's only slow on the ISP's server.
I use access databases on my site. They also fly on my local test bed system and slow dramatically when uploaded to the live site.
The principal reason for this is that I am using shared hosting - more than one website is hosted on the same machine. This slows down the response times of the sites, particularly for database access.
If you are on a dedicated server, you might try compacting the database.
Have you tried optimizing it?
You can use the Connection object to execute the query - it's faster than opening an explicit recordset. Also use the GetRows() or GetString() methods for faster display of large data sets.
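A rough sketch of the GetRows() pattern (object names are illustrative): pull the whole result set into an array, close the recordset straight away, then loop over the array instead of the live recordset.

```vbscript
' Execute via the Connection object and dump the results to an array.
Set objRS = objConn.Execute(strSQL)

If Not objRS.EOF Then
    arrRows = objRS.GetRows()   ' 2-D array: arrRows(column, row)
End If
objRS.Close                     ' release the recordset immediately
Set objRS = Nothing

' Display from the array - no database object held open while writing.
If IsArray(arrRows) Then
    For i = 0 To UBound(arrRows, 2)
        Response.Write arrRows(0, i) & "<br>"
    Next
End If
```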
Also look at buffering the HTML output better, flushing it in stages with Response.Flush(), and using the Response.IsClientConnected method to ensure that if the user stops the page loading, the script stops executing.
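A hedged sketch of that buffering pattern inside a long display loop (assumes buffering is on, which is the IIS 5+ default, and that the rows are already in an array):

```vbscript
Response.Buffer = True

For i = 0 To UBound(arrRows, 2)
    Response.Write arrRows(0, i) & "<br>"

    ' Every 100 rows, push what we have to the browser and bail
    ' out if the user has hit Stop or navigated away.
    If i Mod 100 = 0 Then
        Response.Flush
        If Not Response.IsClientConnected Then Response.End
    End If
Next
```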
12 recordsets seems like a lot to me. Is all this information dynamic, or is it just stored in the database? If it's relatively unchanging, semi-statically publish it to include files with a scheduled job, or hard-code it.
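One way to do the semi-static publishing - a scheduled script renders the query results to an include file, so the live page never touches the database (paths, file names and the query are illustrative):

```vbscript
' Run this from a scheduled job, not on every page view.
Set objFSO  = Server.CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.CreateTextFile(Server.MapPath("includes/stats.inc"), True)

Set objRS = objConn.Execute(strSQL)   ' the expensive query, run once
Do While Not objRS.EOF
    objFile.WriteLine "<li>" & objRS("reason") & "</li>"
    objRS.MoveNext
Loop

objFile.Close
objRS.Close
```

The live page then just does an SSI include of includes/stats.inc and serves static HTML.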
As aspdad says, using GetRows or GetString can greatly speed up your code. I'm not sure how good Dreamweaver-generated code is, but you could also check that it only opens the connection once, rather than once per recordset.
When Access starts having to deal with more than one user, you start heading for problems. SQL Server is a better option for scalability. HTH.
|SQL Server is a better option for scalability. |
Or MySQL, if you don't want to fork over the big bucks for a SQL Server license.
MySQL works very well on Windows with ASP.
Here is a typical query:
SELECT COUNT(site1.`CAT Ratio`) AS CAT1Fail
FROM site1
WHERE (site1.monthentry = MMColParam) AND (site1.CAT = '1') AND ((site1.reason LIKE 'Insufficient Time To Complete') OR (site1.reason = 'Fail'))
Not that taxing - the first thing I do is filter to only the month I need, which cuts out 75% of the records, leaving only about 800 records to look at. How else could I optimise this query?
aspdaddy said to use the 'connection object to execute the query' - how does that work?
That is going to slow things down - big time.
What I meant was that using
Set objRS = objConn.Execute(strSQL)
is more efficient than
objRS.Open strSQL, strConn, 3, 1
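For completeness, a hedged end-to-end sketch of the Execute pattern (the Jet connection string, paths and names are illustrative, and 3, 1 above are the adOpenStatic / adLockReadOnly constants):

```vbscript
Set objConn = Server.CreateObject("ADODB.Connection")
objConn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
             "Data Source=" & Server.MapPath("data/site.mdb")

' Execute returns a cheap forward-only, read-only recordset.
Set objRS = objConn.Execute(strSQL)

Do While Not objRS.EOF
    Response.Write objRS("CAT1Fail")
    objRS.MoveNext
Loop

objRS.Close: Set objRS = Nothing
objConn.Close: Set objConn = Nothing
```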
Why would you use a LIKE without a wildcard? In that case, just use "=".
If you can eliminate all the LIKEs, then make sure the text fields in your WHERE clause have indexes on them; otherwise you force a table scan.
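In Access you can add the indexes through the table's Design view, or with DDL like this (field names taken from the query above; the index names are just illustrative):

```sql
CREATE INDEX idxMonthEntry ON site1 (monthentry);
CREATE INDEX idxReason ON site1 (reason);
```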
Thanks for your reply. I changed the LIKE to = and indexed the relevant fields - it's faster now!