

What is causing our ecommerce server downtime?

IIS Server being brought to its knees

         

upwordz

2:10 pm on May 18, 2004 (gmt 0)

10+ Year Member



We are running IIS 5.0 on Windows 2000 Server with dual 2.4 GHz Xeon processors, 1 GB of RAM, and a 10 Mbit half-duplex network connection, on a dedicated box hosted with Verio.

Over the past year we upgraded to this machine because about once a month we would encounter downtime and severe lag, on and off, for minutes or hours at a time. Verio confirmed that traffic/bandwidth was not the problem.

We were able to see in Task Manager that our ecommerce application, SoftCart (from Mercantec), was eating up all the available RAM. The program runs on demand when a visitor uses the navigation on our site, starts a shopping cart, or checks out, and it usually processes each request and then closes. During times of severe lag, however, it appeared that the processes were not closing. Since we rewrote our URLs about two years ago, which removed the application from the URL, we use Perl scripts to talk to SoftCart, so for each instance of SoftCart we saw in Task Manager we also saw an instance of Perl.

One theory with some evidence to support it is that search engine bots are involved: we have noticed bots present during several of the lag periods. Since our site is dynamic and relies on SoftCart to access a flat file database of more than 6,000 products, we thought it was possible the bots were overwhelming our server. Whether it was SoftCart, Perl, or the processor we didn't know, so we upgraded to the server described above, but the problem has not gone away (though it is still possible that we need a faster server). We also had Verio test opening our connection to 100 Mbit full duplex, but the problem did not go away.

We have since instituted a robots.txt file to keep the robots from visiting our site, and for the past two days that has appeared to work. However, we obviously want to be a bot-friendly site, and Google has indexed most of our pages at this point, so at least they have found a way not to take our site down.

We are currently trying to find process monitoring software that will let us track memory allocation and running time for all the processes on our server. If anyone has software suggestions, or suggestions of any other kind, please let us know.
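In the meantime we have been sketching something of our own in Perl against WMI, along these lines (very rough and untested, and 'softcart.exe' is just our guess at the executable name, so adjust to whatever Task Manager shows):

use strict;
use Win32::OLE qw(in);

# Poll WMI every 30 seconds and log any SoftCart/Perl processes
# along with their working set sizes.
my $wmi = Win32::OLE->GetObject('winmgmts://./root/cimv2')
    or die 'Cannot connect to WMI: ' . Win32::OLE->LastError;

open my $log, '>>', 'C:\\proclog.txt' or die "Cannot open log: $!";

while (1) {
    my $procs = $wmi->ExecQuery(
        "SELECT Name, ProcessId, WorkingSetSize, CreationDate " .
        "FROM Win32_Process WHERE Name = 'softcart.exe' OR Name = 'perl.exe'");
    foreach my $p (in $procs) {
        printf $log "%s  %s  pid=%d  workingset=%dK  started=%s\n",
            scalar localtime, $p->{Name}, $p->{ProcessId},
            $p->{WorkingSetSize} / 1024, $p->{CreationDate};
    }
    sleep 30;
}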

txbakers

3:00 pm on May 18, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi and welcome to the WebmasterWorld!

flat file database of more than 6,000

I think that's your problem right there.

It might be time to install a real RDBMS like MySQL to handle the dynamic page building. Flat files are a real drag on a system.

upwordz

5:36 pm on May 18, 2004 (gmt 0)

10+ Year Member



Thanks Tx,

Our programming consultant has advised the same course of action, but for different reasons. When we approached our management team with the cost to implement it, we were politely shown the door. We have a very small staff with no experience in setting up MySQL, so we would have to have our consultant do all of that work.

We are hopeful that installing better performance logging will let us determine where exactly the bottleneck is occurring, because this problem appears to arise only when bots hammer through our dynamic pages and start activating our ecommerce application. We specifically buried our add-to-cart and show-cart links in JavaScript so we could hide our shopping cart application from the search engines, but the bots are finding their way in anyway.

Appreciate your thoughts Tx.

txbakers

5:49 pm on May 18, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



With what it sounds like you are doing, you can have MySQL up and running, with the database populated, in an hour or two.

You don't need a consultant to do it for you.

The bottleneck is most likely the flat file access. If you were to use Access as the database, you'd see an improvement as well.

But you will need to get rid of the flat file and put it into a database.

jpjones

5:53 pm on May 18, 2004 (gmt 0)

10+ Year Member



So basically SoftCart is killing the server due to its level of processing.

Have you looked at using some sort of caching system between the front end and the back end?
Questions to ask:
How often does the data change (minute/hourly/daily/weekly, etc.)?
How much of the SoftCart-generated content can be cached, e.g. general product information, but not the baskets?

Depending on how you have created the Perl script, and whether or not there is session data in the pages, could you adapt it to store each page it requests in a cache? Then, on each request, you check the cache to see if the page already exists. If so, simply read the page out to the browser; if not, pass the request through to SoftCart and store the output before passing it on to the browser.
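Something along these lines, as a rough, untested sketch (the cache directory and the run_softcart() hand-off are placeholders for however your script currently invokes SoftCart):

#!/usr/bin/perl
use strict;
use Digest::MD5 qw(md5_hex);

my $cache_dir = 'C:\\cache';
my $uri       = $ENV{REQUEST_URI} || '';

# Only cache cacheable pages: skip anything carrying basket/session data.
my $cacheable = $uri !~ /cart|checkout|session/i;
my $file      = "$cache_dir\\" . md5_hex($uri) . '.html';

print "Content-type: text/html\n\n";

if ($cacheable && -f $file && -M $file < 1) {    # reuse cached copy for up to a day
    open my $fh, '<', $file or die $!;
    print while <$fh>;
    close $fh;
} else {
    my $page = run_softcart($uri);
    if ($cacheable) {
        open my $fh, '>', $file or die $!;
        print $fh $page;
        close $fh;
    }
    print $page;
}

# Placeholder: replace with however your script currently calls SoftCart.
sub run_softcart {
    my ($request) = @_;
    return `softcart.exe "$request"`;    # assumption -- adapt to your setup
}

If the data changes often you would also clear the cache on each update; the -M check above just expires pages after a day.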

But as txBakers says, the most likely problem is the flat file. Every time you make a request, the application will have to read through the file, and that puts mucho LOAD on the server.
JP

upwordz

6:12 pm on May 18, 2004 (gmt 0)

10+ Year Member



Hey, thanks Tx and JP, we appreciate your thoughts a lot. Let me ask you guys: where would we go to get started with buying and setting up MySQL (and can MySQL import a tab-delimited txt file)? There would certainly be some programming required to get SoftCart to read from MySQL instead of its native flat file, wouldn't you say? And JP, do you use a separate program to set up the caching, or is that something IIS is capable of doing already? The database changes daily, sometimes multiple times per day.

On another note, we are thinking that we can stop robots from reaching SoftCart with:

User-agent: *
Disallow: /path/to/softcart

And that may get us by in the meantime.

Thanks again you guys.

DaveAtIFG

6:17 pm on May 18, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Might be worth looking at which bots are causing problems. MSN's bot is reported to be excessive on many sites, and its usefulness is questionable until MSN actually deploys a search engine.

txbakers

6:31 pm on May 18, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



where would we go to get started with buying and setting up MySQL (and can MySQL import a tab-delimited txt file)? There would certainly be some programming required to get SoftCart to read from MySQL instead of its native flat file, wouldn't you say?

You can go to [mysql.com...] and download the database. They changed their licensing deal: it used to be free across the board, but now they have a two-tiered system, and if it's for redistribution you might have to pay something. Just take a read through the terms.

Installation is fast, and there are tons of help files in most languages for getting Perl to connect to the database. I do it in ASP myself.
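And yes, MySQL can pull in a tab-delimited file directly. Roughly like this from Perl (treat it as a sketch: the database, table, and column names are just examples, and tab is already the default field separator for LOAD DATA, so no FIELDS clause is needed):

#!/usr/bin/perl
use strict;
use DBI;

# Connect to a hypothetical 'store' database on the local MySQL server.
my $dbh = DBI->connect('DBI:mysql:database=store;host=localhost',
                       'user', 'password', { RaiseError => 1 });

# One-shot load of the tab-delimited product file into a 'products' table.
$dbh->do(q{
    LOAD DATA LOCAL INFILE 'C:/products.txt'
    INTO TABLE products
    (sku, name, price, description)
});

$dbh->disconnect;

Once the table is there, your Perl scripts would swap their flat file reads for SELECT statements through the same DBI handle.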

upwordz

7:22 pm on May 18, 2004 (gmt 0)

10+ Year Member



Good point, Dave. We are pretty sure that MSN and Yahoo were both hitting us at the same time during our last downtime event, and that Google has found a friendly way to work with our server. It is also interesting to note that Google hasn't been able to get ALL our pages, only a count that continues to slowly creep up as the weeks and months go by.

And Tx, we are downloading all the materials now. Still wondering how much deep water lies between SoftCart and MySQL, and how much Perl development we will need to do to make all the magic happen.

Appreciate all the help!

upwordz

1:40 pm on May 19, 2004 (gmt 0)

10+ Year Member



We did some more thinking on MySQL. At least from a setup standpoint, we will have to update a dozen or so scripts which rely on our flat file. Before we rush in that direction, we are going to write a test script which measures the time searches take on our flat file and compares that against smaller and smaller copies (a rough sketch of the timing harness is at the end of this post). Depending on how much of a performance hit we see, that will speak to the urgency of moving to a faster-performing product file. We are also testing a bot script we are writing to see if we can simulate the server lag.

We still want to put process monitoring in place on the server to log process times, memory allocation, etc., but we are having problems getting anything set up. We tried:

[sentry-go.com...]
[hyperic.net...]
[wavelengthtech.com...]
[snapfiles.com...]

But each one has had its own complications. If anyone has experience setting up something like this, please let us know. We appreciate all your thoughts and ideas!
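Here is roughly the timing harness we have in mind, for anyone following along (the file names and search term are placeholders; each file is just a truncated copy of our product file):

#!/usr/bin/perl
use strict;
use Benchmark qw(timeit timestr);

my @copies = ('products_full.txt', 'products_half.txt', 'products_tenth.txt');
my $term   = 'widget';   # a sample search term

# Time 100 full scans of each copy, the way a flat file lookup has to
# read through the whole file on every request.
foreach my $file (@copies) {
    my $t = timeit(100, sub {
        open my $fh, '<', $file or die "$file: $!";
        my $hits = grep { /\Q$term\E/i } <$fh>;
        close $fh;
    });
    print "$file: ", timestr($t), "\n";
}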