Over the past year we upgraded to this machine because about once a month we would encounter downtime and severe lag, off and on, for minutes or hours at a time. Verio confirmed that traffic/bandwidth was not the problem.
We were able to see in Task Manager that our ecommerce application, SoftCart (from Mercantec), was eating up all the available RAM. The program runs on demand when a visitor uses the navigation on our site, starts a shopping cart, or checks out, and it usually processes each request and then closes. During the periods of severe lag, however, the processes were not closing. Since we rewrote our URLs about two years ago to remove the application from the URL, we use Perl scripts to talk to SoftCart, so for each instance of SoftCart we saw in Task Manager we also saw an instance of Perl.

One theory with some evidence behind it: we have noticed search engine bots present during several of the lag periods. Since our site is dynamic and relies on SoftCart to access a flat file database of more than 6,000 products, we thought it was possible that the bots were overwhelming our server. Whether the culprit was SoftCart, Perl, or the processor, we didn't know, so we upgraded to the server described above. The problem has not gone away (though it is still possible that we need a faster server). We also had Verio test opening our connection to 100 Mbit full duplex, but that did not help either.

We have since put a robots.txt file in place to keep robots from visiting our site, and for the past two days that appears to have worked. However, we obviously want to be a bot-friendly site, and Google has indexed most of our pages at this point, so at least they have found a way not to take our site down.
We are currently trying to find process management software that will let us track memory allocation and running time for all the processes on our server. If anyone has software suggestions, or suggestions of any other kind, please let us know.
Our programming consultant has advised the same course of action, but for different reasons. When we approached our management team with the cost to implement it, we were politely shown the door. We have a very small staff with no experience setting up MySQL, so we would have to have our consultant do all of that work.
We are hopeful that installing better performance logging will let us determine exactly where the bottleneck is occurring, because the problem appears to happen only when bots hammer through our dynamic pages and start activating our ecommerce application. We specifically buried our "add to cart" and "show cart" links in JavaScript so we could hide the shopping cart application from the search engines, but they are finding their way in anyway.
Appreciate your thoughts. Tx.
You don't need a consultant to do it for you.
The bottleneck is most likely the flat file access. If you were to use Access as the database, you'd see an improvement as well.
But you will need to get rid of the flat file and put it into a database.
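Here's a rough Perl sketch of why the flat file hurts (this is not SoftCart's actual code -- the file layout, table, and column names are just guesses): the flat-file lookup has to scan through the records on every request, while a database lookup on an indexed SKU touches only the matching row.

#!/usr/bin/perl
# Sketch only: flat-file scan vs. indexed database lookup.
use strict;
use warnings;
use DBI;

# Flat file: every request reads through the file until it finds the SKU.
sub find_product_flatfile {
    my ($sku) = @_;
    open my $fh, '<', 'products.txt' or die "open products.txt: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my @fields = split /\t/, $line;    # tab-delimited, SKU in column 1
        return \@fields if $fields[0] eq $sku;
    }
    return;
}

# Database: the index finds the row directly, no scanning.
sub find_product_db {
    my ($dbh, $sku) = @_;
    return $dbh->selectrow_hashref(
        'SELECT * FROM products WHERE sku = ?', undef, $sku);
}

my $dbh = DBI->connect('DBI:mysql:database=shop', 'user', 'password',
                       { RaiseError => 1 });
my $product = find_product_db($dbh, 'ABC-123');    # hypothetical SKU
$dbh->disconnect;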
Have you looked at using some sort of caching system between the front end and the back end?
Questions to ask:
How often does the data change (every minute/hourly/daily/weekly, etc.)?
How much of the SoftCart-generated content can be cached (e.g. not the baskets, but general product information)?
Depending on how you have created the Perl script, and whether or not there is session data in the pages, could you adapt it to store each page it requests in a cache? Then, on each request, you check the cache to see if the page already exists. If it does, simply read the page out to the browser. If not, put the request through to SoftCart and store the output before passing it through to the browser.
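In Perl the skeleton might look something like this (just a sketch -- render_via_softcart is a made-up stand-in for however your wrapper script invokes SoftCart today, and you'd only want this on pages with no session or basket data in them):

#!/usr/bin/perl
# Sketch of a page cache sitting in front of SoftCart.
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

my $cache_dir = '/var/cache/pages';    # assumed location

# Stand-in for whatever your wrapper script does to invoke SoftCart.
sub render_via_softcart {
    my ($url) = @_;
    return "<html>generated page for $url</html>";
}

sub fetch_page {
    my ($url) = @_;
    my $cache_file = "$cache_dir/" . md5_hex($url) . '.html';

    # Cache hit: serve the stored copy -- SoftCart never runs.
    if (-e $cache_file) {
        open my $fh, '<', $cache_file or die "read cache: $!";
        local $/;                       # slurp the whole file
        return scalar <$fh>;
    }

    # Cache miss: let SoftCart build the page, store it, then serve it.
    my $html = render_via_softcart($url);
    open my $fh, '>', $cache_file or die "write cache: $!";
    print {$fh} $html;
    close $fh;
    return $html;
}

print fetch_page('/product/ABC-123');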
But as txBakers says, the most likely problem is the flat file. Every time you make a request, the application has to read through the whole file, and that puts mucho load on the server.
JP
On another note, we are thinking that we can stop robots from reaching SoftCart with:
User-agent: *
Disallow: /path/to/softcart
And that may get us by in the meantime.
Thanks again you guys.
Where would we go to get started with buying and setting up MySQL? (Can MySQL import a tab-delimited text file?) There would certainly be some programming required to get SoftCart to read from MySQL instead of its native flat file, wouldn't you say?
You can go to [mysql.com...] and download the database. They changed their licensing deal - it used to be free, but now they have a two-tiered system. If it's for redistribution you might have to pay something. Just take a read through.
Installation is fast. There are tons of help files available in most languages to get Perl to connect to the database. I do it in ASP.
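And to answer the import question: yes -- MySQL loads tab-delimited text natively with LOAD DATA INFILE (tab and newline are actually its default separators), or you can use the mysqlimport command-line tool. A quick Perl/DBI sketch, with the database, table, and column names as placeholders for whatever your flat file really holds:

#!/usr/bin/perl
# Sketch: create a table and bulk-load a tab-delimited flat file into MySQL.
use strict;
use warnings;
use DBI;

# mysql_local_infile=1 lets the client send a local file to the server.
my $dbh = DBI->connect(
    'DBI:mysql:database=shop;host=localhost;mysql_local_infile=1',
    'user', 'password', { RaiseError => 1 });

# A table shaped like the flat file -- these columns are guesses.
$dbh->do(q{
    CREATE TABLE IF NOT EXISTS products (
        sku         VARCHAR(32) PRIMARY KEY,
        name        VARCHAR(255),
        price       DECIMAL(10,2),
        description TEXT
    )
});

# One statement loads the whole file; tab and newline are the defaults,
# but spelling them out makes the intent clear.
$dbh->do(q{
    LOAD DATA LOCAL INFILE 'products.txt'
    INTO TABLE products
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n'
});

$dbh->disconnect;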
And Tx, we are downloading all the materials now. Still wondering how much deep water lies between SoftCart and MySQL, and how much Perl development we will need to do to make all the magic happen.
Appreciate all the help!
Here are the monitoring tools we have been looking at:
[sentry-go.com...]
[hyperic.net...]
[wavelengthtech.com...]
[snapfiles.com...]
But each one has had its own complications. If anyone has had experience with setting up something like this, please let us know. We appreciate all your thoughts and ideas!