"Response to client clicks should be no slower than 5 seconds maximum. Server request-to-response times should be recorded accurately, and presented on a regular basis."
Basically, I'd like to create a monitor for this using PHP, as I imagine it's probably the best language for the job. I would like the script(s) to work in two ways:
1. Log the date, time and IP of each request-to-response event in a file (a simple database or even just a text file). This log file would then be interrogated, at regular intervals, to create the required report.
2. Show these stats, live, in a back-office page of the main site (or maybe even from the front, as one of those "current active users" blurbs).
Could anyone shed some light on which script tags I should be working with for this? All comments greatly appreciated - deadline for this feature is quite tight...
There is some excellent feedback on how to track the running time of scripts and the response times of the server.
Here is a direct link to the knowledge base: [php.net...]
You might also want to look at exec() so you can "ping" the client at least once for connection speed. A 5-second response time requirement is nice, but what about latency that you should not be responsible for? If I have a 14.4K connection, your latency can be as high as 2 or even 3 seconds... And what about traffic jams between the visitor and the server?
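The exec()-plus-ping idea could be sketched roughly as below. This assumes a Unix host where the `ping` command is available and requires PHP 7+ for the `??` operator; the variable names, the localhost fallback, and the regex for ping's output format are illustrative assumptions, not anything specified in the thread.

```php
<?php
// Ping the visitor's IP once and pull the round-trip time out of the
// output, so server-side timings can be put in context.
$ip = $_SERVER['REMOTE_ADDR'] ?? '127.0.0.1'; // fallback for CLI testing

// -c 1: send a single packet; -W 2: give up after two seconds.
exec('ping -c 1 -W 2 ' . escapeshellarg($ip), $output, $status);

$latencyMs = null;
if ($status === 0) {
    foreach ($output as $line) {
        // Typical ping output contains a fragment like "time=12.3 ms".
        if (preg_match('/time=([\d.]+)\s*ms/', $line, $m)) {
            $latencyMs = (float) $m[1];
            break;
        }
    }
}
```

If ping fails (firewalled client, no ping binary), `$latencyMs` stays null and the request can still be logged without the latency column.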
I use Mozilla under Linux for development. Using microtime to time my scripts (db access, etc), the times would be sub-second, but it would take seconds to render in Mozilla.
Using other faster browsers (IE, Opera, Konq) would make the site really responsive.
The project brief is actually quite vague, and while I have tried to get firmer information about exactly which response time they need, none of the contractees actually seems to know anything about it(!!!).
All of which is very irritating, and proving to be a major headache to spec at this end.
Basically, for the moment I'm only focussing on server-side processing times, as I'm certain the client doesn't understand what they're really asking for outside of this.
Each page in this new project will rely on dynamic template header and footer includes. I was thinking that I could create a counter in the header, tag it with the session ID, insert the current session details (e.g. YY/MM/DD, HH:MM, URL, Referer, etc.) into a text file (or similar), then immediately start the counter.
Then the rest of the page is displayed and the footer is loaded, which includes a stop command for the counter. The counter result is then matched to the information recorded by the header (date, time, etc.) and appended to the relevant entry.
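The header/footer counter just described could be sketched as follows. The log path and variable names are placeholders, `usleep()` stands in for the real page body, and `??` assumes PHP 7+; it's a minimal sketch, not the poster's actual code.

```php
<?php
// Very first line of the header include: start the counter.
$timerStart = microtime(true);

// ... template body renders here ...
usleep(100000); // pretend the page takes ~0.1s to build

// Very last line of the footer include: stop the counter and append
// the elapsed time (rounded to tenths of a second) to the log.
$elapsed = microtime(true) - $timerStart;
$entry = sprintf(
    "%s\t%s\t%.1f\n",
    date('y/m/d H:i'),
    $_SERVER['REQUEST_URI'] ?? '-',
    $elapsed
);
file_put_contents('response.log', $entry, FILE_APPEND | LOCK_EX);
```

`LOCK_EX` keeps concurrent requests from interleaving half-written lines in the log, which matters under exactly the heavy load asked about below.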
1. Would this be worth doing, with regards to how much it might slow down the server under heavy load?
2. (This may be redundant, but) How accurate would this be when considering heavy server load/stress?
Just as a point of reference regarding accuracy, I was thinking of setting the timer to record tenths of a second (rather than milliseconds).
A counter logger at the very top of the page, and a counter logger at the very bottom of the page.
Then compare the two. Sometimes you will see major differences (5%-20%) between the two counters. It is a very good indication of box speed. I usually use an SSI counter, as it will be the most accurate. A graphic counter will not work because browsers don't download graphics in the order they appear on the page.
I've also tried putting a graphic at the very top of a graphics-intensive page and one at the very bottom, then comparing the pull times for all the graphics on the page (find the lowest, then find the highest).
As for specific script tags, just general PHP will do.
- Read the environment variables (referrer, time, browser, page viewed, etc.).
- Append data to the disk log file.
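The two steps above might look something like this in plain PHP, reading the request environment from the `$_SERVER` superglobal and appending one tab-separated line to a disk log. The field choices and file name are illustrative assumptions; `??` requires PHP 7+.

```php
<?php
// Step 1: read the environment variables for this request.
$fields = [
    date('Y-m-d H:i:s'),                  // time
    $_SERVER['REMOTE_ADDR']     ?? '-',   // visitor IP
    $_SERVER['REQUEST_URI']     ?? '-',   // page viewed
    $_SERVER['HTTP_REFERER']    ?? '-',   // referrer
    $_SERVER['HTTP_USER_AGENT'] ?? '-',   // browser
];

// Step 2: append the line to the disk log, under an exclusive lock so
// concurrent requests don't interleave partial lines.
$fh = fopen('access.log', 'ab');
if ($fh !== false && flock($fh, LOCK_EX)) {
    fwrite($fh, implode("\t", $fields) . "\n");
    flock($fh, LOCK_UN);
    fclose($fh);
}
```

A weekly report script can then split each line on tabs and aggregate however the "back office" needs.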
It's nice to know I'm not (completely?) bonkers.
The actual time taken won't need to be made public, although I do like the graphics idea - a very nice visual yardstick I think. It just needs to be recorded for "back office"-style performance reports each week/month.
So I will go with the header/footer idea. I'm assuming of course that my "startCounter" and "stopCounter" need to be the very first, and very last, line of the header and footer respectively, to garner the most accurate results?
For those of you interested, a colleague and I have completed the code to record request processing times. If you're interested, you can sticky mail [webmasterworld.com] me for the code. I (obviously) won't provide support for this, but you're welcome to muddle through it at your leisure.
Nothing could be easier if you use MSIE and have some C++ programming skills.
MSIE fires events when the page is requested and when the transfer finishes; take timestamps at each and you are done.
How useful would this kind of client-side tool be?
One other way to demonstrate the performance of a site but eliminate client factors is to place page requests from a variety of different locations at about the same time, and track the time to deliver the page. I used a free service at www.tracert.com to demonstrate to a web host that their server had a problem. They speculated that slow click response times were due to some weird problem on my end - my ISP, my PC, my browser, etc. (Naturally, they alleged that mine was the ONLY site experiencing any problems.) When I documented that pages were slow (or timed out) from ten different locations around the country, they got serious and fixed the problem. The same test, of course, could be used to demonstrate that a site is working properly. I don't think this is exactly what NetGrease's client is looking for, but it's a nice "real world" test that could be used to support the locally calculated load times.