|% of visitors not accepting cookies|
| 5:21 pm on Aug 17, 2004 (gmt 0)|
I installed the new WebTrends today.
The report shows that 60% of our visitors don't accept cookies.
This is bad for us, since we run a cookie-dependent ColdFusion site.
Does anyone know what percentage of their visitors don't accept cookies, or what the industry norms are?
| 5:32 pm on Aug 17, 2004 (gmt 0)|
That sounds too low.
There is little reason not to accept cookies; even the text-based Lynx browser can use them.
You can always have your code insert a hidden session ID into any/all forms and links, but that's problematic when someone cuts and pastes a URL to a friend.
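For what it's worth, the URL-rewriting idea can be sketched in a few lines (Python here purely for illustration; the CFID parameter name mimics ColdFusion's session ID, but treat all names as placeholders):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def add_session_id(url, session_id, param="CFID"):
    """Append a session-ID parameter to a URL so the session survives
    without cookies. Parameter name is a ColdFusion-style placeholder."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    query.append((param, session_id))
    return urlunparse(parts._replace(query=urlencode(query)))
```

The downside is exactly what was mentioned: the session ID travels with the URL, so a pasted link carries someone else's session.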
| 8:09 pm on Aug 17, 2004 (gmt 0)|
That sounds like a very high number to me. The first thing to do is separate real human beings from automated agents, and then worry about what percentage of the humans accept cookies.
Search engines and other robots usually do not accept cookies, so if your site is set up to issue a cookie to anyone who does not already have one, each page request made by an automated agent may result in your site trying to set a new cookie.
Has your site been crawled heavily by search engines or other robots lately? Do you have any monitors that hit your site on a regular basis?
You can either get a list of known robots and try to remove them from your log files, or see if you can specify a report by user agent and rerun it for just the IE and Netscape user agents.
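A quick-and-dirty log filter along those lines might look like this (Python sketch; the robot signature list is just a sample, real lists are far longer, and it assumes a combined log format with the user agent as the last quoted field):

```python
import re

# Sample robot user-agent substrings -- a real list would be much longer.
ROBOT_SIGNATURES = ["googlebot", "slurp", "msnbot", "crawler", "spider"]

def is_robot(user_agent):
    ua = user_agent.lower()
    return any(sig in ua for sig in ROBOT_SIGNATURES)

def filter_human_lines(log_lines):
    """Keep only log lines whose user-agent field matches no robot signature."""
    human = []
    for line in log_lines:
        # Combined log format: the user agent is the last quoted field.
        quoted = re.findall(r'"([^"]*)"', line)
        ua = quoted[-1] if quoted else ""
        if not is_robot(ua):
            human.append(line)
    return human
```

Run the cookie report against the filtered log and the numbers should get a lot closer to reality.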
| 8:39 pm on Aug 17, 2004 (gmt 0)|
I generally don't accept cookies when surfing unless I know it's a site I really want to see. If a site doesn't work because I won't accept cookies, I leave.
| 10:37 pm on Aug 17, 2004 (gmt 0)|
It seems really high to me, too.
I was hoping to get statistics from other webmasters to see how we compare. We get a lot of spider activity, but I wish we could see the true piece of the pie.
| 10:40 pm on Aug 17, 2004 (gmt 0)|
I think it's mostly that WebTrends has done a misleading job of labeling in its reports. If 100% of the pages in a visit have no cookie, then WT says the visitor doesn't accept cookies. Makes sense, on the surface. But if a visit consists of only one page view, and it's a first-time visitor, then that visit consists of a single page with no cookie on it, even if that visitor's browser does accept cookies. And WebTrends, for some bizarre reason, has decided to label all such instances "not accepting cookies." What WebTrends should do is label them "cookie status unknown."
I don't know if other traffic stats programs have this same crazy approach.
So, to get a better idea of what's going on, look at the number of single-page visits you get, the number of first-time visits, and the number of known spiders. If they're really high, that could account for a lot of your so-called cookieless visits.
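One way to sanity-check that is to count the share of single-page visitors in your own data. A rough sketch (assuming you can reduce your log to (visitor_id, page) pairs; the reduction itself is left out):

```python
from collections import Counter

def single_page_visit_share(page_views):
    """page_views: list of (visitor_id, page) tuples for one period.
    Returns the fraction of visitors with exactly one page view --
    the visits WebTrends may be mislabeling as 'not accepting cookies'."""
    counts = Counter(visitor for visitor, _ in page_views)
    singles = sum(1 for n in counts.values() if n == 1)
    return singles / len(counts) if counts else 0.0
```

If that fraction is anywhere near 60%, the mislabeling theory pretty much explains the report.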
| 11:15 pm on Aug 17, 2004 (gmt 0)|
|The report shows that 60% of our visitors dont accept cookies. |
Eh, good old WebTrends. Anyway, here is what is likely happening, a combination of a "feature" in the way cookies work and a possible flaw in how WT accounts for it:
1) Client requests the FIRST page - no cookies are sent with the request unless they were set earlier.
2) Server logs the request - since no cookies were sent, none are logged.
3) Server returns the response with a newly created cookie (i.e. a session cookie).
4) Client drops off without making any more page requests (typical for the front page).
So, while the first page will be logged without a cookie present, the requests for the embedded objects on that page (images/CSS) SHOULD have cookies logged, as the cookie was set in 3) and should be sent by the client on all subsequent requests. WebTrends likely ignores those, as they are not page views.
Now, the cookies should be in the log entries for the images, and those can be traced back via the referer to the original page request. However, I suspect (and it's too late to verify right now) that WebTrends ignores cookies set on image requests and just satisfies itself that the original first request has no cookie, thus counting this "non-cookied" request toward "non-cookied" people.
This is a theory that I will try to verify tomorrow; feel free to find a hole in it and save me some time :)
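To make the four steps concrete, here is a toy simulation (Python, purely illustrative; SESSIONID is a made-up cookie name) showing why the first page view can never carry the cookie:

```python
def simulate_first_visit():
    """Walk the steps above: the cookie is set in the response (step 3),
    so it can only appear on requests made AFTER the first page."""
    log = []
    client_cookies = {}

    def request(url):
        # Server logs the cookies sent WITH the request (steps 1-2).
        log.append((url, dict(client_cookies)))
        # Step 3: the response sets a session cookie if none exists yet.
        client_cookies.setdefault("SESSIONID", "abc123")

    request("/index.cfm")        # first page: logged with no cookie
    request("/images/logo.gif")  # embedded object: cookie now present
    return log
```

So the cookie evidence is sitting in the image hits, exactly where a page-views-only report never looks.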
| 5:47 pm on Aug 18, 2004 (gmt 0)|
If the images on the home page do get cookied, and if logging for images is not turned off, then there's a way around this: if one of the images occurs only on the home page, it can be substituted for the home page in one of your reports. Just do a search-and-replace of some kind on the log, changing the name of that graphics file to "home-graphic.html", and run the report with the real URL of the home page filtered out. (If you don't filter out the home page, then WebTrends will still see the first hit as being cookieless and you'll get the same old results.)
This probably should be done in a completely separate report just to get better numbers for cookie-disabled visitors, because of course you want to continue using the home page's referrer field info.
This should work, shouldn't it?
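That search-and-replace could look something like this (Python sketch; the file names are hypothetical placeholders, and matching on "GET / " to identify home-page hits is an assumption about the log format):

```python
def relabel_home_graphic(log_lines,
                         graphic="/images/home-banner.gif",
                         fake_page="/home-graphic.html",
                         home_marker='"GET / '):
    """Rewrite a home-page-only graphic as a page view and drop the real
    home-page hits, so the report picks up the cookie from the graphic
    request instead of the cookieless first page request."""
    out = []
    for line in log_lines:
        if home_marker in line:
            continue  # filter out the real home page
        out.append(line.replace(graphic, fake_page))
    return out
```

Then feed the rewritten log to a separate report, keeping the normal report for referrer info as noted above.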