Search engines and other robots usually do not accept cookies, so if your site is set up to issue a cookie to any visitor who does not already have one, each page request made by an automated agent may result in your site trying to set a new cookie.
Has your site been crawled heavily by search engines or other robots lately? Do you have any monitors that hit your site on a regular basis?
You can either get a list of known robots and try to remove them from your log files, or see if you can filter the report by user agent and then rerun it for just the IE and Netscape user agents.
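As a rough sketch of the first option, here is how you might strip known-robot requests out of a combined-format access log before feeding it to the stats program. The `BOT_MARKERS` list is a hypothetical starter set, not an authoritative robot list; you would extend it from whatever list of known robots you obtain.

```python
import re

# Hypothetical starter list of substrings that identify known robots;
# extend it from a real robot/user-agent list.
BOT_MARKERS = ["Googlebot", "Slurp", "msnbot", "spider", "crawler"]

def is_robot(user_agent):
    """Return True if the user-agent string matches a known robot marker."""
    ua = user_agent.lower()
    return any(marker.lower() in ua for marker in BOT_MARKERS)

def filter_log(lines):
    """Keep only log lines whose user agent is not a known robot.
    Assumes NCSA combined log format, where the user agent is the
    last quoted field on the line."""
    kept = []
    for line in lines:
        quoted = re.findall(r'"([^"]*)"', line)
        ua = quoted[-1] if quoted else ""
        if not is_robot(ua):
            kept.append(line)
    return kept
```

Rerunning the report on the filtered file should tell you how much of the "cookieless" traffic was really robots.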
I don't know if other traffic stats programs have this same crazy approach.
So, to get a better idea of what's going on, look at the number of single-page visits you get, the number of first-time visits, and the number of known spiders. If they're really high, that could account for a lot of your so-called cookieless visits.
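If your stats package won't break out single-page visits directly, a quick approximation from the raw log might look like the sketch below. It groups page views by host, which is a simplification (real visit tracking also uses time windows and user agents); the input format is assumed, not anything WebTrends produces.

```python
from collections import Counter

def single_page_visits(records):
    """Count hosts that requested exactly one page.
    `records` is a list of (host, path) tuples already filtered down
    to page views (no images/CSS). Grouping purely by host is a
    simplification of real visit tracking."""
    pages_per_host = Counter(host for host, _ in records)
    return sum(1 for count in pages_per_host.values() if count == 1)
```

A high single-page-visit count here would support the idea that one-and-done visitors (and robots) are inflating the cookieless figure.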
The report shows that 60% of our visitors don't accept cookies.
eh, good old WebTrends. Anyway, here's what is likely happening: a combination of a "feature" in the way cookies work and a possible flaw in how WT handles it:
1) Client requests the FIRST page - no cookies are sent with the request unless they were set before
2) Server logs the request - since no cookies were sent, none are logged
3) Server returns the response with a newly created cookie (e.g. a session cookie)
4) Client drops off without making any more page requests (typical for a front page)
So, while the first page itself is logged without a cookie present, the requests for embedded objects on that page (images/CSS) SHOULD have cookies logged, since the cookie was set in the response at step 3) and would be sent by the client with all subsequent requests - including the image requests for that same page. WebTrends is likely to ignore those as they are not page views.
Now, those cookies should appear in the log entries for images, and those entries can be traced back to the original page request via the referer field. However, I suspect (it's too late to verify right now) that WebTrends ignores the cookies on image requests and just satisfies itself that the original first request had no cookie set, thus counting that "non-cookied" request towards the "non-cookied" visitors.
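To test that suspicion, you could cross-reference the log yourself. The sketch below assumes log entries have already been parsed into dicts with `path`, `referer`, and `cookie` keys (a hypothetical stand-in for a combined-log parser with cookie logging enabled); it finds page requests that were logged cookieless but whose embedded images DID carry a cookie - exactly the visitors WebTrends would be miscounting.

```python
# Extensions treated as embedded objects rather than page views (assumption).
IMAGE_EXTS = (".gif", ".jpg", ".png", ".css")

def miscounted_cookieless_pages(entries):
    """Return page paths logged without a cookie whose embedded
    objects (matched back by referer) DID carry a cookie.
    `entries` is a list of dicts: {"path", "referer", "cookie"}."""
    cookieless_pages = {e["path"] for e in entries
                        if not e["path"].endswith(IMAGE_EXTS) and not e["cookie"]}
    confirmed = set()
    for e in entries:
        if e["path"].endswith(IMAGE_EXTS) and e["cookie"]:
            if e["referer"] in cookieless_pages:
                confirmed.add(e["referer"])
    return confirmed
```

If that set comes back large, the theory holds: the visitors accepted the cookie fine, and only the first request lacked one.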
This is a theory that I will try to verify tomorrow; feel free to find a hole in it and save me some time :)
This should probably be done in a completely separate report, just to get better numbers for cookie-disabled visitors, because of course you want to keep using the home page's referrer field info in the main report.
This should work, shouldn't it?