However, I am trying to get an estimate of how big an effect caching really has.
Having spent several hours searching with Google, I came up with a couple of articles:
[internetnews.com...]
and
[pint.com...]
The distilled wisdom would appear to be in this passage:
For instance, according to ABCi, caching and Web spider activity can cause logfile figures to be as much as 30 percent off, if publishers aren't using one of the major analysis or auditing services. (Jupiter, similarly, reports that caching can underreport site hits by as much as 45 percent.)
There is another commercial reference,
[bellacoola.com...]
which suggests a figure of 40% for cached traffic.
Question - Does the figure of around 30% to 40% for cached traffic sound reasonable?
Question - Does anyone have a better idea or source?
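For a rough, site-specific sanity check, one thing you can try is counting 304 (Not Modified) responses in your own access log; those are requests a cache revalidated rather than refetched in full. It only gives a lower bound, since a fully cached hit never reaches the server at all. A minimal sketch, assuming an Apache common/combined log format (the log filename is whatever you pass on the command line):

```python
# Rough lower-bound estimate of cache-influenced traffic from an
# Apache-style access log. 304 responses are requests a cache
# revalidated instead of refetching; fully cached hits never appear
# in the log at all, so the real undercount is higher than this.
import re
import sys

# Matches the status-code field right after the quoted request line,
# e.g. ... "GET /page HTTP/1.1" 304 -
STATUS_RE = re.compile(r'"\s(\d{3})\s')

total = 0
revalidated = 0
with open(sys.argv[1]) as log:
    for line in log:
        m = STATUS_RE.search(line)
        if not m:
            continue
        total += 1
        if m.group(1) == "304":
            revalidated += 1

if total:
    print(f"{revalidated}/{total} responses were 304 revalidations "
          f"({100 * revalidated / total:.1f}%)")
```

Run it as `python count304.py access.log`. Again, this only catches conditional revalidations, so treat the percentage as a floor, not the full caching figure.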
That passage is clearly written for a tech article :)
However, caches don't cache nearly as much as they used to since the widespread adoption of the repressive HTTP 1.1 specification on caching. It rewrote the rules of caching into a confusing cluster of obtuse and often contradictory rules.
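The flip side is that HTTP 1.1 also gave publishers explicit knobs to keep pages out of caches entirely, so that every view reaches the logfile. A minimal sketch using Python's standard http.server; the Cache-Control, Pragma, and Expires header values are standard HTTP 1.1 / 1.0 anti-caching directives, but the handler and port here are made up for illustration:

```python
# Minimal sketch: serve a page with anti-caching headers so proxy
# caches pass every request through to the origin server (and its log).
from http.server import BaseHTTPRequestHandler, HTTPServer

class NoCacheHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        # HTTP 1.1: forbid caches from storing or reusing this response
        self.send_header("Cache-Control", "no-store, no-cache, must-revalidate")
        # Belt-and-braces for older HTTP 1.0 caches
        self.send_header("Pragma", "no-cache")
        self.send_header("Expires", "0")
        self.end_headers()
        self.wfile.write(b"<html><body>Every view of this page hits the log.</body></html>")

if __name__ == "__main__":
    HTTPServer(("", 8080), NoCacheHandler).serve_forever()
```

Of course, forcing everything uncacheable trades bandwidth for accurate counts, which is exactly the tension the HTTP 1.1 rules were trying to referee.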