Welcome to WebmasterWorld
So I wrote a similar logging system using PHP and MySQL.
It captures all that missing traffic, but it has an unexpected downside: when I compare the PHP logs to the JS logs, I see that the PHP logs miss all repeated page visits from the same visitor.
I realised that this is probably because those pages are in the client browser's cache - and so no request to my server is made, and so no PHP log entry is made.
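For reference, a minimal sketch of such a PHP/MySQL logger — the `access_log` table, its columns, and the PDO handle are placeholders here, not the poster's actual schema:

```php
<?php
// Build the log row from PHP's server variables. Kept as a pure
// function so it can be exercised without a database connection.
function build_log_row(array $server): array {
    return [
        'ip'       => $server['REMOTE_ADDR']  ?? '',
        'page'     => $server['REQUEST_URI']  ?? '',
        'referrer' => $server['HTTP_REFERER'] ?? '', // may be stripped by proxies
    ];
}

// Insert one row; assumes a placeholder table such as:
//   CREATE TABLE access_log (ip VARCHAR(45), page TEXT,
//                            referrer TEXT, ts DATETIME);
function log_request(PDO $db, array $row): void {
    $stmt = $db->prepare(
        'INSERT INTO access_log (ip, page, referrer, ts)
         VALUES (:ip, :page, :referrer, NOW())'
    );
    $stmt->execute($row);
}
```

Because the insert only runs when the request actually reaches the server, a cached page view produces no row — which is exactly the gap described above.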
Is there a nice way around this?
Thanks for any light you can shed.
There are a couple of meta tags that can be useful in preventing client-side caching:
<META HTTP-EQUIV="CACHE-CONTROL" CONTENT="NO-CACHE">
<META HTTP-EQUIV="PRAGMA" CONTENT="NO-CACHE">
should prevent the browser from caching the page. While it's not foolproof, you should see an improvement in your numbers.
Thanks for your response Chad.
Presumably, the downside to this is that my visitors lose the benefits of caching and see a slower browsing experience, and my server sees more load and bandwidth.
Yeah, I know I want to have my cake and eat it, but you can't blame a guy for trying.
Not all that spooky. It's probably some proxy filtering it out of the HTTP requests; plenty of so-called "privacy", "anti-spam", and "anti-virus" programs like to do that kind of thing. Since JS executes directly in the browser, it still has access to the referrer.
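One practical consequence: since the JS tracker can still read `document.referrer`, it can hand the value back to the server, e.g. via an image beacon like `new Image().src = '/log.php?ref=' + encodeURIComponent(document.referrer);`. The endpoint name and parameter are hypothetical; the PHP side might then prefer that value over the (possibly stripped) header — a sketch:

```php
<?php
// Prefer the referrer reported by the JS tracker (query string),
// falling back to the HTTP header, which a proxy may have stripped.
// 'ref' and log.php are hypothetical names, not from the thread.
function pick_referrer(array $get, array $server): string {
    return $get['ref'] ?? $server['HTTP_REFERER'] ?? '';
}

// Typical call inside the hypothetical log.php endpoint:
// $referrer = pick_referrer($_GET, $_SERVER);
```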
Thanks for your expertise, ruserious, that makes perfect sense.