Forum Moderators: DixonJones
The tools and the way data is collected at work are causing a lot of problems and aren't accurate, so this would be the ideal way to know for certain how many referrals we receive from certain URLs.
The raw logs are very large (over 20GB), and I would need an option to limit the queries to a date range.
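For the date-range part, here is a minimal sketch of how it might be done with awk alone. It assumes the standard Apache combined log format, where the fourth whitespace-separated field looks like [10/Oct/2023:13:55:36; the file name, dates, and sample lines are made up for illustration:

```shell
# Tiny sample log so the sketch is self-contained (stands in for the real 20GB file).
cat > access.log <<'EOF'
1.2.3.4 - - [01/Oct/2023:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla"
1.2.3.4 - - [03/Oct/2023:11:00:00 +0000] "GET / HTTP/1.1" 200 512 "http://www.google.co.uk/search?q=widgets" "Mozilla"
1.2.3.4 - - [09/Oct/2023:12:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla"
EOF

# Keep only lines whose timestamp falls between from and to (inclusive).
awk -v from="2023-10-01" -v to="2023-10-04" '
BEGIN {
    # Map month abbreviations to numbers so dates sort/compare as strings.
    n = split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
    for (i = 1; i <= n; i++) mon[m[i]] = sprintf("%02d", i)
}
{
    ts = substr($4, 2)            # drop the leading "["
    split(ts, p, "[/:]")          # p[1]=day, p[2]=Mon, p[3]=year
    d = p[3] "-" mon[p[2]] "-" p[1]
    if (d >= from && d <= to) print
}' access.log > in_range.log

wc -l < in_range.log              # 2 of the 3 sample lines fall in the range
```

Because awk streams the file a line at a time, this stays memory-friendly even on very large logs.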
Any help appreciated.
For example, I could query the following URL for a 4-day period:
[google.co.uk...]
And know how many times someone reached the site from Google UK using the search term "widgets".
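As a sketch of that specific query: splitting each combined-format line on double quotes makes the referrer field $4, which can then be matched against the search engine and term. The file name, sample lines, and regex below are illustrative; Google's search term normally arrives in the q= parameter:

```shell
# Self-contained sample log (stands in for the real file).
cat > access.log <<'EOF'
1.2.3.4 - - [01/Oct/2023:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "http://www.google.co.uk/search?q=widgets" "Mozilla"
1.2.3.4 - - [02/Oct/2023:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "http://www.google.co.uk/search?q=widgets&hl=en" "Mozilla"
1.2.3.4 - - [03/Oct/2023:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla"
EOF

# Extract the referrer field, then count hits from a google.co.uk search for "widgets".
awk -F'"' '{print $4}' access.log |
  grep -c 'google\.co\.uk/.*[?&]q=widgets'    # prints 2 for the sample above
```

Note that q=widgets would also match longer terms such as q=widgetsale; a stricter pattern like 'q=widgets\([&"]\|$\)' would pin it down if that matters.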
awk -F\" '{print $4}' combined_log | sort | uniq -c | sort

Should be easy enough to pipe that into an SQL database for analysing, but it would be much easier to have that information in SQL to start with, using server-side scripting to record referrer information and query strings.
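One hedged way to bridge that gap is to turn the uniq -c output into INSERT statements and feed them to whatever database is handy. The table name, file names, and sample lines below are made up, and real referrer strings would need proper quote-escaping (and can contain spaces, which this simple field split ignores):

```shell
# Self-contained sample log (stands in for combined_log).
cat > access.log <<'EOF'
1.2.3.4 - - [01/Oct/2023:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "http://www.google.co.uk/search?q=widgets" "Mozilla"
1.2.3.4 - - [02/Oct/2023:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "http://www.google.co.uk/search?q=widgets" "Mozilla"
1.2.3.4 - - [03/Oct/2023:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla"
EOF

{
  echo "CREATE TABLE IF NOT EXISTS referrals (hits INTEGER, referer TEXT);"
  # Count referrers as in the pipeline above, then emit one INSERT per referrer.
  # uniq -c output is "<count> <referer>", so $1 is the count and $2 the referrer.
  awk -F'"' '{print $4}' access.log | sort | uniq -c | sort -rn |
    awk -v q="'" '{printf "INSERT INTO referrals VALUES (%d, %s%s%s);\n", $1, q, $2, q}'
} > load_referrals.sql

cat load_referrals.sql
```

The generated file could then be loaded with something like sqlite3 refs.db < load_referrals.sql, after which date ranges and search terms become ordinary WHERE clauses.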