Maybe it already exists:
A job that analyses the webserver logs and sends a summary to a central coordination host, which then rolls up some info.
This could then answer questions like:
'How active was freshbot?'
'Has deepbot been seen?'
etc etc.
Implementation:
Perl, I guess, on the clients.
Configuration:
Level of detail for traffic volume, number of Google referrals (even search terms, if the webmaster likes), different bots by agent identifier and IP.
Frequency of updates.
All this data would be sent to a server that rolls it up and presents it to the people who have contributed.
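A minimal sketch of what the client-side job might look like. The post suggests Perl; Python is used here purely for illustration. The log format (Apache/NCSA combined), the bot list, and the counter names are all my assumptions, not anything the idea prescribes:

```python
# Hypothetical client-side summariser for the 'log data club' idea.
# Assumes Apache combined log format; bot list and field names are illustrative.
import re
from collections import Counter

LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Agent substring -> label; a real config would list many bots and their IPs.
BOTS = {"Googlebot": "googlebot"}

def summarise(lines):
    """Boil one log file down to the counters a central host might want."""
    summary = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip lines that are not in combined format
        summary["hits"] += 1
        for needle, label in BOTS.items():
            if needle in m.group("agent"):
                summary[label] += 1
        if "google." in m.group("referer"):
            summary["google_referrals"] += 1
    return dict(summary)

# Two fabricated sample lines: one Googlebot crawl, one Google referral.
sample = [
    '66.249.66.1 - - [10/Oct/2003:13:55:36 +0000] "GET / HTTP/1.0" 200 2326 '
    '"-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"',
    '1.2.3.4 - - [10/Oct/2003:13:56:01 +0000] "GET /page HTTP/1.0" 200 512 '
    '"http://www.google.com/search?q=widgets" "Mozilla/4.0"',
]
print(summarise(sample))
```

Only the small summary dict would leave the webmaster's machine, which is the whole privacy argument: raw logs stay local.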
The biggest problem would not be the implementation, but getting people to understand the benefit and to trust this potential
'spyware'. Hence the implementation in Perl: it's inherently open ...
Maybe I make sense? [for a change]
BTW are you talking about implementing this for more than your own sites, like a central system?
What I had in mind would also answer questions like:
"Anybody noticed that google referals are down"
since it would show that over a collection of sites.
Kind of like a 'data club':
people feed what they consider non-sensitive into a
central bookkeeping database that is then accessible to them.
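On the central host, answering "are Google referrals down?" could be as simple as summing the per-site counters per day and flagging a drop. A toy sketch; the site names, numbers, and the 70% threshold are all invented:

```python
# Illustrative central roll-up for the 'data club'; all data here is fabricated.
from statistics import mean

# Per-site daily Google-referral counts, as uploaded by members.
reports = {
    "site-a": {"2003-10-08": 120, "2003-10-09": 118, "2003-10-10": 60},
    "site-b": {"2003-10-08": 45, "2003-10-09": 50, "2003-10-10": 22},
}

def referral_trend(reports):
    """Sum referrals per day across all member sites."""
    totals = {}
    for days in reports.values():
        for day, count in days.items():
            totals[day] = totals.get(day, 0) + count
    return dict(sorted(totals.items()))

def referrals_down(totals, factor=0.7):
    """Flag if the latest day falls below `factor` of the earlier days' average."""
    days = list(totals.values())
    return len(days) > 1 and days[-1] < factor * mean(days[:-1])

totals = referral_trend(reports)
print(totals, referrals_down(totals))
```

Because the drop shows up across several unrelated sites at once, it points at Google (or the network) rather than at any one site's ranking.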
I saw many posts asking whether Googlebot had been active, how much, etc.
Everybody uses different tools. And the interpretation is manual.
The concept of a 'log data club' would provide more meaningful data.
Even internet outages etc. could be spotted and logged.