Is it possible to create a tracking script that produces the following:
· a report showing which search engine robots retrieved which pages on which days (a rough sketch of this follows the list)
· a report highlighting potential problems, such as particular pages that are never crawled by any robot. With this information we could re-evaluate those pages and optimize them differently, or re-submit "problem" pages to specific search engines.
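The first report can be pulled straight from the access logs. Here is a minimal sketch, assuming an Apache combined-format log; the log path and the robot user-agent substrings below are placeholder assumptions, so adjust them for your server:

#!/bin/sh
# Sketch only: list which days each robot fetched which pages,
# based on an Apache combined-format access log.

LOG=/var/log/apache2/access.log          # placeholder path
ROBOTS="Googlebot Slurp msnbot"          # placeholder user-agent substrings

for BOT in $ROBOTS; do
    echo "== $BOT =="
    # $4 holds "[day/month/year:time"; keep just the date part.
    # $7 is the requested path. Print each unique day/page pair.
    grep "$BOT" "$LOG" \
        | awk '{split($4, d, ":"); sub(/^\[/, "", d[1]); print d[1], $7}' \
        | sort -u
done

The second report could then be a diff of that output against a full list of your site's URLs (from a sitemap, say): anything in the site list that never appears in the crawl report is a candidate "problem" page.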
I have something that shows what a given robot has fetched today, in alphabetical order, and also gives counts of total files fetched and unique files fetched. It's a basic BASH script that uses a lot of the standard unix command-line tools (i.e. it will only work on a Unix or Linux server). If you're interested, let me know.
[edit - added] To clarify: this analyzes the logs; it is not a server-side include or anything that watches visitors in real time.
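For anyone curious, a minimal sketch of what such a script might look like (not the actual script; the log path and the "Googlebot" user-agent string are placeholder assumptions, and an Apache combined-format log is assumed):

#!/bin/sh
# Sketch only: today's fetches for one robot, plus total/unique counts.

LOG=/var/log/apache2/access.log    # placeholder path
ROBOT="Googlebot"                  # placeholder user-agent substring
TODAY=$(date +%d/%b/%Y)            # matches the log's date format, e.g. 21/Feb/2026

# Keep today's requests from this robot; $7 is the requested path.
grep "$TODAY" "$LOG" | grep "$ROBOT" | awk '{print $7}' | sort > /tmp/robot_fetches.txt

echo "Pages fetched today by $ROBOT (alphabetical):"
cat /tmp/robot_fetches.txt
echo "Total files fetched:  $(wc -l < /tmp/robot_fetches.txt)"
echo "Unique files fetched: $(sort -u /tmp/robot_fetches.txt | wc -l)"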
I was in a session at SES NYC where the moderator recommended a product called Robot Manager for some of the things you mention. I haven't used it, but she said it helped their company get more information on spider activity.