This method sucks for several reasons: it depends on the bot actually making a GET request for robots.txt, it can't tell you which actual URL was visited, etc.
I was wondering if there's a better way, and how it could be implemented on a shared-hosting account running Apache.
Thanks!
Alternatively, you could use a script to serve all requests from your site, after logging and sending an e-mail. In this case, the script is the "wrapper" for the pages.
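A rough sketch of that wrapper approach, assuming a mod_rewrite rule in .htaccess funnels page requests into a single PHP script. The file names, paths, and e-mail address below are placeholders, not anything from this thread:

<?php
// wrapper.php -- log the request, send a notification, then serve the real page.
//
// Hypothetical .htaccess rule that routes requests for non-existent files here:
//   RewriteEngine On
//   RewriteCond %{REQUEST_FILENAME} !-f
//   RewriteRule ^(.*)$ /wrapper.php?page=$1 [L,QSA]

$docroot = '/home/example/public_html/pages';  // where the real pages live
$page = isset($_GET['page']) ? $_GET['page'] : 'index.html';
$page = basename($page);  // crude guard against ../ directory traversal

$line = sprintf('%s %s %s "%s"' . "\n",
    date('c'),
    $_SERVER['REMOTE_ADDR'],
    $page,
    isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '-');

file_put_contents('/home/example/logs/visits.log', $line, FILE_APPEND | LOCK_EX);

// mail() is available on most shared hosts; see the throttling note below
// before letting this fire on every single request.
mail('you@example.com', 'Page request: ' . $page, $line);

readfile($docroot . '/' . $page);  // finally, serve the page itself
?>

Note that this serves everything with the default content type; a real version would set a Content-Type header based on the file extension.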
In either case, you may want to take steps so that an e-mail is not sent for each and every page request and, with the second method, for each image requested from your server. This will likely involve a timer and a dynamic list of recently-seen robot user-agents or IP addresses.
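One simple way to implement that throttle, as a sketch: keep a small serialized state file of recently-seen visitors and skip the e-mail if the same user-agent/IP pair showed up within the window. The state-file path and the ten-minute window are arbitrary assumptions:

<?php
// notify_throttle.php -- at most one e-mail per UA/IP pair per $window seconds.
function should_notify($key, $window = 600, $state = '/tmp/notify_state.ser') {
    $seen = array();
    if (is_readable($state)) {
        $seen = unserialize(file_get_contents($state));
        if (!is_array($seen)) $seen = array();
    }
    $now = time();
    foreach ($seen as $k => $t) {          // drop entries older than the window
        if ($now - $t > $window) unset($seen[$k]);
    }
    $fresh = !isset($seen[$key]);          // notify only if not seen recently
    $seen[$key] = $now;
    file_put_contents($state, serialize($seen), LOCK_EX);
    return $fresh;
}

$key = $_SERVER['REMOTE_ADDR'] . '|' .
       (isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '-');

if (should_notify($key)) {
    mail('you@example.com', 'New visitor/robot seen', $key);
}
?>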
Jim
You must include a PHP or SSI script call in each page for which you wish to log access or send an e-mail. If using PHP, you may also be able to use PHP's auto_prepend_file directive to pull the script into every PHP page automatically.
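For reference, on a shared host running PHP as an Apache module you can usually set auto_prepend_file from .htaccess, so every PHP page pulls the logger in without editing the pages themselves. A minimal sketch, with hypothetical paths:

<?php
// log_hit.php -- include this in every page you want tracked, or auto-prepend it.
//
// Hypothetical .htaccess line (works under mod_php only):
//   php_value auto_prepend_file /home/example/includes/log_hit.php
//
// An SSI page could pull it in with a subrequest instead:
//   <!--#include virtual="/log_hit.php" -->

$line = sprintf('%s %s %s "%s"' . "\n",
    date('c'),
    $_SERVER['REMOTE_ADDR'],
    $_SERVER['REQUEST_URI'],
    isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '-');

// Append to a private log; e-mail alerts could reuse the throttle sketched above.
file_put_contents('/home/example/logs/hits.log', $line, FILE_APPEND | LOCK_EX);
?>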
Jim
The PHP/SSI solutions you mentioned require modifying each served page!