Forum Moderators: open
One of my pages was last indexed on Dec 10th. I got an alert showing that the spider crawled my page on Dec 31st, but the page still hasn't been re-indexed since the 31st. Does Google index my pages every time it crawls them?
I have it set up so that it checks the user-agent and categorizes the impressions. Then I can look at a table throughout the day that shows page views by users, Googlebot, Ink, etc. It seems the easiest and least intrusive way.
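A minimal sketch of that approach: classify each request by its user-agent string and append one line per impression to a log, which can be tallied later into the per-category table described above. The function name, the `Slurp` check for the Inktomi crawler, and the log file path are all illustrative assumptions, not details from the original post.

```php
<?php
// Categorize one impression by user-agent. The category names mirror
// the table in the post above ("User", "Googlebot", "Ink").
function classify_agent($ua) {
    $ua = strtolower($ua);
    if (strpos($ua, 'googlebot') !== false) return 'Googlebot';
    if (strpos($ua, 'slurp') !== false) return 'Ink';  // Inktomi's crawler
    return 'User';
}

// On each page view, append one tab-separated line to a log file
// (path is illustrative); tally these later to build the table.
$ua  = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '';
file_put_contents('impressions.log',
    date('Y-m-d H:i:s') . "\t" . classify_agent($ua) . "\t" . $uri . "\n",
    FILE_APPEND);
?>
```

Keep in mind that user-agent strings can be faked, so this categorization is only as trustworthy as the visitor's headers.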
I guess you need PHP for this to work?
My second guess: it only works when Googlebot follows the link?
That seems to be the case for the other solution. And then you still don't know when Google spidered your site, only when it spidered their site, which was linked from yours. I'd go for a PHP/Perl/CGI/... solution on your own server. Then there's no doubt, and it should work very reliably.
Save this as a plain-text file called test.php:
<?php echo("HELLO");?>
Upload that file to your webserver and use your browser to look at that page.
If you see just HELLO then PHP is working.
If you see <?php echo("HELLO");?> then it is not.
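Once PHP is confirmed working, here is a sketch of the kind of server-side check the thread is suggesting: record a timestamp whenever Googlebot fetches the page you're watching. The function name and log file path are illustrative, not from the thread.

```php
<?php
// Append a timestamped line to $logfile if $ua identifies Googlebot.
// Returns true if a visit was logged, false otherwise.
function log_googlebot_visit($ua, $uri, $logfile) {
    if (strpos(strtolower($ua), 'googlebot') === false) {
        return false;  // not Googlebot; nothing logged
    }
    file_put_contents($logfile,
        date('Y-m-d H:i:s') . "\t" . $uri . "\n",
        FILE_APPEND);
    return true;
}

// Call this at the top of the page you want to watch:
log_googlebot_visit(
    isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '',
    isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '',
    'googlebot_visits.log');
?>
```

The log then tells you exactly when Google spidered the page, independent of whether or when it re-indexes it.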