
Forum Moderators: DixonJones & mademetop


Tracking spider indexing from logs



11:41 am on Jul 4, 2011 (gmt 0)

Hello all,

Can anyone tell me how I can see when Google's bots are visiting my site, how deeply they crawl it, which pages they visit on each crawl, etc.?

What can I do to get that data?

I heard Piwik and crawltrack.net will do that. What do you think of these products? Are they reliable?

Someone please let me know.


12:24 pm on Jul 4, 2011 (gmt 0)

WebmasterWorld Senior Member g1smd

They are in your raw log files, often one file per day, usually in a folder accessible only via FTP. Most servers keep only the last few days' data, so you need to copy or download the files regularly.

I find that Google doesn't do so much of a "crawl" as merely pull one URL every few minutes or so.
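If you want to pull the Googlebot requests out of those raw logs yourself, a few lines of script will do it. Here's a minimal sketch assuming the common Apache/Nginx "combined" log format; the field layout, file name, and sample line are assumptions, so adjust the regex if your server logs a different layout.

```python
import re

# Assumed Apache/Nginx "combined" log format:
# IP ident user [time] "METHOD URL PROTO" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Yield (timestamp, url, status) for each Googlebot request in the log."""
    for line in lines:
        m = LINE_RE.match(line)
        if m and "Googlebot" in m.group("agent"):
            yield m.group("time"), m.group("url"), m.group("status")

# Example with a made-up log line; in practice, pass in open("access.log").
sample = [
    '66.249.66.1 - - [04/Jul/2011:11:41:00 +0000] '
    '"GET / HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]
for when, url, status in googlebot_hits(sample):
    print(when, url, status)
```

Run that over each day's log and you get the exact sequence of URLs Googlebot fetched, with timestamps, which answers the "how far did they crawl" question directly. Note that anyone can fake the Googlebot user-agent string, so for a stricter check you'd also verify the IP with a reverse DNS lookup.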
