


Tracking spider indexing from logs

11:41 am on Jul 4, 2011 (gmt 0)

New User · joined: July 4, 2011 · posts: 1

Hello all,

Can anyone tell me how I can see when Google's bots visit my site, how deeply they crawl it, and which pages they fetch on each crawl?

What can I do to get that data?

I've heard Piwik and crawltrack.net will do that. What do you think of these products? Are they reliable?

Someone please let me know.
12:24 pm on July 4, 2011 (gmt 0)

Senior Member: g1smd · joined: July 3, 2002

The visits are recorded in your raw log files, often one file per day, usually in a folder accessible only via FTP. Most servers keep only the last few days' data, so you need to copy or download the files regularly.

I find that Google's visits aren't so much a "crawl" as merely pulling one URL every few minutes or so.
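Once you have the raw logs, pulling out the Googlebot requests is straightforward. Here is a minimal Python sketch, assuming your server writes the standard Apache/Nginx combined log format; the sample lines and file layout are hypothetical, and you'd point it at your own downloaded log files:

```python
import re

# Combined Log Format:
# IP ident user [timestamp] "METHOD /path HTTP/x.x" status size "referer" "user-agent"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Yield (timestamp, path, status) for requests whose UA claims Googlebot."""
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            yield m.group("time"), m.group("path"), m.group("status")

# Hypothetical sample: one Googlebot fetch, one ordinary browser visit.
sample = [
    '66.249.66.1 - - [04/Jul/2011:11:41:00 +0000] "GET /index.html HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
    '+http://www.google.com/bot.html)"',
    '10.0.0.5 - - [04/Jul/2011:11:42:00 +0000] "GET /about.html HTTP/1.1" '
    '200 2048 "-" "Mozilla/5.0 (Windows NT 6.1)"',
]

hits = list(googlebot_hits(sample))
for time, path, status in hits:
    print(time, path, status)
```

Note that the user-agent string can be faked, so if accuracy matters, verify the hit really came from Google (a reverse-DNS lookup resolving to googlebot.com is the usual check) rather than trusting the UA alone.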