|Tracking spider indexing from logs|
| 11:41 am on Jul 4, 2011 (gmt 0)|
Can anyone tell me how I can see when Google's bots are visiting my site, how far they crawled, all the pages they visited on each crawl, etc.?
What can I do to get that data?
I've heard that pwiki and crawltrack.net will do that. What do you think of these products? Are they reliable?
Someone please let me know.
| 12:24 pm on Jul 4, 2011 (gmt 0)|
The visits are recorded in your raw log files, often one file per day, usually in a folder accessible only via FTP. Most servers keep only the last few days' data, so you need to copy or download the files regularly.
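For what it's worth, once you have the raw files, pulling out the Googlebot requests is straightforward. Here's a minimal sketch assuming the common Apache "combined" log format; the field layout (and the sample lines) are illustrative, so adjust the pattern to whatever your server actually writes:

```python
import re

# Apache combined log format:
# IP ident user [timestamp] "METHOD path HTTP/x" status size "referer" "user-agent"
LOG_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Yield (timestamp, ip, path, status) for requests whose
    user-agent string claims to be Googlebot."""
    for line in lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group("agent"):
            yield m.group("time"), m.group("ip"), m.group("path"), m.group("status")

# Two made-up sample lines: one Googlebot request, one ordinary browser.
sample = [
    '66.249.66.1 - - [04/Jul/2011:11:02:13 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [04/Jul/2011:11:02:40 +0000] "GET /about.html HTTP/1.1" 200 2048 "-" '
    '"Mozilla/5.0 (Windows NT 6.1)"',
]

for hit in googlebot_hits(sample):
    print(hit)
```

One caveat: the user-agent string can be spoofed, so if it matters, confirm the IP really belongs to Google with a reverse DNS lookup rather than trusting the agent string alone.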
I find that Google doesn't do so much a "crawl" as merely pull one URL every few minutes or so.
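You can check that pacing on your own logs by measuring the gaps between consecutive Googlebot requests. A quick sketch, using made-up Apache-style timestamps (Python's `%z` directive parses the `+0000` offset):

```python
from datetime import datetime

def crawl_intervals(timestamps):
    """Given Apache-style timestamps of Googlebot requests, in log order,
    return the gap between each consecutive pair, in seconds."""
    fmt = "%d/%b/%Y:%H:%M:%S %z"
    times = [datetime.strptime(t, fmt) for t in timestamps]
    return [(b - a).total_seconds() for a, b in zip(times, times[1:])]

# Hypothetical timestamps pulled from a day's log:
hits = [
    "04/Jul/2011:11:02:13 +0000",
    "04/Jul/2011:11:07:45 +0000",
    "04/Jul/2011:11:13:02 +0000",
]
print(crawl_intervals(hits))  # → [332.0, 317.0]
```

Gaps of a few hundred seconds would match the one-URL-every-few-minutes pattern; a burst of sub-second gaps would indicate a genuine crawl.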
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved