|Tracking spider indexing from logs|
Can anyone tell me how I can see when Google's bots are visiting my site, how far they crawled, and which pages they visited on each crawl? What can I do to get that data?
I've heard Piwik and crawltrack.net will do that. What do you think of these products? Are they reliable?
Someone please let me know.
They are in your raw access log files, often one file per day, usually in a folder accessible only via FTP. Most servers keep only the last few days' worth, so you need to copy or download them regularly.
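Once you have the log files, pulling out the bot hits is straightforward. A minimal sketch, assuming the common Apache/Nginx "combined" log format and the standard Googlebot user-agent string (the sample lines are made up for illustration):

```python
import re

# Combined log format:
# IP - - [timestamp] "METHOD /path HTTP/x.x" status size "referer" "user-agent"
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3})')

def googlebot_hits(lines):
    """Return (ip, timestamp, path, status) for requests claiming to be Googlebot."""
    hits = []
    for line in lines:
        if "Googlebot" not in line:   # cheap pre-filter on the user-agent string
            continue
        m = LINE_RE.match(line)
        if m:
            hits.append((m.group(1), m.group(2), m.group(4), m.group(5)))
    return hits

# Hypothetical log lines; in practice, iterate over open("access.log")
sample = [
    '66.249.66.1 - - [10/Mar/2014:06:25:24 +0000] "GET /page.html HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [10/Mar/2014:06:25:30 +0000] "GET /page.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
for ip, ts, path, status in googlebot_hits(sample):
    print(ip, ts, path, status)
```

Note that anything can claim to be Googlebot in its user-agent header, so for serious analysis you'd also want to check that the requesting IP really resolves back to a google.com or googlebot.com hostname.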
I find that Googlebot doesn't so much "crawl" as simply pull one URL every few minutes or so.
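That pacing is easy to confirm from the timestamps in the same log lines. A sketch, assuming combined-log timestamps (the sample values are invented):

```python
from datetime import datetime

def fetch_gaps(timestamps):
    """Seconds between consecutive requests, given combined-log timestamps."""
    fmt = "%d/%b/%Y:%H:%M:%S %z"          # e.g. 10/Mar/2014:06:25:24 +0000
    times = sorted(datetime.strptime(t, fmt) for t in timestamps)
    return [(b - a).total_seconds() for a, b in zip(times, times[1:])]

gaps = fetch_gaps([
    "10/Mar/2014:06:25:24 +0000",
    "10/Mar/2014:06:29:10 +0000",
    "10/Mar/2014:06:33:02 +0000",
])
print(gaps)  # → [226.0, 232.0]: one fetch roughly every four minutes
```

If the gaps cluster around a few minutes, you're seeing the steady trickle described above rather than a burst crawl.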
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved