Forum Moderators: open
Deepbot collects the information used for the monthly Google update. It should spider all the pages of your site, and you would expect it to fetch each page only once in that one-month interval.

Freshbot, on an existing site, visits only some of the pages — mainly pages that have recently changed or been newly added — and it may come back to the same page several times within the month. This data is used to keep the SERPs fresh: for a few days you can see a date printed next to the URL in the results. Recently there have been signs that freshbot's data is sometimes also used for the monthly Google update.
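One way to see this revisit pattern for yourself is to count how often Googlebot requests each URL in your server's access log. The sketch below is a minimal illustration, assuming a Common Log Format log with the user-agent field present; the sample lines and field layout are assumptions, not anything Google publishes.

```python
import re
from collections import Counter

# Matches a Common Log Format request line with referrer and user-agent:
# capture group 1 = requested URL, group 2 = user-agent string.
LOG_LINE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "([^"]*)"')

def googlebot_hits(lines):
    """Return a Counter mapping URL -> number of Googlebot requests."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if m and "Googlebot" in m.group(2):
            hits[m.group(1)] += 1
    return hits

# Hypothetical log excerpt: two Googlebot fetches of the same page on
# different days, plus one ordinary browser request.
sample = [
    '66.249.64.1 - - [01/Jan/2004:00:00:01 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '66.249.64.1 - - [02/Jan/2004:00:00:01 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '10.0.0.5 - - [02/Jan/2004:00:00:02 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/4.0"',
]
print(googlebot_hits(sample))
```

A page that shows up more than once in a month in such a tally is consistent with freshbot-style revisits; a single pass over the whole site looks more like the deep crawl.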
Why is the sky blue? [why-is-the-sky-blue.org]