I have just finished reviewing our logs for yesterday, November 4, and noticed the monthly spidering has already begun:
"Googlebot/2.1 (+http://www.googlebot.com/bot.html) 1,004 accesses 40,511,612 bytes"
Last month, October, it did this at least eight times, including a couple of days just before the dance began. We initially thought it was just for freshness, but looking at the SERPs, the new pages we had added (about 400) were not showing up. Since the update, many of the new pages are cached and our positioning on our main keywords has taken us to the top, but the new pages still don't show up in the SERPs.
Can anyone suggest what is happening? Is it freshness, or is it the regular early-month spidering?
Thanks,
Paul
1. There is a modified page linked to only by an 'old page'.
2. Fresh bot comes along and requests the 'old page', which it already knows about.
3. The old page hasn't been changed, so the server returns a 304 response code and fresh bot goes away without seeing the modified page or its modifications.
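To make step 3 concrete, here's a minimal sketch of the conditional-GET logic on the server side. This is just an illustration of how If-Modified-Since works, not any real server's or Googlebot's code; the function name and dates are made up.

```python
from email.utils import parsedate_to_datetime

def respond(last_modified, if_modified_since):
    """Return the HTTP status a server would send for a conditional GET.

    last_modified: the page's Last-Modified header value.
    if_modified_since: the If-Modified-Since header the bot sent,
    or None if it requested the page unconditionally.
    """
    if if_modified_since is None:
        return 200  # no condition: send the full page
    page_time = parsedate_to_datetime(last_modified)
    bot_time = parsedate_to_datetime(if_modified_since)
    # Page unchanged since the bot's copy: 304, no body,
    # so the bot never re-reads the page or the links on it.
    return 304 if page_time <= bot_time else 200

# 'Old page' last touched before fresh bot's previous visit -> 304
print(respond("Mon, 03 Nov 2003 10:00:00 GMT",
              "Tue, 04 Nov 2003 09:00:00 GMT"))  # 304
```

So if the only link to the modified page lives on a page that answers 304, the bot has no reason to fetch it and the new content stays invisible until something else links to it.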
With static HTML I would think new pages would always be found IF fresh bot visits the linking page, since that page would necessarily have a modification: the new link. That's not a given with dynamically generated pages.
Of course, I really am making a barely educated guess with all this.