Forum Moderators: open
This behaviour would be typical of Googlebot checking only the pages it already knows about for updates. It may note the new links and come back for them later, or perhaps a separate 'bot will visit with the express purpose of looking for new links and pages.
A few years ago, we called the behaviour you describe the "Freshbot" function: Google checked for updates only on pages it had previously indexed, on a daily-to-weekly cycle depending on the PageRank of the pages. Then, roughly once a month, we'd see the "Deepbot", which would spider almost every page it could find. So Freshbot kept the cache and index of previously-indexed pages fresh, while the less frequent Deepbot discovered new pages. It's quite possible we're seeing a return to that pattern.
The solution? Wait up to 90 days and then check again. If the new pages still aren't showing up by then, you can be fairly sure something's wrong.
Jim
I have always been deep-crawled about once a month, until the recent problem I alluded to.
However, I'm wondering if the recent increase in the size of the Google database has something to do with my site being deep-crawled less often.

Essentially, the number of results Google returns for my keywords has more than doubled. Maybe Google hasn't quite doubled the number of bots to match.