Googlebot stopped crawling new site section, crawls another just fine?
About two weeks ago I added a blog-like section to a site, providing newsy and background articles on topics related to the niche. I've been adding new content to it since then, at a pace of about one page per day. Soon enough, Googlebot came along, crawled the first handful of pages, and indexed them within 24–48 hours.
The odd thing is that after the publication of the sixth article, Google stopped crawling the section entirely. For over a week now, only the first five articles have been indexed and ranking. I still get frequent Googlebot visits in other (older) sections. Last week I even added another new section, and that one has been fully crawled and indexed as well.
The sixth article in the blog-like section was one that contained a relatively large number of links compared to the other articles. It's a post with background information on a specific topic in the niche, along with a list of some 50 helpful and related resources. Google crawled that page a couple of times, and hasn't been seen in that section since.
The only other thing I can think of is that it was also the first page I posted to the newly created social media accounts (Facebook & Twitter). Nothing spammy, though.
The site's about a year old, ranking healthily, and traffic is steadily increasing.
What should I make of this?
Did you run a fetch as Googlebot from GWT? If you have access to Screaming Frog, you can also crawl the site as Googlebot, just to rule out technical issues.
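If you don't have Screaming Frog handy, you can approximate that check yourself by requesting the page with a Googlebot user-agent and comparing the response against what a regular browser gets. A minimal sketch using only the standard library (the user-agent string is the one Google documents for its desktop crawler; any URLs you pass in are your own):

```python
from urllib.request import Request, urlopen

# Googlebot's documented desktop user-agent string (check Google's
# crawler documentation for the current exact value).
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def status_for(url, user_agent):
    """Return the HTTP status code the server sends to this user-agent."""
    req = Request(url, headers={"User-Agent": user_agent})
    with urlopen(req) as resp:
        return resp.status
```

If `status_for(url, GOOGLEBOT_UA)` differs from the status you get with a normal browser user-agent, you've found user-agent based blocking or cloaking; identical responses rule that class of problem out.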
Well, that's weird. I fetched the missing pages as Googlebot from GWT as you suggested, and when I clicked "Submit to index", each page was immediately indexed. And ranking.
I can repeat this with any new page. Fetch, submit, and a site:example.com/new-section search shows previous results plus one. Is that normal?
Anyway, Googlebot can apparently fetch and index the pages just fine, so what could make it decide against doing so on its own?
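One cheap thing worth ruling out, even though the successful GWT fetches already make it unlikely, is a robots.txt rule that selectively disallows the new section. Python's standard library can evaluate the rules directly; the rules and paths below are placeholders, not the actual site's:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content. In practice, load the real file with
# parser.set_url("https://example.com/robots.txt") followed by parser.read().
rules = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether Googlebot may crawl a page in the new section
# versus one under a disallowed path (both URLs are placeholders).
print(parser.can_fetch("Googlebot", "https://example.com/new-section/article-6"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))           # False
```

If the section's paths come back `False`, that would explain silent non-crawling without any error showing up elsewhere.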
I've now also had the section's homepage re-indexed. Index and cached version were immediately updated. Curious to see if that will change the crawling pattern.
Just tried it with a new page in another section. It was also indexed straight away, so I guess this is normal behavior, and also what it says on the box, of course ("Submit to index"). I just wasn't expecting it to have an immediate effect on the SERPs.
It makes it look as if all the pages are already there in the Hidden Index, and by some error they were omitted from the visible index. Put in a manual request, someone kicks the computer, and instantly things go back to where they're supposed to be.
Crawling of the section has been "rebooted", apparently. Article posted today was crawled within half an hour -- lucky, perhaps. Not yet indexed, but that's okay.
Haven't had any manual visits from the Googleplex, so I guess Googlebot's off the leash in making these decisions. Or it could've been a fluke.