Forum Moderators: Robert Charlton & goodroi


Google frequency-of-crawling issues


placedigit

6:04 am on Dec 16, 2019 (gmt 0)

5+ Year Member



One of the websites that I work for has been showing some unexpected signs in GSC.
This website has more than 200 pages, of which Google has been crawling only 5 on a frequent basis; the rest of the pages have not been crawled by Google for the last 20 days.

Please note that I have submitted all the URLs that I want Google to read and crawl in the sitemap.

You can access the screenshot of the GSC in the link below.
<snip>

Can someone suggest what should be done so that Google crawls the pages properly and frequently?

engine

11:24 am on Dec 16, 2019 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Mod note: I removed the screenshot as it doesn't show anything that would help.

engine

4:43 pm on Dec 16, 2019 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



That's the crawl rate Googlebot has set for your site.
It's worth noting that Google doesn't offer any guarantees about crawl rate; it is often decided by a range of factors that you can't really influence.

Terabytes

5:52 pm on Dec 16, 2019 (gmt 0)

10+ Year Member



Have those pages been "stagnant"? No changes for a long period of time?
I'm sure G doesn't waste time frequently re-indexing pages it assumes are not changing. What would be the point of re-indexing an already indexed page if it never changes?

I'm not saying that's your issue, just asking whether that may be an issue.

Robert Charlton

10:18 pm on Dec 16, 2019 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Can someone suggest what should be done so that Google crawls the pages properly and frequently?
The best answer is probably to improve the quality and uniqueness of your pages to attract more and better natural links. "Properly" is Google's decision, not yours.

Over the last couple of years, Google has put major limitations on how many manually submitted sites and pages it will consider. Various tools, like the Fetch tool, were being misused by spammers, and also by third-party "SEO" submission services, to the degree that they were creating a considerable load on Google's resources. Google found it necessary to impose severe limitations on manual submissions, and has made it clear that it wants to keep manual submissions for "emergencies" only.

While the preferred method of manual submission now, if I remember correctly, is via sitemaps, Google makes no guarantees it will index all pages submitted.
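
For reference, a minimal sitemap entry follows the sitemaps.org protocol and looks roughly like the sketch below (example.com and the country-page URL are placeholders, not from this thread). An accurate <lastmod> date can help signal which pages have actually changed, though Google treats it only as a hint:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled; URLs here are hypothetical. -->
  <url>
    <loc>https://www.example.com/country-visa-page</loc>
    <!-- Only a hint to crawlers; keep it accurate or omit it entirely. -->
    <lastmod>2019-12-10</lastmod>
  </url>
</urlset>
```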

Google does regularly crawl the web looking for pages it considers worth indexing, and while the exact algorithms are complex and secret, I'm sure they include the number and quality of inbound links, and the quality of content... how unique it is, and how useful. I'd guess also that Google keeps some historical notes on the quality of previous submissions from various sources. Very probably, the quality of the "network" with which the pages might be associated is also considered.

The following thread, from early 2018, presents a fairly complete overview of Google's policies and how they evolved, and is worth a read. I suspect, if anything, that the guidelines have only gotten stricter.

Big reductions in crawl-to-index limits on Google Fetch tool
March, 2018
https://www.webmasterworld.com/google/4893740.htm [webmasterworld.com]

OldFaces

11:43 pm on Dec 16, 2019 (gmt 0)

10+ Year Member Top Contributors Of The Month



We've always struggled with understanding gbot's crawl rate. It fluctuates HIGHLY for us, which makes dev ops a pain sometimes. I mean, a serious range. We can go anywhere from 50-100 pings per minute up to 4,500. It doesn't necessarily match either when lastmod is updated or URLs change. It just seems very irregular to us.
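
One way to see that fluctuation directly is to count Googlebot requests per minute in your server access logs. A rough sketch, assuming a combined-log-format access log (the sample log below is fabricated for illustration; point the grep at your real access.log, and note that a "Googlebot" user-agent can be spoofed, so genuine Googlebot traffic should be verified by reverse DNS):

```shell
#!/bin/sh
# Fabricated sample log for illustration only; replace with your real access.log.
cat > /tmp/sample_access.log <<'EOF'
66.249.66.1 - - [16/Dec/2019:06:01:12 +0000] "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [16/Dec/2019:06:01:45 +0000] "GET /about HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [16/Dec/2019:06:01:50 +0000] "GET / HTTP/1.1" 200 "Mozilla/5.0"
66.249.66.1 - - [16/Dec/2019:06:02:03 +0000] "GET /contact HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
EOF

# Keep only Googlebot lines, bucket the [dd/Mon/yyyy:HH:MM timestamp by minute,
# then count hits per minute bucket.
grep 'Googlebot' /tmp/sample_access.log \
  | awk '{ split($4, t, ":"); print t[1] ":" t[2] ":" t[3] }' \
  | sort | uniq -c
```

Graphing those per-minute counts over a day or two makes the spikes OldFaces describes easy to spot.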

aristotle

1:38 am on Dec 17, 2019 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



As a general rule, pages that get a lot of google traffic will be crawled more frequently than pages that don't get much traffic.

placedigit

9:43 am on Dec 17, 2019 (gmt 0)

5+ Year Member



First of all, I would like to thank you all for your answers to my query. :)

I do understand your points when you say "Google has its own take on crawling/indexing any page on the internet."

A few things about the website in question:
1. One month ago I submitted the sitemap for this website. (Yes, this is a new website for Google.)
2. The pages that I am talking about have all the relevant information for users. (I have already TESTED this with the Google Ads that I am running for those pages.)
3. This website has individual pages for all the countries around the globe. These pages have the exact information required for applying for a travel visa for that country.
4. Google is crawling the HOME, ABOUT US, and CONTACT US pages frequently, but not the individual country pages.

[edited by: engine at 11:50 am (utc) on Dec 17, 2019]
[edit reason] please see WebmasterWorld TOS [/edit]