Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Google Bot - How often getting crawled


fashezee

6:56 pm on Apr 23, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How often is your site getting fully crawled?
Based on our G-Console, on average, we are getting visited once every 90 days.

keyplyr

12:26 am on Apr 24, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I see a partial crawl (approx. 30 to 50 pages) every day, and a full crawl once a month.

If you have a lot of backlinks, you'll see Googlebot more. If you (or others) post links on social media, you'll see Googlebot more.

lucy24

3:05 am on Apr 24, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



we are getting visited once every 90 days
Do you mean that each individual page is re-crawled that often? It probably isn't useful to average out all pages across your whole site, because there are too many variables. Pick some high-profile pages and look more closely at the numbers for them.

There are two basic factors. One is how interested Google is in a page; backlinks are obviously one consideration. The other is how often the page changes. With time, they get a sense of how often they need to check for updates.
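One way to look more closely at individual pages, as suggested above, is to count Googlebot requests per page in your server access logs. A minimal sketch, assuming a Common Log Format access log; the sample lines, page path, and function name are illustrative, not from this thread:

```python
import re
from collections import Counter

# Matches the date portion of a Common Log Format timestamp,
# e.g. [23/Apr/2018:06:56:00 +0000] -> "23/Apr/2018"
LOG_DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(log_lines, page="/widgets/"):
    """Return {date: count} of Googlebot requests for one page."""
    hits = Counter()
    for line in log_lines:
        # Keep only Googlebot requests for the page of interest.
        if "Googlebot" not in line or page not in line:
            continue
        m = LOG_DATE.search(line)
        if m:
            hits[m.group(1)] += 1
    return dict(hits)

# Two example log lines: one Googlebot hit, one ordinary visitor.
sample = [
    '66.249.66.1 - - [23/Apr/2018:06:56:00 +0000] "GET /widgets/ HTTP/1.1" '
    '200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
    '+http://www.google.com/bot.html)"',
    '203.0.113.5 - - [23/Apr/2018:07:00:00 +0000] "GET /widgets/ HTTP/1.1" '
    '200 1234 "-" "Mozilla/5.0"',
]
print(googlebot_hits_per_day(sample))  # {'23/Apr/2018': 1}
```

Run against a few high-profile URLs over a month or two of logs, this gives a much clearer picture than a site-wide average. (Note that user-agent strings can be spoofed; for a rigorous count, verify the crawler via reverse DNS.)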

keyplyr

3:41 am on Apr 24, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you don't already have one, a sitemap.xml is an effective way of letting Googlebot know about all your pages and when they were last updated. Add the sitemap.xml location to your robots.txt file.
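A minimal illustration of the two pieces involved (the domain, path, and date below are placeholders):

```
# robots.txt -- tell crawlers where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml: one <url> entry per page; <lastmod> hints at freshness -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/widgets/</loc>
    <lastmod>2018-04-20</lastmod>
  </url>
</urlset>
```

For very large sites, the protocol caps each file at 50,000 URLs, so a sitemap index file pointing at multiple sitemaps is the usual approach.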

fashezee

11:14 am on Apr 24, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We have over 7 million pages; I've noticed a deep crawl every 90 days or so.
Daily, we get 11K to 25K pages crawled. Would this be considered average?

Travis

12:20 pm on Apr 24, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



I think Google adapts its crawl rate to optimize resources. Pages that are updated often are re-crawled often; after a while, Google can certainly predict the frequency at which a page is updated (or not), as well as how often a site is adding content.

The "quality"/"authority" value that Google assigns to a site obviously impacts the crawl rate as well.

In my case, out of 100,000 pages, Google visits about 5,000 per day, at a fairly constant rate throughout the day. From time to time
it stops crawling for 24 hours, which makes me panic :)

lucy24

6:19 pm on Apr 24, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



7 million, yikes. I'd say concentrate on getting authority for the site as a whole--and coding a good internal search--rather than relying on search-engine referrals for each of those pages. Think of something like {major online retailer}, where the ordinary user behavior is
go to site >> search for widgets
rather than
go to google >> search for widgets >> get sent to page on 7-million-page site

I'm assuming your site is involved with selling something, because nobody has seven million unique, informative and well-written articles.

NickMNS

6:49 pm on Apr 24, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@lucy24 I have more than 7 million pages. It's an information site that is well written and informative; not all information is in the form of an article.

@fashezee my crawl numbers are similar to yours.

keyplyr

7:30 pm on Apr 24, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



We have over 7 million pages; I've noticed a deep crawl every 90 days or so.
Daily, we get 11K to 25K pages crawled. Would this be considered average?
The term "average" doesn't really apply, since all sites differ in a vast number of ways.

The ratio and frequency of crawl sound about right to me. However, as stated above, you may see Googlebot more or less often depending on the activity of those pages.