Googlebot is designed to be a good netizen, so it doesn't hog resources or slow a site down to the point of degrading visitors' experience, says Google's Gary Illyes in a blog post about Google's crawl budget.
The blog post goes on to describe what affects the crawl rate, including a site's performance. The crawl rate limit set in Search Console won't necessarily speed up a crawl, but it does let Googlebot know when you want crawling slowed down.
Popular URLs are crawled more often, and the system is designed to keep indexed URLs from going stale.
Factors affecting crawl budget
According to our analysis, having many low-value-add URLs can negatively affect a site's crawling and indexing. We found that the low-value-add URLs fall into these categories, in order of significance:
Faceted navigation and session identifiers
On-site duplicate content
Soft error pages
Hacked pages
Infinite spaces and proxies
Low quality and spam content
Google Crawl Budget - Googlebot Crawling Rates and Demands [webmasters.googleblog.com]
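To make the first two factors in the list above concrete, here is a minimal Python sketch of URL canonicalization: stripping session and tracking parameters collapses many crawlable URL variants into a single address. The parameter names are assumptions chosen for illustration, not a list Google publishes.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that often spawn duplicate URLs for the same content.
# These names are illustrative assumptions; real sites vary.
IGNORED_PARAMS = {"sessionid", "sid", "utm_source", "utm_medium", "utm_campaign", "sort"}

def canonicalize(url: str) -> str:
    """Strip session/tracking parameters so URL variants collapse to one form."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

variants = [
    "https://example.com/shoes?sessionid=abc123",
    "https://example.com/shoes?utm_source=mail&sort=price",
    "https://example.com/shoes",
]
# All three variants map to the same canonical URL.
print({canonicalize(u) for u in variants})  # {'https://example.com/shoes'}
```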
The post also includes an FAQ, much of which many will already know, such as whether crawl rate is a ranking factor (the short answer is no). Every URL a site serves counts against the crawl budget, including AMP URLs, embedded resources such as CSS and JavaScript, and the hops in long redirect chains.
The blog post is worth a quick read.
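As a quick way to see how long a redirect chain actually is, here is a small sketch using the requests library; the URL is hypothetical, and each hop it reports is one more fetch spent from the crawl budget.

```python
import requests  # assumption: the requests library is installed

def redirect_chain(url: str) -> list[str]:
    """Follow redirects and return every hop, so long chains are easy to spot."""
    resp = requests.get(url, allow_redirects=True, timeout=10)
    return [r.url for r in resp.history] + [resp.url]

# Hypothetical URL for illustration only.
chain = redirect_chain("http://example.com/old-page")
print(f"{len(chain) - 1} redirect(s): " + " -> ".join(chain))
```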
Additionally, this earlier document on Crawling and Indexing is still relevant.
[webmasters.googleblog.com...]