onlinesource, forgive a hasty but possibly long reply. My initial take on what you're asking... as I'm understanding your question and the documentation you link to... is that "geo-distributed crawling" shouldn't apply to you at all.
Here's the help article you link to....
Locale-aware crawling by Googlebot [support.google.com...]
Here's the first paragraph of text in the article (my emphasis added)...
This article describes how Google uses different crawl settings for sites that cannot have separate URLs for each locale.
In the situation you describe, you apparently have separate domains, each with its own ccTLD, for each geo-location, so the single-URL condition for "Locale-aware crawling" isn't the case here.
Google divides "Locale-aware" crawling into two basic situations: "Geo-distributed crawling" and "Language-dependent crawling". These apply when Google spots certain "signals and hints", as the article describes them... specifically when it sees...
- "different content on the same URL - based on the user's perceived country (geolocation)"
- or when you are "serving different content on the same URL - based on the Accept-Language field set by the user's browser in the HTTP request header"
...etc
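To make that second trigger concrete, here's a rough sketch of the "same URL, different content" pattern based on Accept-Language. Everything here is illustrative (the function name, the supported languages, the simplified parsing that ignores q-values) - it's just the general shape of what Google means, not anything from the article:

```python
# Illustrative sketch: picking a content language from the browser's
# Accept-Language header while serving it all on one URL. This is the
# kind of setup that triggers "Language-dependent crawling".

def pick_locale(accept_language, supported=("en", "fr", "de"), default="en"):
    """Return the first supported language found in an Accept-Language
    header, falling back to a default. (Real code would honor q-values.)"""
    for part in accept_language.split(","):
        lang = part.split(";")[0].strip().lower()  # drop any ";q=..." weight
        base = lang.split("-")[0]                  # "fr-CA" -> "fr"
        if base in supported:
            return base
    return default

print(pick_locale("fr-CA,fr;q=0.9,en;q=0.8"))  # fr
```

The point is that the URL never changes - only the response body does - which is exactly why Googlebot needs to send different Accept-Language headers to see all the variants.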
Regarding geo-location, the article IMO is slightly unclear because of the tenses used... ie...
Googlebot uses well-established IP addresses that appear to come from the United States. With geo-distributed crawling, Googlebot can now use IP addresses that appear to come from other countries, such as Australia.
But the Googlebot situation, as I understand it, has been changing, and it might be clearer if Google said something like...
Googlebot has up until now used well-established IP addresses that appear to come from the United States. With geo-distributed crawling, Googlebot can now use IP addresses that appear to come from other countries, such as Australia.
There may be some technicality why Google doesn't say that, but that's the way I understand the situation.
Regarding your setup of 301 redirecting to different ccTLD sites using some sort of Geo IP module... I myself would never use IP ranges to set important user preferences. Opinions on this vary, and I'll leave it for others to discuss. We have some recent discussions on the topic, but it's too late at night to hunt for them.
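For clarity, here's roughly the pattern I understand you to be describing - and advising against. The lookup function, domains, and mapping are all invented stand-ins for whatever your Geo IP module actually does:

```python
# Rough sketch of a GeoIP-driven 301 redirect setup (the pattern being
# discussed, not a recommendation). geoip_country() stands in for a real
# GeoIP lookup; the ccTLD mapping is made up for illustration.

CCTLD_SITES = {
    "AU": "https://example.com.au",
    "DE": "https://example.de",
}

def redirect_target(client_ip, geoip_country):
    """Return a 301 target for the visitor's country, or None to serve
    the current site as-is."""
    country = geoip_country(client_ip)
    return CCTLD_SITES.get(country)

# With a fake lookup that says the visitor is Australian:
print(redirect_target("203.0.113.5", lambda ip: "AU"))  # https://example.com.au
```

The trouble with this pattern is exactly what's described below: the IP range decides for the user, and a crawler fetching from a US IP gets bounced like everyone else.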
IP-based redirecting to a different URL is arguably a type of cloaking. I was initially somewhat surprised that the redirects hadn't gotten you into some trouble, but as I think about it, chances are that your separate ccTLD sites are ranking on their own, and that Google is allowing the 301 redirects because Google is seeing what the user is seeing. The ccTLDs, though, should essentially be allowing the foreign sites to rank in their own territories... and the 301s shouldn't be necessary.
Your description sounds as though, without the redirects, you'd probably be very close to the setup that Google recommends...
IMPORTANT: We continue to support and recommend using separate locale URL configurations and annotating them with rel=alternate hreflang annotations.
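To illustrate what that recommendation looks like in practice, here's a small sketch that builds the rel="alternate" hreflang link tags for a set of locale URLs. The domains and locale codes are invented examples, not your actual sites:

```python
# Hypothetical sketch of the recommended setup: separate locale URLs,
# each annotated with rel="alternate" hreflang tags pointing at the
# other versions. Domains and locales below are invented.

def hreflang_links(locale_urls):
    """Build the <link rel="alternate" hreflang=...> tags that would go
    in the <head> of each locale's pages."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in locale_urls.items()
    )

print(hreflang_links({
    "en-au": "https://example.com.au/",
    "de-de": "https://example.de/",
}))
```

With annotations like these on each ccTLD site, Google can sort out which version belongs in which territory without any redirects being involved.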
I suggest simply dropping the Geo IP redirect module, and using flags or text links to let the users manually choose their language and geo preferences, if they should need to do that. Without the 301s, btw, I'm guessing that you could Fetch as Googlebot without a problem.
I should add, btw, that the above are rushed thoughts, and you should check out the linking situations and hreflang configurations of each of your ccTLD sites before making any changes.