A lot of bot runners somehow think they'll get past block filters if they include Googlebot in their UA string. IMO this is self-destructive and just the opposite of how most site owners filter.
For example, I block anything calling itself Googlebot unless it comes from a valid Googlebot crawl IP range. So the above UA was actually blocked because it claimed to be Googlebot... however, it would also have been blocked because the request came from an AWS range. While I currently allow a couple dozen UAs from AWS ranges, this is not one of them.
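To make the idea concrete, here's a rough Python sketch of that rule. It's not my actual setup: the sample networks and the AWS allow list are placeholders, and in practice you'd refresh the real ranges from Google's published googlebot.json and AWS's ip-ranges.json rather than hard-coding them.

```python
import ipaddress

# Placeholder data -- the real lists come from Google's published
# googlebot.json and AWS's ip-ranges.json, and both change over time.
GOOGLEBOT_RANGES = [ipaddress.ip_network("66.249.64.0/19")]  # sample only
AWS_RANGES = [ipaddress.ip_network("52.95.0.0/16")]          # sample only

# Hypothetical allow list standing in for the "couple dozen" UAs
# permitted from AWS ranges.
ALLOWED_AWS_UAS = {"SomePermittedBot/1.0"}

def verdict(user_agent: str, remote_ip: str) -> str:
    """Apply the rule described above: anything calling itself Googlebot
    is blocked unless the IP falls in a genuine Googlebot crawl range,
    and AWS-hosted requests are blocked unless the UA is on the allow list."""
    ip = ipaddress.ip_address(remote_ip)
    if "googlebot" in user_agent.lower():
        if not any(ip in net for net in GOOGLEBOT_RANGES):
            return "block"   # fake Googlebot
    if any(ip in net for net in AWS_RANGES) and user_agent not in ALLOWED_AWS_UAS:
        return "block"       # AWS-hosted UA not on the allow list
    return "allow"
```

The same check can just as easily be done with reverse-plus-forward DNS verification instead of range lists; the point is only that a Googlebot UA by itself buys the bot nothing.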
Dig up an old IANA list and you'll find huge numbers of /8s allocated to single corporations (Microsoft, poor dears, missed the land grab) and governmental entities (not just US govt but all over the place). Bit by bit, almost all of these are now getting apportioned elsewhere as their owners realize they don't actually need 2^24 external IP addresses ... and what the previous owners don't want, AWS does.
Not necessarily; I was referring to the "AWS keeps grabbing" point earlier. Many of those new-to-us AWS ranges are IPs we never set eyes on until recently, because they were allocated to single organizations who, well, didn't use them. I first noticed it with Merck some years back, and now plenty of others are following suit.