Not sure what my point is, but I'm having to upgrade my server as a result of Google.
I've always prided myself on having a site that serves instantly, and I've generally thrown whatever hardware, hosting, and coding it took at it to make that happen.
But lately my server's been lagging - to the point where it's noticeable. Some digging revealed the culprit is a friend's site I host. His site has a huge, ever-growing database, and he's gotten busier over the years. We determined it was his site slowing the server down. There's one particular monster query his site runs, and it's on a dynamic page, so there are a lot of combinations that generate it. One big query being hit all the time is dragging the whole server down.
The page is for an informational lookup. My immediate reaction was to think scraper and go looking for an IP to block. Which, sure enough, we found - except the IP belongs to Googlebot.
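If anyone else goes down this road: before blaming Googlebot, it's worth confirming the IP really is Google and not a scraper spoofing the user agent. Google's own advice is a reverse DNS lookup followed by a forward lookup to make sure it round-trips. A rough sketch in Python - the example IP at the bottom is just one from a range Google has used, not necessarily one you'll see in your own logs:

import socket

def is_googlebot(ip):
    # Reverse-resolve the IP; a real Googlebot host ends in googlebot.com or google.com
    try:
        host = socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return False
    if not host.endswith(('.googlebot.com', '.google.com')):
        return False
    # Forward-resolve the hostname and make sure it maps back to the same IP
    try:
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False

print(is_googlebot('66.249.66.1'))  # example IP from Google's published crawl range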
Yep, the dots all connect. I need a new server because it's slow. It's slow due to a specific type of query, and the vast majority of hits for that query are coming from Google. If Googlebot went away, I wouldn't need a new server because my server would still be fast.
There are no alternatives (we've tuned MySQL etc., looked at the query, considered tinkering with Google; none are appropriate long-term fixes), so a new server it is...because of Google.
Worse, it's cyclic - my sites have gotten slower as a result of this, so if Google actually uses site speed in its algo, then I'm surely getting smacked.
In the end, it's still profitable. But is there going to be a point where people realize the costs Googlebot is creating for them and try to do something about it? My friend actually suggested we exclude Google from most of the site via robots.txt (particularly all the long tail data they're scraping from the site). I advised against it - but the fix is going to cost thousands, and not everyone is going to make the same decision.
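For reference, the exclusion he was talking about is only a couple of lines of robots.txt - the path here is made up, but it would look something like this:

User-agent: Googlebot
Disallow: /lookup/

Cheap to do, but pages blocked there stop getting crawled and tend to drop out of all those long tail searches, which is presumably where a good chunk of the site's traffic comes from - that's exactly why I advised against it.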