From a certain perspective, I can see his dilemma. For the last two months, his sites have gone well over their bandwidth allotments, and 95% of that traffic has been Googlebot (from one IP or another). In fact, from an analysis of his log files (done at his request), it appears that the more Googlebot crawls his sites, the less traffic he gets from Google.
I have reviewed all of his sites: no spam, all original content, not a lot of cross-linking. It seems he is doing things right, but I was unable to explain to him why Googlebot had been crawling so much yet adding so few pages to the index. A rough way to quantify the pattern from the logs is sketched below.
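For anyone who wants to run the same kind of check on their own logs, here is a minimal sketch. It assumes the standard Apache/NCSA combined log format; the log path is a placeholder, and matching Googlebot by user-agent string alone is only approximate (real verification needs a reverse-DNS check, since anyone can fake the UA).

#!/usr/bin/env python3
"""Tally Googlebot crawl hits and bandwidth vs. Google-referred visits, per day."""
import re
from collections import defaultdict

LOG_PATH = "access.log"  # placeholder - point this at your server's log

# Combined log format:
# host ident user [date] "request" status bytes "referer" "agent"
LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+):[^\]]+\] "[^"]*" \d+ (?P<bytes>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

crawl_hits = defaultdict(int)   # Googlebot requests per day
crawl_bytes = defaultdict(int)  # bandwidth consumed by Googlebot
referrals = defaultdict(int)    # visitors arriving from Google search

with open(LOG_PATH) as log:
    for raw in log:
        m = LINE.match(raw)
        if not m:
            continue
        day = m.group("day")
        if "Googlebot" in m.group("agent"):
            crawl_hits[day] += 1
            if m.group("bytes").isdigit():  # bytes field is "-" on 304s etc.
                crawl_bytes[day] += int(m.group("bytes"))
        elif "google." in m.group("referer"):
            referrals[day] += 1

# Note: days sort as strings here, not chronologically - fine for eyeballing.
for day in sorted(crawl_hits):
    print(f"{day}: {crawl_hits[day]} crawl hits, "
          f"{crawl_bytes[day] / 1e6:.1f} MB, "
          f"{referrals[day]} Google referrals")

If the crawl-hit and MB columns climb while the referral column stays flat or falls, that matches the pattern described above.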
Also, make sure that his Last-Modified, Expires, and Cache-Control response headers are configured correctly. With proper headers, Googlebot can send conditional (If-Modified-Since) requests and get cheap 304 Not Modified responses instead of re-downloading every page, which cuts the bandwidth bill considerably.
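A quick way to see what the server is actually sending, using only the Python standard library (the URL is a placeholder, substitute one of the affected pages):

#!/usr/bin/env python3
"""Print the caching-related headers a server returns for one URL."""
from urllib.request import Request, urlopen

url = "http://www.example.com/"  # placeholder - use one of the affected pages

# HEAD request: we only want the headers, not the body
req = Request(url, method="HEAD")
with urlopen(req) as resp:
    for name in ("Last-Modified", "Expires", "Cache-Control", "ETag"):
        print(f"{name}: {resp.headers.get(name, '(not set)')}")

If Last-Modified and ETag both come back "(not set)", the server is forcing Googlebot to pull the full page every visit.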
This kind of search-engine suicide is usually preventable...
I'm happy for Google to present search results as it wishes - but I find it hypocritical of Google to make such judgements about a site without even spidering it every now and then.
The data it has about some of my sites is months old.
Remember that the customer is always right.
If a client has a problem, he or she will often suggest a solution based on an inadequate knowledge or understanding of the issues. Ultimately you may have to implement the client's request, but if you believe it to be wrong, you should always brief the client on the alternatives first.
Googlebot Trouble Report [google.com]