Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

Client tells me to ban Googlebot on all of his sites

 10:14 pm on May 12, 2006 (gmt 0)

It's rather sad, and overreacting IMO, but a hosting client of mine asked me today to ban all Googlebot IPs from his sites (7 in all).

From a certain perspective, I can see his dilemma. For the last 2 months, his sites have gone well over their bandwidth allotments, and 95% of that traffic has been Googlebot (one IP or another). In fact, from analysis of his log files (done at his request), it seems that the more Googlebot crawls his sites, the less traffic he gets from Google.

I have reviewed all of his sites. No spam, all original content. Not a lot of cross-linking ... Seems like he is doing things right. But I was unable to explain to him why Googlebot had been crawling so much, yet adding so few pages to the index.
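For what it's worth, if the client insists on a full ban, a user-agent block at the server level is more reliable than chasing individual IPs, since Google crawls from many addresses. A rough sketch, assuming Apache with mod_setenvif (the directive names are standard Apache, but the exact setup depends on his hosting):

```
# .htaccess — deny requests whose User-Agent contains "Googlebot"
SetEnvIfNoCase User-Agent "Googlebot" block_bot
Order Allow,Deny
Allow from all
Deny from env=block_bot
```

Note that a well-behaved bot can be turned away more politely with robots.txt, and that a hard 403 ban like this will eventually drop the sites from the index entirely.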



 12:05 am on May 13, 2006 (gmt 0)

A better solution might be to restrict the areas or number of pages that Google can crawl using robots.txt or on-page meta robots tags.
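For example, a robots.txt that keeps Googlebot out of the heavy, low-value sections while leaving the rest crawlable might look like this (the paths here are purely hypothetical — substitute whatever his bandwidth hogs actually are):

```
# robots.txt — restrict Googlebot to the important sections
User-agent: Googlebot
Disallow: /archive/
Disallow: /print/

# All other crawlers unrestricted
User-agent: *
Disallow:
```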

Also, make sure that his Last-Modified, Expires, and Cache-control response headers are configured correctly.
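On Apache, assuming mod_expires and mod_headers are available, those headers can be set in .htaccess — the lifetimes below are illustrative, not recommendations:

```
# .htaccess — example caching headers (adjust lifetimes to the content)
ExpiresActive On
ExpiresByType text/html "access plus 1 day"
ExpiresByType image/gif "access plus 1 month"
Header set Cache-Control "max-age=86400, public"
```

Correct Last-Modified handling also lets the server answer conditional (If-Modified-Since) requests with a cheap 304 instead of resending the full page, which directly cuts crawl bandwidth.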

This kind of suicide is usually preventable...



 3:30 am on May 13, 2006 (gmt 0)

Thanks for the advice!

I really don't want to do this, as he is just upset about the extra bandwidth charges. Maybe I'll try to give him a break.


 9:05 am on May 13, 2006 (gmt 0)

There are people who'd metaphorically give their right arm to see Googlebot every now and then. Even once a month would do.

I'm happy that Google presents search results as it wishes - but I find it hypocritical of Google to make such judgements about a site without even spidering it every now and then.

The data it has about some of my sites is months old.


 9:07 am on May 13, 2006 (gmt 0)

Remember that the customer is always right. At the end of the day, it is your right to charge for extra bandwidth, and his right to ask that Googlebot be banned.


 9:40 am on May 13, 2006 (gmt 0)

Remember that the customer is always right.

The customer is very often wrong...

If a client has a problem, he/she will often have a suggestion for a solution based on inadequate knowledge and/or understanding of the issues. Ultimately, you may have to implement a client's request/suggestion but if you believe it to be wrong you should always brief the client on alternatives first.



 6:10 pm on May 13, 2006 (gmt 0)

Have you tried a sitemap that specifies that those pages aren't changing nearly as frequently as Googlebot is crawling?
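Something along these lines — a Sitemap entry per the sitemaps.org schema, with lastmod and changefreq signalling that the page rarely changes (URL and dates are placeholders; changefreq is only a hint, which Googlebot may or may not honor):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/about.html</loc>
    <lastmod>2006-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```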


 6:16 pm on May 13, 2006 (gmt 0)

If you haven't already tried this, I would send Google a report about it. There could be a technical problem with their crawler. Although it's more focused toward speed of requests than total bandwidth, I think this form would be a good place to start:

Googlebot Trouble Report [google.com]

[edited by: tedster at 6:26 pm (utc) on May 13, 2006]


 6:21 pm on May 13, 2006 (gmt 0)

95% of that traffic has been all Googlebot

Assuming his hosting doesn't have some antiquated low bandwidth allocation, that would be indicative of a more fundamental problem.

Is Googlebot getting stuck in a crawler trap? Session IDs, etc.?
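If session IDs in URLs turn out to be the culprit (every crawl visit minting fresh URLs for the same pages), one stopgap is to keep crawlers out of those URL variants via robots.txt. A sketch, assuming the parameter is called "sessionid" — note the * and ? pattern matching is a Googlebot extension, not part of the original robots.txt standard:

```
# robots.txt — keep crawlers out of session-ID URL variants (hypothetical parameter name)
User-agent: Googlebot
Disallow: /*?sessionid=
```

The cleaner fix is to stop issuing session IDs to bots in the first place, so each page has one stable URL.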


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved