
Client tells me to ban Googlebot on all of his sites

     

catch2948

10:14 pm on May 12, 2006 (gmt 0)

10+ Year Member



It's rather sad, and an overreaction IMO, but a hosting client of mine asked me today to ban all Googlebot IPs from his sites (7 in all).

From a certain perspective, I can see his dilemma. For the last 2 months, his sites have gone well over their bandwidth allotments. But 95% of that traffic has been all Googlebot (one IP or another). As a matter of fact, from analysis of his log files (done at his request), it seems that the more Googlebot crawls his sites, the less traffic he gets from Google.
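
For anyone wondering how I pinned that down: with Apache's combined log format, the response size is the tenth field, so a rough per-agent tally is a one-liner (access.log stands in for whatever your host calls the raw log):

    # bandwidth served to anything identifying itself as Googlebot
    grep -i googlebot access.log | awk '{b += $10} END {print b " bytes"}'

    # everyone else, for comparison
    grep -iv googlebot access.log | awk '{b += $10} END {print b " bytes"}'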

I have reviewed all of his sites. No spam, all original content, not a lot of cross-linking ... he seems to be doing things right. But I was unable to explain to him why Googlebot had been crawling so much yet adding so few pages to the index.
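
For what it's worth, if I do end up blocking it, I'd match the user-agent rather than chase IPs, since Google doesn't publish a fixed list of them. A sketch for each site's .htaccess, assuming Apache with mod_setenvif:

    # deny any request whose User-Agent contains "Googlebot"
    SetEnvIfNoCase User-Agent "Googlebot" block_bot
    Order Allow,Deny
    Allow from all
    Deny from env=block_bot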

jdMorgan

12:05 am on May 13, 2006 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member



A better solution might be to restrict the areas or number of pages that Google can crawl using robots.txt or on-page meta robots tags.
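
For example, if most of the bandwidth is going to a big archive or printer-friendly section (hypothetical paths here; substitute his real directories), robots.txt can fence off just those areas:

    User-agent: Googlebot
    Disallow: /archive/
    Disallow: /print/

A meta robots tag (<meta name="robots" content="noindex,nofollow">) works per-page, but note that the page still has to be fetched once for the tag to be seen, so robots.txt is the bigger bandwidth saver.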

Also, make sure that his Last-Modified, Expires, and Cache-control response headers are configured correctly.
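
Last-Modified is the big one for bandwidth: when it's sent correctly, Googlebot can make conditional (If-Modified-Since) requests, and the server can answer with a short 304 Not Modified instead of re-sending the whole page. Apache handles that automatically for static files; for expiry hints, a sketch assuming mod_expires is available:

    # .htaccess -- lifetimes are examples; match them to how often content really changes
    ExpiresActive On
    ExpiresByType text/html "access plus 1 day"
    ExpiresByType image/gif "access plus 1 month"
    ExpiresByType text/css  "access plus 1 week"

mod_expires sets the matching Cache-Control max-age headers for you at the same time.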

This kind of suicide is usually preventable...

Jim

catch2948

3:30 am on May 13, 2006 (gmt 0)

10+ Year Member



Thanks for the advice!

I really don't want to do this; he's just upset about the extra bandwidth charges. Maybe I'll give him a break on those.

Phil_Payne

9:05 am on May 13, 2006 (gmt 0)

10+ Year Member



There are people who'd metaphorically give their right arm to see the Googlebot every now and then. Even once a month would do.

I'm happy that Google presents search results as it wishes - but I find it hypocritical of Google to make such judgements about a site without even spidering it every now and then.

The data it has about some of my sites is months old.

vincevincevince

9:07 am on May 13, 2006 (gmt 0)

WebmasterWorld Senior Member vincevincevince is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Remember that the customer is always right. At the end of the day, it is your right to charge for extra bandwidth, and his right to ask that GoogleBot be banned.

kaled

9:40 am on May 13, 2006 (gmt 0)

WebmasterWorld Senior Member kaled is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Remember that the customer is always right.

The customer is very often wrong...

If a client has a problem, he/she will often have a suggestion for a solution based on inadequate knowledge and/or understanding of the issues. Ultimately, you may have to implement a client's request, but if you believe it to be wrong, you should always brief the client on the alternatives first.

Kaled.

ronburk

6:10 pm on May 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Have you tried a sitemap that specifies that those pages aren't changing nearly as frequently as Googlebot is crawling them?
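
Something along these lines, using Google's sitemap schema (example.com, the path, and the dates are placeholders). The changefreq value is only a hint, but it gives Googlebot a documented reason to back off pages that rarely change:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
      <url>
        <loc>http://www.example.com/archive/widgets.html</loc>
        <lastmod>2006-01-15</lastmod>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>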

tedster

6:16 pm on May 13, 2006 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



If you haven't already tried this, I would send Google a report about it. There could be a technical problem with their crawler. Although it's focused more on request rate than total bandwidth, I think this form would be a good place to start:

Googlebot Trouble Report [google.com]

[edited by: tedster at 6:26 pm (utc) on May 13, 2006]

trillianjedi

6:21 pm on May 13, 2006 (gmt 0)

WebmasterWorld Senior Member trillianjedi is a WebmasterWorld Top Contributor of All Time 10+ Year Member



95% of that traffic has been all Googlebot

Assuming his hosting doesn't have some antiquated low bandwidth allocation, that would be indicative of a more fundamental problem.

Are you getting Googlebot stuck in a crawler trap? Session IDs, etc.?
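
If it is session IDs, two quick fixes: stop them from appearing in URLs at all (for PHP, the php.ini settings below; other platforms have equivalents), and keep Googlebot away from the parameterised URLs in the meantime. Googlebot honours * wildcards in robots.txt as a Google extension; PHPSESSID is just PHP's default name, so substitute whatever his sites actually use.

    ; php.ini
    session.use_trans_sid = 0
    session.use_only_cookies = 1

    # robots.txt
    User-agent: Googlebot
    Disallow: /*?PHPSESSID=
    Disallow: /*&PHPSESSID=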

TJ

 
