Forum Library, Charter, Moderators: goodroi

Sitemaps, Meta Data, and robots.txt Forum

How to set Slow GoogleBot Crawl Speed
I need Google to crawl my site very slowly, or only at specific times. How?

 12:32 pm on Jun 24, 2008 (gmt 0)

I need Google to crawl my site (a 3-million-page portal) very slowly, or only at a specific time or date. How can I do that?
I have a big issue with Googlebot crawling: it's killing my site's performance. Every day I get a huge number of requests from Google, crawling from different IPs (data centers).

I need Google for organic results and traffic, but at the same time I need to control the spider's requests. I tried Webmaster Tools >> Tools >> Set Crawl Rate >> Slow option, but there was no effect…

Kindly let me know if there is any other way to control Googlebot.
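For completeness: robots.txt has a Crawl-delay directive, but Googlebot ignores it — for Google the only supported knob at the time was the Webmaster Tools crawl-rate setting. Crawl-delay does help with some other major crawlers, e.g. (a hypothetical sketch):

```
# Hypothetical robots.txt fragment. Googlebot does NOT honor Crawl-delay;
# it only helps with crawlers that support it (e.g. Yahoo! Slurp, msnbot).
User-agent: Slurp
Crawl-delay: 10

User-agent: msnbot
Crawl-delay: 10
```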



 12:46 pm on Jun 24, 2008 (gmt 0)

Welcome to WebmasterWorld, Senthil!

Are you sure it is Google? Some people change their browser's user agent to Googlebot, and this is also a common practice among data miners. Be careful not to rely on the user agent alone: confirm that the different IPs are actually owned by Google.

Also, Google has multiple bots — one for search, one for AdSense, etc. If you are participating in multiple Google programs, you should expect these bots to visit your pages, which will drive up the amount of bot traffic on your site.
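Verifying that the IPs really belong to Google can be done with the reverse-plus-forward DNS check Google itself documents. A minimal sketch using only the Python standard library (function name is illustrative):

```python
import socket

def is_real_googlebot(ip):
    """Check whether an IP claiming to be Googlebot actually belongs to Google."""
    # Step 1: reverse DNS — genuine Googlebot IPs resolve to a hostname
    # ending in .googlebot.com or .google.com
    try:
        host = socket.gethostbyaddr(ip)[0]
    except (socket.herror, socket.gaierror):
        return False
    if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
        return False
    # Step 2: forward-confirm — the hostname must resolve back to the same IP,
    # otherwise the reverse record could be spoofed
    try:
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
```

A spoofed user agent fails step 1 because its IP reverse-resolves to some other network (or not at all).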


 1:08 pm on Jun 24, 2008 (gmt 0)

Hi goodroi,

I am very sure it is the Googlebot spider, and all of the IPs are owned by Google Inc.

When I blocked a Googlebot IP, a different Googlebot IP swiftly began crawling my site, and when I removed the block, the Google spider returned almost immediately. I believe the reason is the site's well-designed link structure (not spam). Is there any other way to control it?


 12:56 pm on Jun 25, 2008 (gmt 0)

You are already doing it the proper way: using Google Webmaster Tools. It can take some time to see results.

An unorthodox approach, which I do not recommend, is to return a 503 for every 4th or 5th request from Googlebot. It would lower the bandwidth used, but it might also create new problems.
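The 503 idea above could be sketched as a small WSGI middleware — a hypothetical illustration of the technique, not a recommended implementation (class and parameter names are invented):

```python
import itertools

class GooglebotThrottle:
    """Return 503 for every Nth request whose user agent claims to be Googlebot."""

    def __init__(self, app, every=5):
        self.app = app            # the wrapped WSGI application
        self.every = every        # reject every Nth Googlebot request
        self.counter = itertools.count(1)

    def __call__(self, environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        if "Googlebot" in ua and next(self.counter) % self.every == 0:
            # Retry-After hints when the crawler should come back
            start_response("503 Service Unavailable",
                           [("Retry-After", "3600"),
                            ("Content-Type", "text/plain")])
            return [b"Crawl rate throttled, please retry later"]
        return self.app(environ, start_response)
```

Note this only counts requests that self-identify as Googlebot; combined with the IP verification above, it would avoid punishing real users with spoofed agents.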


 6:07 am on Jun 26, 2008 (gmt 0)

Yes, you are right…

But my proposal is (I hope it's a valid one):

Google should offer bot crawling at a specific time or on a specific day. Such an option would help a lot of multi-million-page sites and their webmasters/SEOs…

Why can't we recommend this to Matt Cutts/Google? And how can we recommend it?


 7:54 am on Aug 31, 2008 (gmt 0)

Is there any update on this?





WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved