
New 'Custom Crawl Rate' Option in Google Webmaster Tools

     
2:40 am on Dec 6, 2008 (gmt 0) - tedster, Senior Member

Google has now introduced a "custom crawl rate" option in Webmaster Tools.

The custom crawl rate option allows you to provide Googlebot insight into the maximum number of requests per second and the number of seconds between requests that you feel are best for your environment...

Googlebot determines the range of crawl rate values you'll have available in Webmaster Tools. This is based on our understanding of your server's capabilities. This range may vary from one site to another and across time based on several factors.

Webmaster Central Blog [googlewebmastercentral.blogspot.com]
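
To make those two numbers concrete - this is just my own illustration, not anything from Google's documentation - "requests per second" and "seconds between requests" are simply two ways of stating the same rate. A quick Python sketch:

# My own illustration: the two values shown on the crawl-rate slider are
# reciprocals of each other, so picking one fixes the other.
def crawl_rate(requests_per_second):
    seconds_between_requests = 1.0 / requests_per_second
    pages_per_day = requests_per_second * 86400  # theoretical ceiling if sustained all day
    return seconds_between_requests, pages_per_day

for rate in (0.5, 2.0, 10.0):
    delay, per_day = crawl_rate(rate)
    print("%.1f req/sec = one request every %.1f sec = about %d fetches/day" % (rate, delay, per_day))

So a setting of 0.5 requests per second, for example, works out to one Googlebot fetch roughly every 2 seconds.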

2:58 am on Dec 6, 2008 (gmt 0) - tedster, Senior Member

Apparently this custom crawl rate is only available for root-level sites - not for sites hosted on a large domain like Blogspot. Google says they assign special settings in that type of case.
1:59 pm on Dec 6, 2008 (gmt 0) - Junior Member

I am not on Blogspot and it won't allow me to change the crawl rate. The message I get is "your site has been assigned special crawl rate settings." Perhaps this is because my site is relatively new.
7:05 pm on Dec 6, 2008 (gmt 0) - Junior Member

I put my crawl rate at 200% more than the default. I saw that it goes up to 900%.
7:55 pm on Dec 6, 2008 (gmt 0) - Preferred Member

Pretty cool how it is dynamic. One of my sites has a peak of 10 pages per second, whereas another has a peak of 0.5 pages per second.
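
To put those peaks in perspective (my own back-of-the-envelope arithmetic, assuming the peak rate were somehow sustained all day, which it isn't): 10 pages per second would be about 10 × 86,400 ≈ 864,000 fetches in 24 hours, while 0.5 pages per second is about 43,200.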
11:55 pm on Dec 6, 2008 (gmt 0) - Preferred Member

Has anyone noticed how long it takes for the change to take effect? I increased the rate for one of my sites, but I do not see a change in the Googlebot crawl rate on the site.
1:22 am on Dec 7, 2008 (gmt 0) - Preferred Member

I can answer my own question now: it appears to take about 2 hours.

This is a great feature.

What would make it even better is if Google allowed throttling by time of day. I would love to throttle up to the max rate during the overnight hours and bring it back to normal during the day. That would be great.
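
In the meantime - and this is purely my own untested sketch, not anything Google offers - one could approximate time-of-day throttling on the server side, since Googlebot backs off when it receives 503 responses. Something along these lines for a Python/WSGI site (hypothetical app, adjust the hours to your own peak period), with the caveat that serving 503s too aggressively could reduce how much of the site gets crawled:

# Hypothetical sketch, not a Google feature: slow Googlebot down during busy
# daytime hours by answering 503 (Service Unavailable), which crawlers treat
# as "come back later". Use with care - too many 503s can hurt coverage.
from datetime import datetime

PEAK_HOURS = range(9, 18)  # 9:00-17:59 server time, adjust to taste

def throttle_googlebot(app):
    def middleware(environ, start_response):
        ua = environ.get('HTTP_USER_AGENT', '')
        if 'Googlebot' in ua and datetime.now().hour in PEAK_HOURS:
            start_response('503 Service Unavailable',
                           [('Retry-After', '3600'),
                            ('Content-Type', 'text/plain')])
            return [b'Crawling throttled during peak hours.']
        return app(environ, start_response)
    return middleware

Wrapping an existing WSGI application would then just be application = throttle_googlebot(application).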

7:13 pm on Dec 7, 2008 (gmt 0) - Senior Member

I don't understand the value in this...

"However, setting it to a higher value than the default won't improve your coverage"

7:27 pm on Dec 7, 2008 (gmt 0) - Senior Member

Set it to 900% and the yo-yo effect will probably go wild :)
3:54 pm on Feb 17, 2009 (gmt 0) - Junior Member

>>>Set it to 900% and the yo-yo effect will probably go wild :)

Do you mean we have to avoid it, or what?

What does this crawl rate actually mean? I increased it to 0.5 requests per second but see no change in site traffic. However, in Google Webmaster Tools I can see a lot of "network unreachable" errors, even though my hosting company says the site is up 24 hours a day.

So please help me decide what to do:

Should I change the crawl rate?

What positive and negative effects does 0.5 requests per second have?

And where can I get full details regarding this feature?

If you can give details here, I will be thankful.