Setting up a googlebot crawl rate outside of GWT

Sgt_Kickaxe

2:03 pm on Oct 17, 2011 (gmt 0)



I'd like to prevent massive spikes in Googlebot crawling activity, and I'm not satisfied with using Google Webmaster Tools to achieve this: Googlebot activity increased dramatically not long before I was Panda-smacked on many transactional pages.

How would I best set up a limit that, once exceeded over any given period, would deny access without causing (too many) ranking problems? Has anyone done this successfully? If so, how?

goodroi

2:59 pm on Oct 18, 2011 (gmt 0)




Throttling Googlebot tends to be risky. If you block Googlebot at the server level, it can look like your website is offline, and Google doesn't want to send users to a site it thinks no longer exists. If you slow your page speed instead, Google may conclude you have a very slow website and likewise stop sending users to it.
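If you do decide to throttle anyway, the least damaging signal is an HTTP 503 with a Retry-After header, which Google treats as a temporary condition (sustained 503s will still get pages dropped eventually). Below is a minimal sketch as Python WSGI middleware; the budget numbers are assumptions picked for illustration, and a production setup should verify Googlebot by reverse DNS rather than trusting the user-agent string.

import time

class GooglebotThrottle:
    # WSGI middleware sketch: cap Googlebot requests per fixed window,
    # answering 503 + Retry-After once the budget is spent.
    def __init__(self, app, max_hits=600, window_seconds=60):
        self.app = app
        self.max_hits = max_hits            # assumed budget per window
        self.window_seconds = window_seconds
        self.window_start = time.time()
        self.hits = 0

    def __call__(self, environ, start_response):
        if 'Googlebot' in environ.get('HTTP_USER_AGENT', ''):
            now = time.time()
            if now - self.window_start > self.window_seconds:
                self.window_start, self.hits = now, 0
            self.hits += 1
            if self.hits > self.max_hits:
                # 503 marks this as temporary; a hard block or a
                # dropped connection looks like a dead site instead.
                start_response('503 Service Unavailable',
                               [('Retry-After', '120'),
                                ('Content-Type', 'text/plain')])
                return [b'Over crawl budget, retry later.\n']
        return self.app(environ, start_response)

Wrap your app as GooglebotThrottle(app). Note the counter lives per process, so under a multi-worker server the effective budget is multiplied by the worker count.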

Another option is to look at which pages you are allowing to be indexed. If you have a million-page site, I doubt all million pages are valuable. You may want to block access to the low-quality ones; that reduces crawl activity and also makes your site appear stronger to Google.
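For example, a robots.txt along these lines keeps all well-behaved crawlers out of thin or duplicate sections (the paths are placeholders, not a recommendation for any particular site):

User-agent: *
Disallow: /search/
Disallow: /tag/
Disallow: /print/

Keep in mind robots.txt stops crawling, not indexing: pages that are already indexed or linked from elsewhere may need a noindex robots meta tag instead, and Googlebot has to be able to crawl a page to see that tag.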