Forum Moderators: Robert Charlton & goodroi

Googlebot's crawl rate in WebmasterTools

Broadway

9:18 pm on May 17, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



When I look at the crawl-rate data in WMT, it says Google crawls 300 pages/day. My website is 400 pages.

Does this mean (pretty much) that Google looks at my entire site and evaluates pages once every two or three days? Or is some lesser evaluation going on?

Also, regarding the ExpiresDefault directive in my .htaccess file: if it is set to a week (including HTML), yet Google comes more often than that, does Google pay attention to the directive, or does it still go ahead and evaluate the page even though its return visit is sooner than the directive allows?
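For reference, a week-long ExpiresDefault like the one described above would look something like this in .htaccess (a sketch assuming mod_expires is enabled on the server):

```apache
# Requires mod_expires; the <IfModule> guard avoids a 500 error if it isn't loaded.
<IfModule mod_expires.c>
    ExpiresActive On
    # Mark every resource, HTML included, as fresh for one week after access.
    ExpiresDefault "access plus 1 week"
</IfModule>
```

Note that Expires headers are only a freshness hint to caches and crawlers; Googlebot may still choose to revisit a page sooner than the header suggests.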

tedster

1:06 pm on May 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



For some sites, 30% of those 300 pages per day might be re-crawls of the same handful of pages, but most of the time your assumption is pretty much right.

WRT the ExpiresDefault directive, it's a bit server intensive to lean on it as your first line of action. As far as I know, the really heavy lifting should be done by a 304 response to an If-Modified-Since request.
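The conditional-request mechanism mentioned above can be sketched as follows. This is a minimal illustration, not Googlebot's actual logic: the crawler sends the page's last known timestamp in an If-Modified-Since header, and the server replies 304 with no body when nothing has changed, so only the headers travel over the wire.

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def conditional_response(resource_mtime, if_modified_since=None):
    """Decide how to answer a (possibly conditional) GET.

    resource_mtime    -- timezone-aware datetime when the page last changed.
    if_modified_since -- raw If-Modified-Since header value, or None.
    Returns (status_code, send_body).
    """
    if if_modified_since:
        try:
            client_time = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            client_time = None  # malformed header: ignore it
        if client_time is not None and resource_mtime <= client_time:
            # Unchanged since the crawler's last visit:
            # 304 Not Modified, no body, minimal server work.
            return 304, False
    # Page is newer, or the request was unconditional: send the full page.
    return 200, True
```

For example, a page last modified on May 10 requested with `If-Modified-Since: Tue, 17 May 2011 00:00:00 GMT` gets a 304, while a page modified on May 20 gets a fresh 200.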

g1smd

1:41 pm on May 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm not sure whether "300 pages" actually means 300 "different" pages or simply "300 fetches including repetition". I suspect the latter, as I have on occasion seen it report more than the actual number of pages the site has.