"We've detected that Googlebot is limiting the rate at which it crawls pages on your site to ensure it doesn't use too much of your server's resources. If your server can handle additional Googlebot traffic, we recommend that you choose Faster below."
Is this just a generic message, or could Googlebot think some of my pages are taking too long to load, so it's slowing down the crawl rate?
In the last 30 days I've noticed a real drop in my referrer traffic, with no spikes, so I'm worried Google is holding back our crawl rate even though I've set it to "Faster".
Yeah...... it *does* seem confusing. I would just change the wording. The "Faster" option had been set for a while, so I was reading the message to mean that since my page load times for lots of random fetches were high (long story), Google was throttling me even though I had set it to "Faster".
Back then (Oct 17th '06) I had these stats:
Pages: Max/6310 - Avg/1796
KB: Max/334995 - Avg/79489 (~80 MB per day!)
Milliseconds: Max/3063 - Avg/1749
After setting it to "Faster" - today (Nov 15th '06) - I have these:
Pages: Max/6310 - Avg/2524
KB: Max/334995 - Avg/101801 (~100 MB per day!)
Milliseconds: Max/3421 - Avg/1794
So on average I've got Google to spider 728 MORE pages per day! (... or it seems that way)
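For anyone wanting to sanity-check those numbers, here's a quick sketch that recomputes the per-day differences from the two snapshots quoted above (all figures are straight from the stats in this thread; the variable names are just mine):

```python
# Crawl-stat averages from the two Webmaster Tools snapshots quoted above.
before = {"pages_avg": 1796, "kb_avg": 79489}   # Oct 17th '06
after = {"pages_avg": 2524, "kb_avg": 101801}   # Nov 15th '06

# Extra pages Googlebot fetched per day after switching to "Faster".
extra_pages = after["pages_avg"] - before["pages_avg"]
print(f"Extra pages crawled per day: {extra_pages}")  # 728

# Convert KB to MB (1 MB = 1024 KB) to check the rough "80 MB"/"100 MB" figures.
for label, stats in (("before", before), ("after", after)):
    mb = stats["kb_avg"] / 1024
    print(f"{label}: {mb:.1f} MB per day")
```

The KB averages come out to roughly 77.6 MB and 99.4 MB per day, so the "80 MB"/"100 MB" figures in the thread are close round-offs.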
Looking at the graph, I notice that Google's spidering slowed down considerably from mid-September to mid-October, which could account for the lower averages in my original post (the tracking starts from mid-August). So the spidering from mid-Aug to mid-Sept is about the same as from mid-Oct to mid-Nov...
Conclusion: I still don't know whether setting it to "Faster" actually made it faster... it certainly seems that way when comparing the last two months... but it's only as fast as it was four months ago...