Pages crawled per day: Max/6310 - Avg/1796
Kb downloaded per day: Max/334995 - Avg/79489 (80 MB per day!)
Milliseconds per page: Max/3063 - Avg/1749
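For anyone who wants to sanity-check those averages, the daily Kb figure does work out to roughly the 80 MB quoted. A quick back-of-the-envelope script (assuming "Kb" means kilobytes of 1024 bytes; the exact unit Google uses isn't documented here):

```python
# Rough conversion of the crawl-stats averages above.
# Assumption: "Kb" = kilobytes (1024 bytes each).
avg_kb_per_day = 79489
avg_pages_per_day = 1796

mb_per_day = avg_kb_per_day / 1024                 # ~77.6 MB/day
kb_per_page = avg_kb_per_day / avg_pages_per_day   # ~44 KB per page

print(f"{mb_per_day:.1f} MB/day, {kb_per_page:.1f} KB/page")
```

So the average page weighs in around 44 KB, and the daily total is close enough to "80 MB per day".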
and for this site I do have the option to go "Faster"
I wonder how much faster can it go...
I'll switch it to faster and will keep you guys posted...
BTW - looking at the graph, I've had a big spike at the end of Aug and another one right now...
"We've detected that Googlebot is limiting the rate at which it crawls pages on your site to ensure it doesn't use too much of your server's resources. If your server can handle additional Googlebot traffic, we recommend that you choose Faster below."
I wonder if choosing 'Faster' will have an effect and if it will speed up the process for me (I'm still waiting to see more pages from my new site -- other than my root page -- in the SERPs).
My only concern about selecting Faster (if I had that choice) is that if Google finds too many new pages too fast, it might trip some sort of spam filter on the site.
For the time being my website only has about 60 pages that can be indexed.
[edited by: OutdoorMan at 10:42 pm (utc) on Oct. 17, 2006]
How is this working for folks out there on WebmasterWorld?
I wonder how quickly it will kick into action, and if this means pages will get cached quicker on all DCs?
Now all I ask is for G to identify the problems that cause a site to be filtered, so that webmasters can see if they can fix them or legitimately send a reinclusion request [but at least this is a big step]
[edited by: tedster at 7:34 am (utc) on Oct. 18, 2006]
That said, all the new tools are pretty cool; I appreciate all the improvements they keep making to Webmaster Central. Now, if they'd just make it so that I don't have to scroll through four pages to see all my sites...
I actually identified a problem (my bad coding) with a site of mine that I had suspected, but I didn't realize how bad it was until I saw that since the beginning of Oct (when I made the code change), the number of pages Googlebot was crawling daily dropped drastically, while at the same time the time it was taking the bot to download the pages increased drastically.
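If anyone wants to spot the same pattern in their raw access logs instead of waiting for the graphs, counting Googlebot hits per day is enough to see a drop. A minimal sketch, assuming an Apache combined-format log where the user-agent string contains "Googlebot" (your log layout may differ):

```python
import re
from collections import Counter

# Count Googlebot hits per day from Apache combined-format log lines.
# Assumption: timestamps look like [17/Oct/2006:22:42:01 +0000].
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(lines):
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # skip browsers and other bots
        m = DATE_RE.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits

sample = [
    '66.249.66.1 - - [17/Oct/2006:22:42:01 +0000] "GET / HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [17/Oct/2006:23:01:15 +0000] "GET /p2 HTTP/1.1" 200 4096 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [17/Oct/2006:23:05:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
print(googlebot_hits_per_day(sample))  # Counter({'17/Oct/2006': 2})
```

Run it over a few weeks of logs and a sudden drop in hits/day stands out immediately.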
Anyone noticed any improvements after changing it?
I have a small (in pages, but heavy on content) website, probably smaller than the rest of you guys' websites -- 60 pages in all.
Before the choice of changing the crawl rate was available, I only had 13 pages indexed. Now I have 58 pages in G's index.
Maybe 58 pages is a small step for mankind, but it's a huge step for me ;)
And you can see the change in the number of pages crawled per day?
I can see a change in how often Googlebot crawls my site:
October (31 days; the last 13 days of October had the "Faster" option toggled on):
Googlebot 384+31 4.02 MB
(I believe that approx. 50-60% of the 384 hits came after I changed the crawl rate to "Faster".)
November (7 days so far):
Googlebot 122+3 1.45 MB
Those are the statistics I have (AWStats).
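Worked out per day, those AWStats figures do show an increase after the switch. A quick back-of-the-envelope calculation (assuming the hits were spread evenly across each period):

```python
# Hits per day from the AWStats figures above.
oct_hits, oct_days = 384, 31   # mixed: normal + "Faster" for the last 13 days
nov_hits, nov_days = 122, 7    # all with "Faster" toggled on

oct_rate = oct_hits / oct_days   # ~12.4 hits/day
nov_rate = nov_hits / nov_days   # ~17.4 hits/day

print(f"Oct: {oct_rate:.1f}/day, Nov: {nov_rate:.1f}/day "
      f"({(nov_rate / oct_rate - 1) * 100:.0f}% increase)")
```

That's roughly a 40% increase in daily Googlebot hits in the first week of November, and the October average already includes some "Faster" days, so the real jump is likely bigger.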
[edited by: OutdoorMan at 2:59 pm (utc) on Nov. 7, 2006]