
Google crawling reduced drastically



7:44 am on May 12, 2012 (gmt 0)

Two weeks ago I modified the structure of my website and implemented over 200 subdomains for different countries and sections.

Initially everything went OK, but for the last 3-4 days Google's crawling of my website has dropped drastically. Not just crawling: website traffic is also down 30-40% compared to last week.

I checked everything, like robots.txt and Fetch as Google, but no clue has been found so far.

What could be the possible reasons for this problem?
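One quick sanity check when crawling drops after a restructure is to confirm the robots.txt rules on each new subdomain actually allow Googlebot. A minimal sketch using Python's standard library parser; the rules and paths below are placeholders, not the poster's real configuration:

```python
# Sketch: evaluate robots.txt rules (given as a list of lines) for Googlebot.
# The sample rules are illustrative placeholders.
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_lines, path):
    """True if the given robots.txt lines permit Googlebot to fetch path."""
    rp = RobotFileParser()
    rp.parse(robots_lines)
    return rp.can_fetch("Googlebot", path)

rules = [
    "User-agent: *",
    "Disallow: /private/",
]
# googlebot_allowed(rules, "/index.html")   -> True
# googlebot_allowed(rules, "/private/page") -> False
```

In practice you would fetch robots.txt from each subdomain (e.g. with `RobotFileParser(url).read()`) and run the check against a sample of real URLs.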


11:21 am on May 12, 2012 (gmt 0)

There could be many reasons for this. How old is your site? Have you been penalized or filtered for any reason? Are the subdomains new with new data, or did you move the old data to them via 301 redirects? Where do you see the Googlebot crawl frequency? Check with the site: command whether the new/modified pages are already cached and indexed. The best way to see whether the new version is indexed is to check the preview of the page in Google search. Providing more information will help us get a better picture of what's going on.

I also modified one of my sites about two weeks ago. Google WMT shows almost no crawling of my website, but the whole design was changed. I noticed that all of the pages are actually indexed in the new version but are not yet ranking for the new content.


12:35 pm on May 12, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I had a similar experience. Two months ago I launched my website in 30 new languages on subdomains (in addition to the 10 languages I already had). The Google crawl rate on my main English site dropped dramatically when I did this. For me there was no corresponding drop in traffic. I did not do the analysis to see whether total crawl volume was up or down; I suspect it is about the same, maybe a bit higher. Because Googlebot has to spider all this new content, it doesn't have as much time for the content it already knows about.

Interestingly, the crawl rate to another unrelated site of mine on the same server also decreased dramatically at the same time. It could be that Google doesn't want to hit the server itself too hard and that is the limiting factor in the crawl rate.


10:47 am on May 13, 2012 (gmt 0)

Thanks jemois & deadsea for your contribution.

@jemois, my website is approx. 3 years old. It doesn't seem to be blacklisted, and it has never been penalized in the past.

All subdomains are new, but the data is old, and yes, I applied 301 redirects to move the old URLs to the new ones. Initially Google's crawling was good on the main website as well as on the top subdomains for approx. 6-7 days, but later (for the last 4-5 days) it suddenly dropped by up to 90%, meaning the ratio fell from 100 to just 10-12.
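With a move of this size, it's worth verifying that a sample of old URLs each answer with a single, clean 301 hop to the right subdomain (not a 302 or a redirect chain). A sketch with the standard library; the hostnames are placeholders:

```python
# Sketch: verify old URLs answer with one clean 301 hop.
# Hostnames below are placeholders, not the poster's real domains.
from urllib.parse import urlsplit
from http.client import HTTPConnection

def first_hop(url, timeout=10):
    """Send one HEAD request without following redirects.
    Returns (status code, Location header or None)."""
    parts = urlsplit(url)
    conn = HTTPConnection(parts.netloc, timeout=timeout)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

def is_clean_301(status, location, expected_host):
    """True only for a 301 whose target lands on the expected subdomain."""
    if status != 301 or not location:
        return False
    return urlsplit(location).netloc == expected_host

# Usage (placeholder domain):
#   status, target = first_hop("http://example.com/old-page")
#   is_clean_301(status, target, "fr.example.com")
```

Running this over a list of old URLs quickly surfaces 302s, chains, or redirects pointing at the wrong subdomain, any of which can slow Google's consolidation of the move.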

Using "site:" I can see a good number of indexed pages for the top subdomains, but I can't say that all pages are indexed.

One problem I just noticed in WMT is an increased number of not-found (404) pages. While reviewing those pages, I found that to improve search quality we had tightened the criteria for landing pages, which generates "0"-results pages; those are being reported as not found (404). Could this be the reason for the issue?

I would appreciate the contribution of experts in this thread.
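For reference, one common way to keep such "0"-results pages from piling up as unexplained crawl errors is to decide the status code explicitly instead of letting them fall through to 404. A minimal sketch; the function name and policy are illustrative, not the poster's actual setup:

```python
# Sketch: choose the HTTP status for "0 results" landing pages explicitly.
# The policy below is one illustrative option, not a universal rule.
def status_for_results(result_count, retired_permanently=False):
    """Return the HTTP status code to serve for a results/landing page."""
    if result_count > 0:
        return 200
    if retired_permanently:
        # 410 "Gone" signals the page was removed on purpose,
        # and Google tends to drop such URLs from the index faster.
        return 410
    # Temporarily empty pages can serve 200 plus a robots "noindex"
    # meta tag instead of 404, so they aren't reported as broken URLs.
    return 200
```

Which branch is right depends on whether the tightened criteria are permanent; pages that will never have results again are better off as 410.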


10:49 am on May 13, 2012 (gmt 0)

5+ Year Member

On my site Googlebot always crawls around 200,000-250,000 URLs (including redirects) per day. I have been tracking it for a long time from my access logs.

06/May/ - 252,963
07/May/ - 213,955
08/May/ - 231,638
09/May/ - 169,253
10/May/ - 24,891
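Counts like these can be pulled from an access log with a short script. A sketch assuming the usual Apache/nginx combined log format (timestamps like `[10/May/2012:06:25:01 +0000]`); filtering on the user-agent string alone is a simplification, since strict tracking would also verify Googlebot's IP ranges:

```python
# Sketch: count Googlebot requests per day from a combined-format access log.
# Assumes the standard Apache/nginx timestamp layout in square brackets.
import re
from collections import Counter

DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')  # captures e.g. 10/May/2012

def googlebot_hits_per_day(lines):
    """Map 'DD/Mon/YYYY' -> number of Googlebot requests on that day."""
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:  # match on the user-agent string
            continue
        m = DATE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Usage:
#   with open("access.log") as f:
#       for day, n in sorted(googlebot_hits_per_day(f).items()):
#           print(day, "-", n)
```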


7:01 pm on May 13, 2012 (gmt 0)

Google crawls the pages, but it takes some time to index and serve them; this might be the reason why Googlebot isn't crawling your site so often now. It is also possible that Google decided the "0" pages have no value for your site and for that reason is not showing them in the index. Check the header of the not-found pages: it might be 410 or 404, so even if there is content on the pages they will look to Google as deleted (gone) or non-existent.


9:52 am on May 14, 2012 (gmt 0)

@jemois: the status of the "0"-records pages is 404.

Today Googlebot has slowed crawling down even further, and traffic has decreased drastically. In "Crawl Errors", WMT is reporting 8,000+ errors for the 11th through the 13th, but among the 1,000 sample error pages, all the detected records are from before the 11th.

Can anyone help me figure out the problem with my website, and share a reliable resource where I can check whether my website is blacklisted or not?

Tedster, I would appreciate a few words from you about the problem I am facing.


1:47 pm on May 14, 2012 (gmt 0)

5+ Year Member


I've also seen a massive decrease in Googlebot crawling over the past week or so, and no refresh of index data.
