Forum Moderators: Robert Charlton & goodroi


Minimum Pages Crawl Rate & Supplemental Pages

Any correlation? Post your info so we all can compare


viphost

4:01 am on Dec 31, 2006 (gmt 0)

10+ Year Member



I have an e-commerce site with about 7,000 products. In all, I have about 12k-15k pages of content. In my Google Webmaster Tools, the Crawl rate section shows a minimum of 295 pages crawled per day. I have about 295 pages of content in the main index and a little over 10,000 pages in the supplemental index.

I'm wondering whether there is a correlation between these two numbers.

What do your stats show? (Minimum pages crawled per day, and how many pages you have in the main index.)
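Once a few members have posted their numbers, a quick way to sanity-check the idea would be a simple correlation over the (minimum crawl rate, main index pages) pairs. Here's a minimal sketch in Python - the figures below are made-up placeholders, not real data:

import math

# Hypothetical (min pages crawled per day, pages in main index) pairs.
# Replace these placeholders with the real figures members post.
stats = [
    (295, 295),
    (9000, 1000),
    (800, 600),
]

def pearson(pairs):
    # Standard Pearson correlation coefficient over the pairs.
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = math.sqrt(sum((x - mx) ** 2 for x, _ in pairs))
    sy = math.sqrt(sum((y - my) ** 2 for _, y in pairs))
    return cov / (sx * sy)

print("correlation:", round(pearson(stats), 3))

Anything close to +1 across a decent number of sites would support the idea; a couple of data points alone will always give a trivial +/-1, so the more sets of numbers posted, the better.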

Whitey

5:55 am on Dec 31, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Are this site's pages performing in the SERPs?

We have found that our sites which are being crawled well are also returning nice clean "non-supplemental" results - i.e. 1,000 of the total number of pages shown as indexed. The correlation here is: maximum pages non-supplemental, maximum good results.

There are about 90,000 pages, and the crawl rate sits at an average of around 9,000 per day - with some days down to 300.

We are finding that our other sites, with a lot of supplementals, suffer from a lack of good results in the SERPs and poor crawling. Similar ratios of supplemental pages are occurring. On these sites, a lot of preceding 301 activity took place.

So my first question is how these pages are performing - or are they in some sort of recovery mode?

And rather than this being merely an observation, what angle are you coming at this question from?

[edited by: Whitey at 5:58 am (utc) on Dec. 31, 2006]

viphost

5:55 pm on Dec 31, 2006 (gmt 0)

10+ Year Member



I am rebounding from duplicate content because the URL structure on my site changed and the 301 redirects were not working. I had to change the URL structure yet again, and this time the 301 redirects are working properly.
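For anyone who wants to run the same kind of spot-check on their own redirects, this is roughly what I mean - a minimal sketch, where the host and URL pairs are made-up placeholders, not my real paths:

import http.client

# Hypothetical (host, old path, expected new URL) triples.
# http.client never follows redirects, so we can read the raw
# status code and Location header exactly as Googlebot would see them.
checks = [
    ("example.com", "/oldcart/product.asp?id=42", "http://example.com/products/42/"),
]

for host, old_path, expected in checks:
    conn = http.client.HTTPConnection(host)
    conn.request("HEAD", old_path)
    resp = conn.getresponse()
    location = resp.getheader("Location") or ""
    verdict = "OK" if resp.status == 301 and location == expected else "CHECK"
    print(verdict, old_path, "->", resp.status, location)
    conn.close()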

I'm not seeing any progress in getting my pages out of the supplemental index with my 301 redirects. Googlebot is crawling about 800 pages per day (according to a Googlebot activity script I'm running to log things). Because of this lack of progress, I'm wondering whether Google has, in some circumstances, a base calculation for the MAX pages to allow in the main index for a site (in which case they would obviously base this calculation on historical data they have for your site).

My thought, then, was: if they do have this sort of MAX allowable pages in the index, does that figure correlate with the MIN pages crawl rate in the Webmaster Tools section?
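For what it's worth, the bot activity script I mentioned is nothing fancy - essentially the sketch below, assuming an Apache combined-format access log (the log path is a placeholder, not my real setup):

import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical path

# Pull the date portion (e.g. 31/Dec/2006) out of the [timestamp] field
# of a combined-format log line.
date_re = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" in line:  # matches Googlebot's user-agent string
            m = date_re.search(line)
            if m:
                hits[m.group(1)] += 1

for day, count in sorted(hits.items()):
    print(day, count)

The user-agent string can be spoofed, so a stricter version would also verify each hit with a reverse DNS lookup against googlebot.com before counting it.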

pageoneresults

6:08 pm on Dec 31, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I am rebounding from duplicate content because the URL structure on my site changed and the 301 redirects were not working. I had to change the URL structure yet again, and this time the 301 redirects are working properly.

That's a lot of major changes to make. It's going to take a while before Googlebot sorts it all out. Remember, each time you make a major change like that, expect a latency period. So, if you've done it twice, expect at least twice that latency period.

You need to make sure that you've got all the technical issues sorted out. If you don't, you'll be adding insult to injury, and there will be long-term problems to contend with.