If you have read Matt Cutts' blog, you may have seen that in one of his posts about 302 redirects and Bigdaddy he mentioned the following:
Matt says:
My only point is that the new infrastructure at the Bigdaddy data center will let us tackle canonicalization, dupes, and redirects in a much better way going forward compared to the current Google infrastructure.
Looking at your site structure, with dynamic pages plus a set of static pages you created specifically for the search engines, it may be that Bigdaddy now recognizes this content as duplicate and that you have been penalized for it.
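To make the canonicalization point concrete, here is a minimal sketch (in Python) of the URL-canonicalization half of the problem: collapsing several dynamic URL variants that serve the same content down to one URL. The URLs and parameter names are made up for illustration; this is not a description of Google's actual method.

from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters assumed (hypothetically) not to change the page content.
TRACKING_PARAMS = {"sessionid", "ref", "sort"}

def canonical_url(url: str) -> str:
    parts = urlparse(url)
    # Keep only the parameters that actually select different content.
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in TRACKING_PARAMS]
    kept.sort()  # stable order, so parameter order alone can't create a "new" page
    return urlunparse(parts._replace(query=urlencode(kept)))

# Both variants reduce to the same canonical URL, so a crawler that
# canonicalizes this way indexes one page instead of counting two dupes.
print(canonical_url("http://example.com/widgets.php?id=455&sessionid=abc123"))
print(canonical_url("http://example.com/widgets.php?sessionid=xyz789&id=455"))

It also shows why running a dynamic site alongside a parallel set of static pages built for the engines can look like duplicate content to an infrastructure that canonicalizes URLs and compares the pages behind them.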
Personally, I welcome this. You should have enough scope in 100+ pages to attract attention to what you do, and then provide your own site-level search for the details.
That also has the advantage of keeping the searcher on your site, and Google can forget about the 99,900 pages that aren't relevant to most of us and concentrate on the core.
Sitemaps give you a way of pointing the Googlebot in the right direction.
Google is now downloading my sitemap and robots around once a day - it seems to be a 22-hour cycle at times.
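For anyone who hasn't built one, a sitemap is just an XML file listing the URLs you most want crawled. Below is a minimal sketch in Python that writes one using the sitemaps.org 0.9 schema; the URLs and priority values are invented for illustration, and the priority hints are one way of making the "which pages matter most" decision explicit.

from xml.sax.saxutils import escape

# Hypothetical pages and priorities; replace with your own core URLs.
pages = [
    ("http://example.com/", 1.0),                  # home page
    ("http://example.com/widgets/455-blue", 0.8),  # key product page
    ("http://example.com/archive/old-page", 0.3),  # low-priority detail
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url, priority in pages:
    lines.append("  <url>")
    lines.append(f"    <loc>{escape(url)}</loc>")
    lines.append(f"    <priority>{priority:.1f}</priority>")
    lines.append("  </url>")
lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines) + "\n")

Submit the resulting sitemap.xml to Google and, as noted above, Googlebot will fetch it on its own schedule; the exact cycle is up to Google.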
Big Daddy has a huge number of sites that are nowhere near their non-BD page counts.
In fact, almost every site I check.
It is completely illogical to suggest that 'the days of getting 100K+ pages indexed are gone' and that this would benefit anyone.
The beauty of search engines is that you search for 'blue model 455 widgets' and you get a page about 'blue model 455 widgets'. You suggest it would be better to just land on any old site mentioning 'widgets' and then hunt around that site for specifics.
Further, it would also just mean that we would have to register many, many domains instead of working on and building one quality site, and again this is completely against everything that's come out of Google.
Compare a large number of sites against the old infrastructure and I believe you will find the BD index is nowhere near as complete as the older one; this topic has been discussed in more depth here: [webmasterworld.com...]
Compare a large number of sites against the old infrastructure and I believe you will find the BD index is nowhere near as complete as the older one...
This may be a very valid observation, but what some are suggesting here is that pages are being kept out of Bigdaddy because they are deemed to be dupe pages, and that this is why the index is smaller. I'm not sure I agree that is happening; I just want to restate what I think is being said.
> In fact, almost every site I check.
Isn't that what I said?
Why should Google host 100,000+ pages of someone's catalogue at no charge? Those days are gone - stop looking at it as a problem because it's not going to go away. It's now a situation.
You will no longer be able to get every nut and bolt indexed - you have to make priority decisions.