
Google SEO News and Discussion Forum

    
Getting Into The Bigdaddy Index
agraddy
msg:722697 - 5:40 am on Feb 14, 2006 (gmt 0)

I have a site that is indexed fairly well in the old Google, but in the Bigdaddy index only about an eighth of its pages have been indexed. I realize I am not the only one dealing with this issue. I have read a lot of posts recognizing the problem, but I have not really read any solutions. I wanted to start a thread so that those of us who are struggling to get our sites indexed in Bigdaddy can figure out how to overcome the problem, and those of you who are finding more pages indexed in the Bigdaddy results than in the old ones could perhaps offer some insight into what caused the difference.

Thanks in advance to everyone who posts.

 

Kufu
msg:722698 - 6:32 am on Feb 14, 2006 (gmt 0)

I just think it is a matter of time before the page count goes back up to normal - that is, of course, if the site isn't being hit by any penalties.

One of my sites dropped from almost 11k pages to 47 (in BD). I'm sure it will show up fully in the index once Google is done with everything it is doing for BD.

agraddy
msg:722699 - 6:42 pm on Feb 14, 2006 (gmt 0)

Do you think adding a Google Sitemap would help?

Kufu
msg:722700 - 6:57 pm on Feb 14, 2006 (gmt 0)

I've never used an actual sitemap, as there have been issues with it. But what you can do is just add your site to Google Sitemaps without an actual map. You will then be able to follow the reports, which tell you whether Google is having trouble indexing any part of your site.

All you need to do is create a Sitemaps account, add your site to the list, and then upload an empty file (Google provides the file name) to your server so that Google can verify you are the owner.
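If it helps, that verification step can even be scripted. A minimal Python sketch - GOOGLE123456.html and example.com are placeholders for the file name Google actually gives you and for your own domain:

import urllib.request, urllib.error

# Create the empty verification file locally (Google supplies the real
# file name; GOOGLE123456.html is only a placeholder here).
open('GOOGLE123456.html', 'w').close()

# After uploading it to your web root, confirm the server returns it:
try:
    urllib.request.urlopen('http://www.example.com/GOOGLE123456.html')
    print('verification file is reachable')
except urllib.error.HTTPError as err:
    print('problem: server answered with', err.code)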

jrs_66
msg:722701 - 7:41 pm on Feb 14, 2006 (gmt 0)

--- I realize I am not the only one dealing with this issue. I have read a lot of posts recognizing the problem, but I have not really read any solutions.

I would say that a lot more sites (including mine) are seeing the opposite effect - a dramatic increase in the number of pages included. The overall index is much larger than the cruddy old one, so I don't see how this could be considered a Google-wide 'problem'.

Kufu
msg:722702 - 8:02 pm on Feb 14, 2006 (gmt 0)

I really think this is just a temporary issue as Google moves from the current index to BD.

agraddy
msg:722703 - 8:08 pm on Feb 14, 2006 (gmt 0)

jrs_66: Have you noticed any reason why you would have more pages in the Bigdaddy index than before?

Do you see the Mozilla Googlebot in your logs? I see the regular Googlebot a lot, but I rarely see the Moz bot.

Do you use Google Sitemaps? I currently do not.

I realize that it is not a problem for the people who have found more of their pages indexed in Bigdaddy than before, but for those of us finding the opposite, it is a very big problem. Besides, by finding out what the differences are, we will know in the future what helped get a site indexed and what did not.
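For anyone who wants to check their own logs, here is a rough Python sketch that tallies the two user-agents in an Apache-style access log. The log path is an assumption; the new crawler announces itself with a "Mozilla/5.0 (compatible; Googlebot..." user-agent, while the old one starts with "Googlebot/2.1":

# Tally old-style vs Mozilla-style Googlebot hits in an access log.
# Assumes the combined log format, with the user-agent quoted at the
# end of each line; 'access.log' is just an example path.
old_bot = moz_bot = 0
with open('access.log') as log:
    for line in log:
        if 'Googlebot' not in line:
            continue
        if 'Mozilla/5.0 (compatible; Googlebot' in line:
            moz_bot += 1   # the new Bigdaddy ("Moz") crawler
        else:
            old_bot += 1   # the classic Googlebot/2.1 crawler
print('regular Googlebot hits:', old_bot)
print('Mozilla Googlebot hits:', moz_bot)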

agraddy
msg:722704 - 9:45 pm on Feb 14, 2006 (gmt 0)

I agree that it is a temporary problem, but when part of your income comes from your sites, a temporary problem can be a big problem. We also have no idea how temporary this problem is... Bigdaddy is supposed to be completely rolled out in maybe a month, but that doesn't mean the index will have recovered many of the missing pages by that time.

All right, I have been doing a little research and thought I would share what I have discovered so far.

It seems like most large sites (news sites, portals, etc.) have gained major ground in the new index. Webmasterworld itself has added about 330,000 pages.

I've come across a few major sites that have lost ground, but there are not many.

On the other hand, it seems like a lot of high traffic blogs have lost pages in the new BD index. In fact, blogspot has lost about 10 million pages in the new BD index. I did find one blog that I frequent that had more in the new index. (No, my sites are not blogs.)

I've found a couple of open-source software packages that have lost major ground in the new BD index. For instance, I found one that had lost about 100,000 pages, and another that has gone from 2,590,000 pages indexed in the old Google to only 352,000 in the new BD index (I'm not even suffering that badly!).

I find it interesting that a lot of blogs I visit have lost pages in the new BD index. I'm wondering if Google is somehow working to discount the blogosphere by not indexing as much of it. That seems highly unlikely to me, because I think it would be hard to define "the blogosphere", but the Moz bot does seem less willing to work hard there.

That still does not explain the open-source software websites (I've found others that have greatly improved in the new index).

I'm going to continue trying to figure out what is going on. My next step is to implement Google Sitemaps to give the bot a better idea of how my sites are set up.

If anybody else has any ideas or observations, they will be greatly appreciated.

Oh, these are the two data centers I am using to compare results:

[72.14.207.104...] - Non-Bigdaddy
[66.249.93.104...] - Bigdaddy
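For what it's worth, the comparison can be semi-automated. A fragile Python sketch - the "of about" scrape depends on Google's current markup, the user-agent header is a placeholder, and automated queries are against Google's terms, so treat it as illustration only:

import re, urllib.request

# Fetch a site: query from each datacenter IP and pull out the result
# estimate. The regex matches the "of about N" text Google shows at
# the top of the results page; if the markup changes, it breaks.
def result_count(dc_ip, domain):
    req = urllib.request.Request(
        'http://%s/search?q=site:%s' % (dc_ip, domain),
        headers={'User-Agent': 'Mozilla/5.0'})  # placeholder UA
    html = urllib.request.urlopen(req).read().decode('latin-1')
    match = re.search(r'of about <b>([\d,]+)</b>', html)
    return match.group(1) if match else 'estimate not found'

for ip in ('72.14.207.104', '66.249.93.104'):   # non-BD, BD
    print(ip, result_count(ip, 'example.com'))  # your domain here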

sjgreatdeals
msg:722705 - 10:52 pm on Feb 14, 2006 (gmt 0)

I am seeing this too. A lot of sites I check have a much lower page count on Bigdaddy than in the current listings. I'm still seeing huge fluctuations - I watched one site go from 180,000 to 30,000 in 5 days, and now it is back to 50,000.

agraddy
msg:722706 - 12:13 am on Feb 15, 2006 (gmt 0)

All right, I created a Google Sitemap and uploaded it for one of my sites that is slow in getting indexed by Bigdaddy. I'll post my results in the future (yes, I do have links from external sites, but the Mozilla bot doesn't seem to like them as much as the original Googlebot).

This is a new forum site (only about 2 weeks old).

In the old index it has 62 pages indexed. In BD it has 2 pages indexed.

My sitemap has a total of 312 pages. Obviously the page count will change quickly, but the sitemap is dynamic, so it will scale with the site.
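For the curious, generating the map is simple enough. A simplified Python sketch - the URL pattern and page count are placeholders for however your forum exposes its threads, and real-world URLs would also need XML-escaping:

# Write a minimal sitemap.xml using the schema Google Sitemaps used
# at the time. The thread URLs below are placeholders; in practice
# you would pull them from the forum's database on each regeneration.
urls = ['http://www.example.com/forum/thread-%d.html' % n
        for n in range(1, 313)]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
for u in urls:
    lines.append('  <url><loc>%s</loc></url>' % u)
lines.append('</urlset>')

with open('sitemap.xml', 'w') as f:
    f.write('\n'.join(lines))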

As always, if anybody else has any insights or observations they are greatly appreciated.

BobSco
msg:722707 - 11:40 am on Feb 15, 2006 (gmt 0)

Hi

I've got 3 sites up which had 166k pages between them. They went down to 11k pages overnight on the 13th, with an attendant loss of customer traffic!

One of the sites has just come back to about 80% of what it was.

I use sitemaps, mainly because G said to use 'em. I found that new pages were indexed a lot quicker after setting up sitemaps.

What have they done in the Bigdaddy DC update?

Bob

LisaAndrew
msg:722708 - 12:42 pm on Feb 15, 2006 (gmt 0)

Bob,

On the Bigdaddy DCs, Google is preparing a new database of websites, and they are using a different spider to crawl and index them.

The existing spider is: Googlebot
The new Bigdaddy spider is: the Mozilla bot

Once the Mozilla bot has gathered all the information Googlebot has, it will replace Googlebot.

So, until that happens, the results will vary.

We expect this Bigdaddy DC update to be complete by the end of March.

-Lisa

goodroi
msg:722709 - 1:33 pm on Feb 15, 2006 (gmt 0)

A sitemap can help Google find a lot of pages on your site, but there have been various reports of glitches with Sitemaps, so be careful.

jrs_66
msg:722710 - 2:04 pm on Feb 15, 2006 (gmt 0)

---Do you see the Mozilla Googlebot in your logs? I see the regular Googlebot a lot, but I rarely see the Moz bot.

Yes, I see quite a bit of the Mozilla Googlebot.

---Do you use Google Sitemaps? I currently do not.

No, I do not use sitemaps.

Rugles
msg:722711 - 2:20 pm on Feb 15, 2006 (gmt 0)

This is an interesting development. A term I regularly check has gone from 5 or 6 million results to over 100 million. A huge jump.

But the quality of the SERPS still looks good.

rxbbx
msg:722712 - 2:33 am on Feb 16, 2006 (gmt 0)

I use a sitemap on two of my sites, both with mostly dynamic content.

I can tell you that it helps, as long as you make no mistakes with it - sending a spider the wrong way isn't that smart.

Heywood_J
msg:722713 - 3:39 am on Feb 16, 2006 (gmt 0)

Are you suggesting that there is a separate sandbox for Big Daddy?

Jordo needs a drink
msg:722714 - 5:06 am on Feb 16, 2006 (gmt 0)

---I would say that a lot more sites (including mine) are seeing the opposite effect - a dramatic increase in the number of pages included. The overall index is much larger than the cruddy old one, so I don't see how this could be considered a Google-wide 'problem'.

No, from what I've read it's about 50/50. On one of my sites, I've gone from 120,000 pages indexed to 900. I've read through the posts to see what's up, and it seems to be about 50/50, give or take.

I think it will even out over time - 120,000 pages indexed for this site was too many, and 900 isn't enough.

stinkfoot
msg:722715 - 7:12 am on Feb 16, 2006 (gmt 0)

Weirdness ...

site: on standard shows about 2/3 of the pages that site: on Bigdaddy shows for one of my sites. Bigdaddy's index seems to have 100% of it, though... guess I'm lucky there. BUT...

I did a search on Google for some unique text on one of my product pages. In standard Google, "this unique sentence" brings up no results.

I did the same search in Bigdaddy and, lo and behold, it was there on the page it was supposed to be on, and yes, it was as unique as it was intended to be.

I went and looked at the cache, and it was dated July 2005. The unique text was not there then - just a basic page.

What's this all about? The index and the cache are different here... and the last recorded visit of Googlebot to that page was 23 Jan 2006.

Google is just weird

BobSco
msg:722716 - 10:03 am on Feb 16, 2006 (gmt 0)

Lisa

"We hope that this big daddy DC update will take place until end of March. "

Thanks for that Lisa. I now know how long to wait and see.

Bob

Wlauzon
msg:722717 - 12:34 pm on Feb 16, 2006 (gmt 0)

One of our sites went from 481 pages on the old Google to 11,000+ on Bigdaddy.

hmm....

We don't even have that many pages....

dc_dalton
msg:722718 - 4:23 pm on Feb 16, 2006 (gmt 0)

I'm seeing a massive amount of activity on all my sites from the Mozilla Googlebot ever since the 13th. I am also seeing sites that had 1 or 2 pages indexed in G now having 200 - 1000 pages indexed (thank god).

It looks like nothing BUT the Mozilla GBot over here (checking 10 sites).

agraddy
msg:722719 - 6:20 pm on Feb 16, 2006 (gmt 0)

I like to come up with theories about life (and search engines), and I tend to keep them until they are proven wrong (sometimes I keep them a little longer - I'm a little stubborn).

So here is my theory:
As you can see earlier in this thread, I decided to try out Google Sitemaps. It is pretty clear that the Mozilla Googlebot is associated with Sitemaps - that is the bot that checked my sitemap file.

As I had read previously, though, a sitemap alone does not get your site indexed (in the sense of submitting a site and having it indexed). My earlier research indicated that sitemaps simply alert Google to pages it may have missed; they will not get your site indexed unless you have good external links pointing to it. This still seems to hold true.

I have decent external links. One of my sites is a PR 5 (the one that has only about 1/8 of its pages indexed in BD). In fact, this site just experienced very heavy activity from the old Googlebot.

I think what is happening is that the Googlebots almost have neighborhoods. The Moz bot has found a bunch of sites and is going after those voraciously, but sites outside its current neighborhood are not going to get indexed very well until it starts seriously crawling that neighborhood. So if your site gets its links from sites that are not indexed well in BD, there is a good chance your site will not be indexed well either.

Right now, I'm going to guess that if you can get good links from sites that are well indexed in BD, you stand a pretty good chance of attracting the Moz bot.

Well, that is what I am seeing right now. If you have any data to support or deny what I have posted here that would be great. The more we learn, the better.

On a final side note, if you have not tried out Google's Sitemaps, you may want to check it out. I was pleasantly surprised with all of the features and options they offer.
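One related trick: once a sitemap is regenerated, you can ping Google so it re-fetches the file. A minimal Python sketch, assuming the ping URL from Google's Sitemaps documentation (the sitemap location is an example):

import urllib.parse, urllib.request

# Notify Google that the sitemap has changed so it gets re-fetched.
# The sitemap URL below is an example; substitute your own.
sitemap_url = 'http://www.example.com/sitemap.xml'
ping = ('http://www.google.com/webmasters/sitemaps/ping?sitemap='
        + urllib.parse.quote(sitemap_url, safe=''))
urllib.request.urlopen(ping)
print('pinged Google about', sitemap_url)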

dc_dalton
msg:722720 - 8:36 pm on Feb 16, 2006 (gmt 0)

I actually tend to agree with your "neighborhood" theory. I have 16 sites on the same server, and from what I've seen, the Bigdaddy bot has been slamming all of them over the past 4 days. Only one of them has a Google sitemap, but they all seem to be in the crosshairs.
