Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Google Datacenters Watch: 2006-01-09

Observations, Analysis and Remarks

         

reseller

9:17 am on Jan 9, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi Folks

All indications point in the direction that we are approaching heavy changes in Google's index infrastructure; Matt has confirmed that several times. As such, it's important at present to keep an eye on possible changes at Google's datacenters.

In fact, watching and analyzing changes at the DCs is of high importance, IMO, because it tells us much about what to expect tomorrow's SERPs to look like, so we can take action regarding our sites in good time.

Some of us are addicted DC watchers, we do it with passion, and there is no doubt about our spirit of sharing our observations with the rest of our kind fellow members. However, we are often accused of hijacking other threads when we have no place to post our observations and remarks ;-)

For you passionate Google Datacenters Watchers, I'm starting this thread.

Let those observations, analyses and remarks keep coming.

Thanks!

UK_Web_Guy

5:14 pm on Jan 11, 2006 (gmt 0)

10+ Year Member



I really do sympathise with those that have www vs. non-www issues and have had them for 12 months.

But rather than wait for Google to sort it out, why not move your site to a new domain and sit the sandbox out? 301 everything across; you might be out of the sandbox before Google fixes the issues with your old domain.

Dayo_uk, I'm sure you've thought about doing that; what's stopped you from doing it?
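For anyone taking this route, the "301 everything across" step might look something like the following .htaccess sketch, assuming an Apache host with mod_rewrite enabled; old-example.com and new-example.com are placeholder names, not anyone's actual domains:

```apache
# Sketch of a site move: permanently redirect every request on the
# old domain to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]
```

The R=301 flag is what tells Google (and the other engines) that the move is permanent, so any accumulated link credit should, in principle, follow the redirect.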

arubicus

5:24 pm on Jan 11, 2006 (gmt 0)

10+ Year Member



"Dayo_uk I'm sure you've thought about doing that, what's stopped you from doing it?"

For many of us it is the hope that things would return without having to sit a year+ in a sandbox. Hindsight leads me to think that creating a new domain name could have been an option, but back when this started for us it really wasn't. If Google had problems with 301s during that time, they probably would have screwed things up anyway redirecting to a new domain. The thing is, we didn't (and currently don't) know exactly what is going on, nor for how long. We could be back tomorrow; getting sandboxed could deepen our problems financially if that were to happen. Also take into consideration the effects in other search engines: we don't want to disturb rankings during this time, since money still flows from them.

Even at that, our domain name is our BRAND. Switching to something other than (or a variation of) our company name can lead to confusion and degrade the brand we have built for the past 5-6 years.

This is just how I see it and what I based my decision to 'ride it out' on.

johnhh

5:46 pm on Jan 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Um - lurking here, as every time I look at the test DCs I decide I'd rather not..

Slightly off topic..

This thread looks like it's going to go in a "how to deal with it" direction, so here's the list that our friend Reseller put together last time:

- Do a 301 redirect for yoursite.com vs. www.yoursite.com (the canonical URL problem)
- Remove 302 redirects
- Remove duplicates
- Make subtle page changes and monitor SERP changes
- Create and submit a Google Sitemap
- Send feedback to the Google engineers
- Optimize your site for other search engines (Yahoo, MSN, etc.)
- Transfer your affected site to a spare/emergency site
- Adopt an outlet-sites strategy

Outlet sites are multiple sites carrying a subset of the main site, redesigned or added to in order to avoid duplication penalties.

Some of these would not apply if you don't think the test DCs are it.

Which, in my opinion, they are not.
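For the first item on Reseller's list, the canonical-URL 301 is commonly done in .htaccess. A minimal sketch, assuming an Apache host with mod_rewrite, with example.com standing in for your own domain:

```apache
# Sketch: force the www form of the hostname with a permanent redirect,
# so example.com and www.example.com stop competing in the index.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

(Swap the condition and target to standardise on the non-www form instead, if that is the version Google already favours for your site.)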

zeus

5:55 pm on Jan 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



UK_Web_Guy - I thought of that 3 months after I was hijacked by the googlebug302 links, because Google said they had no troubles in their SERPs and I knew that was a lie. But anyway, I thought, what the hell, another month. It has been 16 months now, and the search engines don't understand why someone else has a similar site in their hands.

Miop

5:59 pm on Jan 11, 2006 (gmt 0)

10+ Year Member



I wouldn't move my site and risk the sandbox - it's too well established.
I have two issues to get resolved: www vs. non-www, and / vs. index.php. The site was downed in September, and only now am I seeing the correct pages showing for the search terms in good positions - though not all of them. I would guess that G has to offload the non-www pages from its index first, as well as connect all 3,800 pages to the home page rather than to index.php. This feels like it's taking *eons*, but seeing as there is progress, albeit slow, I'm waiting! I didn't think it would take this long, though. I do not think that waiting for a sudden resolution to appear on one server and roll out on all the others is realistic. It's slow and gradual, but it is happening.

arubicus

6:24 pm on Jan 11, 2006 (gmt 0)

10+ Year Member



" I wouldn't move my site and risk the sandbox - it's too well established. "

I wouldn't either. To me that is a last resort.

Considering much of this is Google's problem, not ours, we shouldn't have to move or change domains or rebrand sites to satisfy them. If we play by the rules at all times, we shouldn't HAVE to do anything but allow their crawlers. It is THEIR job to sort it out, not ours. Our job is simply to conduct our business how we see fit (within those rules), supplying quality info/products/services to our visitors, whether those visitors come from Google or elsewhere.

Dayo_UK

6:39 pm on Jan 11, 2006 (gmt 0)



Yes, I have considered 301'ing the whole site too - I really don't want to, though, for the reasons discussed above.

I had no idea it would take this long for Google to fix - and then we got to October, and indications were that a fix would be coming shortly, during Jagger. This did not happen.

But then we had the Big Daddy DC coming shortly after/during the end of Jagger - this again is supposed to fix the issue....

So I guess it is like Zeus has said: wait it out a bit longer, and then a bit longer, especially with G saying that a fix is forthcoming.

But so far no good.

Ellio

7:01 pm on Jan 11, 2006 (gmt 0)

10+ Year Member



>>>>But so far no good.<<<<

In your opinion, Dayo.

Many posters have confirmed that their sites have been fixed by Big Daddy already including ours.

Two sides to every story....

Miop

7:13 pm on Jan 11, 2006 (gmt 0)

10+ Year Member



<In your opinion Dayo.

Many posters have confirmed that their sites have been fixed by Big Daddy already including ours.

Two sides to every story.... >

Er...surely it's a matter of fact - if it ain't been fixed, it ain't been fixed - it isn't a matter of opinion!
I'd like to know how they are doing it though - I still have another site which has not even begun to be fixed yet.

Rainie

7:24 pm on Jan 11, 2006 (gmt 0)

10+ Year Member



Fixed? Well, maybe canonicals. (Sorta -- the index page is still not listed first.)

Rankings or even a hair of a chance? Nope, not for us anyway.

NoLimits

7:43 pm on Jan 11, 2006 (gmt 0)

10+ Year Member



I have a couple of 4+ year old domains that didn't have 301s in place until late September (post-Jagger).

The ONLY problem I have seen get resolved is the URL-only issue. I'm still seeing www/non-www issues up the wazoo for these older sites.

g1smd

9:01 pm on Jan 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




Matt Cutts said something like "'Supplemental googlebot' won't be out to play for quite a while" in a recent blog post. Prior to that, there were no indications that different Googlebots are responsible for the normal and supplemental indexes.

That leads me to think that it will be a new bot, one that is only just in the design process, built to fix errors in the supplemental database (MC's post made no comment about it ever having run before, and didn't say "again" or "next time", just that it wouldn't be out for a while).

I have no idea why the normal crawls can't be used for this. I mean: if a page has an up-to-date cache and an up-to-date snippet, and ranks as a normal result for search terms that are currently on the page, then why can't Google use that information to update its supplemental database? Instead, the same page is still being found for words that are no longer on it, delivering a snippet (for those "old content" searches) that still includes words no longer on the real page or in the cache.

Why is this so difficult? It sounds like a simple process.

MLHmptn

12:20 am on Jan 12, 2006 (gmt 0)

10+ Year Member



The simple thing for all of us would be for Google to just come out of their cave and tell us EXACTLY what causes SUPPLEMENTAL RESULTS. Also, why are we giving any feedback at all on this Big Daddy DC when Google can't seem to inform us WTH is going on?! Another classic case of webmasters helping Google for Google's gain and not ours. Thankfully I get plenty of traffic from the other two engines and don't have to rely on Google clearing up its SUPPLEMENTAL index BS. Nothing is more irritating than your pages being tagged SUPPLEMENTAL. Why have any supplemental results at all, Google, if they provide no incentive for the search user?

BillyS

2:33 am on Jan 12, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Why is this so difficult? It sounds like a simple process.

Because the machine now controls the man.

Dayo_UK

10:59 am on Jan 12, 2006 (gmt 0)



To be fair, the results for MC's domain are across the DCs, not just on Big Letdown.

But it is just more evidence of the difficulties Google has with homepages in particular......

Still waiting for the fix which has been promised for soooooooo long.

reseller

11:11 am on Jan 12, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm just wondering, and maybe thinking loud...

How does an unsearchable site like Inigo's get PR7?

unsearchable for www.mattcutts.com

unsearchable for mattcutts.com

Thoughts?

Dayo_UK

11:17 am on Jan 12, 2006 (gmt 0)



Not sure, Reseller.

But this might explain how PR disappears on a site - e.g. in its current state I would guess Matt's homepage is a PR0, and I doubt it would pass any PR (in its current state).

It might explain why some sites went PR0 last update while still having internal pages with PR.

Matt's site has so many links to /blog that it might not be affected as badly as sites that rely on 95% of their links going to the homepage.

Ellio

11:53 am on Jan 12, 2006 (gmt 0)

10+ Year Member



Our homepage went from PR6 to PR0, but most internal pages remained PR3 to PR6.

The homepage is now ranking on BD but still shows PR0 on the toolbar.

There may be higher PR behind the scenes. A visible PR update is required soon!

Dayo_UK

12:40 pm on Jan 12, 2006 (gmt 0)



I would not like to guess what (as in when) PR is used on the BD datacenter.

The cache dates seem to be old in general (as mentioned in this thread, some homepages are showing a cache from 9-11 months ago) - although I feel that some SERPs are showing results based on a newer indexing than that, and some SERPs do have a new cache too.

A recalculation of internal PR, I feel, will be the next step, once Google are happy that they are correctly dealing with 302s/301s and canonicals.

Whether PR has already been internally recalculated I am not so sure..... MC has said there has been no ranking update as yet, but it has also been said that PR is continually processed/updated behind the scenes anyway....

Ellio

2:47 pm on Jan 12, 2006 (gmt 0)

10+ Year Member



Our cache is current.

cbin500

6:13 pm on Jan 12, 2006 (gmt 0)

10+ Year Member



One of my sites has a cache of Feb 27, '05 through the cache: search, and it lost its rankings. However, the site: search shows a current cache if you click on "Cached".

Another site has no results in cache: and a current one through site:.

Both lost their rankings, and both have current caches for all sub-pages; all sub-pages have maintained their rankings.

Very frustrating.

jrs_66

7:02 pm on Jan 12, 2006 (gmt 0)

10+ Year Member



I also have a site with no cache in BD.

jenkers

7:51 pm on Jan 12, 2006 (gmt 0)

10+ Year Member



Whenever the Big Daddy test results are online, I get a lot of referrals on a few new sites I've launched in the last few months. Is there any chance that G might have improved their algo to the point where they can relax a filter against new sites, or is it more likely that not all filters are employed on the temp results?

Anyone see the same? Any point thinking about this?

cbin500

9:10 pm on Jan 12, 2006 (gmt 0)

10+ Year Member



Mine actually have a cache and rankings in BD, but not on the regular Google DCs.

Ellio

4:56 pm on Jan 13, 2006 (gmt 0)

10+ Year Member



Big Daddy seems to have been up all day today on Google.co.uk

ScottD

5:17 pm on Jan 13, 2006 (gmt 0)

10+ Year Member



Are these Big Daddy testing DCs? I am now getting the same results for my site as I get on 66.102.9.104:

216.239.57.99
216.239.57.104
216.239.53.104

The change common to all of the above is that my site is back from the dead, with the suspected canonical issue apparently resolved.

Ellio

5:29 pm on Jan 13, 2006 (gmt 0)

10+ Year Member



None of the above are Big Daddy.

Always try the "sf giants" search first.

For a DC to be Big Daddy, the No. 1 result needs to be giants.mlb.com/.
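Ellio's check is easy to script once you have fetched the results page for "sf giants" from the DC you are probing. A rough Python sketch of just the decision step; the regex-based scraping below is an assumption about the page markup, not Google's actual HTML:

```python
# Sketch of the "sf giants" Big Daddy test. The fetch itself is left
# out; looks_like_big_daddy() just applies the rule "giants.mlb.com
# must be the No. 1 result" to results-page HTML you already have.
import re

def first_result_host(html: str) -> str:
    """Return the hostname of the first URL found in the results HTML."""
    match = re.search(r'https?://([^/">\s]+)', html)
    return match.group(1).lower() if match else ""

def looks_like_big_daddy(html: str) -> bool:
    """Ellio's rule: the top result for "sf giants" is giants.mlb.com."""
    return first_result_host(html) == "giants.mlb.com"
```

A real page would need the scraping tightened to skip ads and navigation links, but the decision rule itself stays this simple.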

g1smd

8:00 pm on Jan 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




See: [66.249.93.104...]

jenkers

8:35 pm on Jan 13, 2006 (gmt 0)

10+ Year Member



Those appear to be Big Daddy results, but with a bigger dataset - at least for the sites I watch, there are more pages indexed in there now. Does that mean we're getting closer?

reseller

9:16 pm on Jan 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Good evening Folks

ScottD

>>216.239.57.99
216.239.57.104
216.239.53.104 <<

None of the above DCs is BigDaddy. Sorry ;-)

In fact I see BigDaddy at this moment on my default google.com [66.249.93.104...]

This 190 message thread spans 7 pages.