Forum Moderators: Robert Charlton & goodroi
All indications point to heavy changes coming in Google's index infrastructure, and Matt has confirmed that several times. As such, it's important right now to keep an eye open for possible changes on Google's datacenters.
In fact, watching and analyzing changes on the DCs is of high importance, IMO, because it tells us a lot about what to expect tomorrow's SERPs to look like, and lets us take action on our sites in good time.
Some of us are addicted DC watchers; we do it with passion, and there is no doubt about our spirit of sharing our observations with the rest of our kind fellow members. However, we are often blamed for hijacking other threads when we have no place to post our observations and remarks ;-)
For you passionate Google Datacenters Watchers, I'm starting this thread.
Let those observations, analyses and remarks come.
Thanks!
But rather than wait for Google to sort it out, why don't you move your site to a new domain and sit the sandbox out? 301 everything across; you might be out of the sandbox before Google fixes the issues with your old domain.
Dayo_uk I'm sure you've thought about doing that, what's stopped you from doing it?
For many of us it is the hope that things will return without having to sit a year+ in the sandbox. Hindsight leads me to think that creating a new domain name could have been an option, but back when this started for us, it really wasn't. If Google had problems with 301s during that time, they probably would have screwed things up redirecting to a new domain anyway. The thing is, we didn't and still don't know exactly what is going on, nor for how long. We could be back tomorrow, and getting sandboxed could deepen our problems financially if that were to happen. Also take into consideration the effects in other search engines: we don't want to disturb rankings there during this time, since money still flows from them.
Beyond that, our domain name is our BRAND. Switching to something other than (or a variation of) our company name could lead to confusion and degrade the brand we have built over the past 5-6 years.
This is just how I see it and what I based my decision to 'ride it out' on.
Slightly off topic..
This thread looks like it's going to go in a "how to deal with it" direction, so here's the list that our friend Reseller put together last time:
- Do a 301 redirect to resolve yoursite.com vs. www.yoursite.com (canonical URL problem)
- Removing 302 redirects
- Removing duplicates
- Subtle page changes and monitor SERP changes
- Create and submit a Google Sitemap
- Send feedback to Google engineers
- Optimize your site for other search engines (like Yahoo, MSN ..)
- Transfer your affected site to a spare/emergency site
- Outlet Sites Strategy
Outlet sites are multiple sites, each carrying a subset of the main site, redesigned or added to in order to avoid duplication penalties.
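For the first item on the list, here's a minimal sketch of the canonical 301 in an Apache .htaccess file, assuming mod_rewrite is enabled and with `yoursite.com` standing in for your real domain:

```apache
# Permanently (301) redirect the bare domain to the www version,
# so Google only ever sees one canonical hostname.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

The `[R=301,L]` flags make it a permanent redirect and stop further rule processing; swap the condition and target around if you prefer the non-www version as canonical.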
Some of these would not apply if you don't think the test DCs are it.
Which in my opinion they are not.
I wouldn't either. To me that is a last resort.
Considering much of this is Google's problem, not ours, we shouldn't have to move or change domains or rebrand sites to satisfy them. If we play by the rules at all times, we shouldn't HAVE to do anything but allow their crawlers in. It is THEIR job to sort it out, not ours. Our job is simply to conduct our business as we see fit (within those rules), supplying quality info/products/services to our visitors, whether those visitors come from Google or elsewhere.
I had no idea it would take this long for Google to fix. Then we got to October, and indications were that a fix was coming shortly, during Jagger; it did not happen.
But then we had the Big Daddy DC coming shortly after/during the end of Jagger, and this again was supposed to fix the issue....
So I guess it is, as Zues has said, wait it out a bit longer, and then a bit longer, especially with G saying that a fix is forthcoming.
But so far no good.
Many posters have confirmed that their sites have already been fixed by Big Daddy, including ours.
Two sides to every story....
Er...surely it's a matter of fact - if it ain't been fixed, it ain't been fixed - it isn't a matter of opinion!
I'd like to know how they are doing it though - I still have another site which has not even begun to be fixed yet.
That leads me to think it will be a new bot, one that is only just in the design process, designed to fix errors in the supplemental database (MC's post made no comment about it ever having run before, and didn't say "again" or "next time", just that it wouldn't be out for a while).
I have no idea why the normal crawls can't be used for this. If a page has an up-to-date cache and an up-to-date snippet, and ranks as a normal result for search terms that are currently on the page, then why can't Google use that information to update their supplemental database for that same page, where it is still being found for words that are no longer on it, and where the snippet delivered (for those "old content" searches) still includes words that no longer appear on the real page or in its cache?
Why is this so difficult? It sounds like a simple process.
But it is just more evidence of the difficulties Google has with Homepages in particular......
Still waiting for the fix which has been promised for soooooooo long.
But this might explain how PR disappears on a site. E.g., Matt's homepage in its current state is, I would guess, a PR0, and I doubt it would pass any PR (in its current state).
Might explain why some sites went PR0 last update while still having internal pages with PR.
Matt's site has so many links to /blog that it might not be affected as badly as sites that rely on 95% of their links going to the homepage.
The cache dates seem old in general (as mentioned in this thread, some homepages are showing caches from 9-11 months ago), although I feel that some SERPs are showing results based on newer indexing than that, and some SERPs do have a new cache too.
A recalculation of internal PR, I feel, will be the next step once Google are happy that they are correctly dealing with 302s/301s and canonicals.
Whether PR has already been recalculated internally I am not so sure.... MC has said there has been no ranking update as yet, but it has also been said that PR is continually processed/updated behind the scenes anyway....
Another site shows no results via cache: but a current one through site:.
Both lost their rankings and both have current caches for all sub pages and all sub pages have maintained rankings.
Very frustrating.
Anyone see the same - any point thinking about this?
ScottD
>>216.239.57.99
216.239.57.104
216.239.53.104 <<
None of the above DCs is a BigDaddy. Sorry ;-)
In fact I see BigDaddy at this moment on my default google.com [66.249.93.104...]
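For anyone new to DC watching, a rough sketch of how we compare results across datacenters: query a DC's IP directly over plain HTTP and eyeball what index it serves. The IP below is the one quoted in this thread as serving Big Daddy; treat it as an assumption, since DC IPs come and go.

```shell
# Build a query URL against a specific Google datacenter IP
# (66.249.93.104 is the Big Daddy IP reported above; it may stop answering).
DC_IP="66.249.93.104"
QUERY="site%3Ayoursite.com"          # URL-encoded "site:yoursite.com"
URL="http://${DC_IP}/search?q=${QUERY}"
echo "$URL"
# To actually fetch it (network required) and compare what each DC returns:
#   curl -s "$URL"
```

Run the same query against several DC IPs and compare result counts, cache dates and rankings; differing answers usually mean an index change is rolling out.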