In other BD indexes (such as [22.214.171.124...]), I now see that some old Supplemental Results for pages and sites that no longer exist are disappearing from the search results.
I have waited years for this moment. We already had several false starts on this over the last few months. The pages were dropped, and then reappeared a day or two later.
Be gone for good this time!
[edited by: tedster at 5:27 am (utc) on Mar. 28, 2006]
[edit reason] split thread to create a new one [/edit]
Dream on folks... Google don't yet believe there's a problem. Why else would they have continued to roll BD out so aggressively in the face of such protest? Why? Because they aren't even aware of any protest. At best, word may have got around that a few people have been moaning about a slightly oversensitive supplemental threshold.
You're right, here is how you read this... Using your decoder ring (foil hats on please or this won't work):
Emmy = BigDaddy
San Francisco = Google's main office in California
Indoor = "under the hood"
Travel = "function, or work"
So the question is:
Q: Are you working on BigDaddy out of Google's main office?
A: No. BigDaddy is an under the hood operation. In fact, BigDaddy isn't even functioning right now.
Yes, that is how I read things too :)
I wonder if the PR export, when it is finally done, will correct the situation where Google still can't recognize the homepage as the most important page for the site: the homepage can't even rank for a "www.domain.com" phrase search, and is instead beaten by directories and spam - hmmmmz
If the PR that Big Daddy calculated has already been applied to the serps, then it looks like Big Daddy can't figure it out :(
Whatever you do, always hit reload a few times. Usually you get different results on the second viewing.
Additionally, try the &filter=0 and &num=100 parameters on the end of the Google search URL, to see what happens then...
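For anyone unfamiliar with those parameters, here is a minimal sketch of how such a URL could be assembled; the function name and example query are mine, not from the thread, and the stated effects (filter=0 disabling the duplicate/similar-results filtering, num raising results per page) are as commonly understood at the time:

```python
from urllib.parse import urlencode

def google_search_url(query, filter_dupes=True, num=10):
    """Build a Google search URL. Appending filter=0 was understood to
    disable duplicate-result filtering; num sets results per page.
    This is an illustrative helper, not an official API."""
    params = {"q": query}
    if not filter_dupes:
        params["filter"] = "0"
    if num != 10:
        params["num"] = str(num)
    return "http://www.google.com/search?" + urlencode(params)

url = google_search_url("site:example.com", filter_dupes=False, num=100)
print(url)
```

Pasting the resulting URL into a browser shows the unfiltered, 100-per-page view of the same query.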
As we are one of the sites that has just started to be fixed we are in a position to know which DC's have been "Fixed" (for us) and which have not.
The two quoted above have not and we are still in "Hell" on those two.
Back in full swing on default and most others though.
are still supplemental with 45k of 195k of pages
we have 90k of 195k of pages, 50% last introduced 24hrs ago, all with no supplementals and old cache dates
Results have been restored but at very diluted levels, e.g. position 90 where we previously had 1. However, we may have special circumstances, since we are just coming out of a robots.txt hack attack and a 180-day suspension period, where supporting pages/links haven't come into play yet.
I don't know the significance of the first 2 DC's, but I'm hopeful they will be updated towards the end of the current update - but it's just a guess.
The biggest problems will surface again on these forums, I think - and that will be after the next 7 to 10 days (4 days have gone since Matt Cutts said 2 weeks).
As g1smd said in reply to someone on the forums asking "why wasn't it fixed before?", he replied: "we're those testers".
Unfortunately, that's the way the process goes.
"However, no mention was made of a possible algo update
I wonder about that too. I've had myself braced for a major update. Will it really happen or will there just be continuous adjustments? "
Maybe Jagger was the last time we saw an announced algo update. Just like Google Dance, no more of the kind.
In fact I wrote previously about Gradual Continuous Updating
Have a great day.
Do you agree with me that this DC still looks more promising than the rest, as to fresh cache, number of indexed pages, quality of serps etc...?
Google has not yet corrected the ranking problems that resulted from the canonical issues that arose over the last year.
The frustrating thing is that it looks like they were close.
<RK> (whatever that was - probably PR) was going through the site correctly, e.g. the homepage with the highest RK, then links from the homepage with the next highest, and then RK reducing as you go further into the site.
In other words, it seemed to understand the structure of a site.
Whereas currently, what we are still left with for sites which had ranking problems due to canonical issues is homepages with PR0 while internal pages have high PR, i.e. the homepage losing all of its power.
Google cannot seem to fix this issue. Everyone is saying that PR values are constantly calculated and applied to the serps - so if those RK figures have been applied to the serps, then Google have been unable to correct the situation for 1000s of sites.
If those RK values have not been applied to the serps we must soon have huge ranking changes.
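For context on the canonical issues being discussed: they generally stem from one page being reachable at several URL variants (with and without www, with and without index.html), which splits PR across duplicates. Here is a rough sketch of the kind of normalization involved; the www-canonical policy and index filenames are my illustrative assumptions, not anything Google has documented:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, prefer_www=True):
    """Collapse common URL variants (host case, missing www, default
    index file, empty path) onto one canonical form, so link credit
    isn't split across duplicates. Policy choices here are assumptions."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    netloc = netloc.lower()
    if prefer_www and not netloc.startswith("www."):
        netloc = "www." + netloc
    for index in ("/index.html", "/index.htm"):
        if path.endswith(index):
            path = path[: -len(index)] + "/"
    if path == "":
        path = "/"
    # Drop the fragment entirely: it never changes the resource fetched.
    return urlunsplit((scheme, netloc, path, query, ""))

canonical = canonicalize("http://Example.com/index.html")
print(canonical)  # → http://www.example.com/
```

The point of the sketch is only that every variant should map to a single URL; which variant you pick as canonical matters less than picking one consistently.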
BTW - the DC that does not seem to be showing supplementals is now [126.96.36.199...]
Even here, if I do a site:domain.com check, Google can't rank the homepage as the first result - which is incredibly frustrating, as it seems to largely be using just fresh-crawl data.
I agree [188.8.131.52...] is the first DC to show our recovering site with no supplementals in a site: search.
It also has about 15 more pages than the other recovering DC's. Very positive.
Interestingly, if you do a keyword [site:http://www.mysite.com widgets] search, then a few supplementals are included in the results.
A [site:http://www.mysite.com] search returns none. It looks like the site: search is hiding the supplementals, but they appear when a search is keyword-specific.
MC stated a toolbar export in a couple of weeks, yes! But the toolbar export would only reflect what the current PR is.
What I was saying is that the current PR (not the TBPR) has not corrected the problems in the index. What I am still wondering is whether the current PR is the Mozilla Googlebot calculation or the old Googlebot calculation. Logic would say that we must be using the Mozilla Googlebot calculation now - or very, very soon.
If it is the Mozilla Googlebot calculation, then it should be a whole new calculation that would hopefully forget the errors of the old one (especially when it comes to split PR, canonical and hijack issues). I'm wondering if this has been applied to the serps. I don't personally care when it is applied to the toolbar; it is when it has been applied to the serps that is important. If it has already been applied to the serps, then we need to come to an acceptance sooner or later, IMO, that Google are unable to fix the ranking problems that occurred to sites with hijack/canonical issues.
Not jumping up and down - acceptance :)
But when you see a DC with just Mozilla Googlebot crawled data in a site: check, you would have thought that some of the issues would have been corrected.
I believe this is starting to happen. I think it started yesterday afternoon. I have had stable serps all month, but yesterday I started to see a shift - sort of like what I was seeing in February, but still with new serps from this month slightly mixed into it.
By the time I called it a day, I had ruled it out as a filter adjustment on a few data centers, because they went back to what they have been all month. However, this morning I am seeing that shift in the serps again, only more widespread. About 30% of the keywords I tracked tanked on one site I watch. I use a keyword tracker and have 10 months of rankings, so I can speak with a little confidence on this.
I had a 100% recovery in early March for this particular site.
Whether that's going to stick or not, who knows, but I can say that a lot of the pages that are still ranking well for that site are the ones that have 301 issues. Figure that out...
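Those 301 issues are usually diagnosed by checking what status code and Location header a URL variant actually returns. A hypothetical helper for classifying such a response is sketched below; the labels, the canonical target, and the idea that only a 301 onto the canonical URL consolidates credit are the folklore of the time and my assumptions, not confirmed Google behaviour:

```python
def diagnose_redirect(status, location, canonical="http://www.example.com/"):
    """Classify a redirect response for canonicalization purposes.
    A permanent (301) redirect onto the canonical URL is the 'clean'
    case; a 302 or an off-target 301 can leave duplicate URLs
    competing in the index. Labels are illustrative assumptions."""
    if status == 301 and location == canonical:
        return "ok: permanent redirect to canonical"
    if status == 302:
        return "warn: temporary redirect, may not consolidate"
    if status == 301:
        return "warn: permanent redirect to non-canonical target"
    return "no redirect"

# Example: a bare-domain variant 301-ing to the www homepage checks out.
print(diagnose_redirect(301, "http://www.example.com/"))
```

In practice you would feed this the status and Location captured from a HEAD request against each variant (http vs https, www vs non-www) and fix any row that does not come back "ok".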