I am also seeing that with the site: command. I have seen page counts fluctuate hugely on the test datacenters day to day, up or down 10,000 or more pages. Hope that settles down.
On 184.108.40.206: if #1 for the query sf giants is returned as giants.mlb.com then you are seeing BD results.
I am seeing huge fluctuations in site: searches. Well over a 50% decrease in some cases.
I have seen one site go from 38,000 down to 260. PR 3 to PR 0.
>>I have seen page counts fluctuate hugely on the test datacenters day to day, up or down 10,000 or more pages
I'm not doing any datacenter watching, but one of my sites fluctuates (mostly on the high side lately) from under 1000 pages to over 20,000 pages. I'm assuming that's Big Daddy, and if it is, he has it wrong by about 20K pages.
Am I right to assume that even if the big daddy update were to officially go live, that rankings would change yet again, once the TBPR update happens? Or are these likely to happen at the same time?
For example, on a big daddy search, my site ranks #3 for its main keyword, but my site currently has a Page Rank of 0 due to a redirect to a different /dir. Once the TBPR update happens, the new /dir should have a Page Rank of either 4 or 5. At the time of the TBPR update, wouldn't the rankings improve for this site as well?
When toolbar PR updates, Google lets you know the value of PR that they were using a few weeks ago.
You still have no idea what the real value right now is....
Oh I see. So when big daddy does hit, it will be using pages that already had PR before, even when the toolbar wasn't showing it...
For example, now the TB is showing webmasterworld.com's PR as a 7. But if you pull the xml results from google 220.127.116.11 right now, you'll see webmasterworld.com has a PR of 8 (PR is marked with the tag <RK> in the xml file shown below):
Is that right? I was told that the <RK> tag does represent the PR value for the url, but I am not sure if that is true or not.
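For anyone poking at those XML results by hand, here's a minimal sketch of pulling the <RK> value for each result URL. Only the <RK> tag itself comes from this thread; the surrounding response structure (a <GSP> root with <R> result elements containing a <U> URL tag) is an assumption for illustration.

```python
# Hypothetical sketch: map each result URL to its <RK> value in a
# Google-style XML response. The schema used here is assumed, not
# taken from any official documentation.
import xml.etree.ElementTree as ET

def rk_by_url(xml_text):
    """Return a dict mapping each <U> (URL) to its <RK> value as an int."""
    root = ET.fromstring(xml_text)
    ranks = {}
    for result in root.iter("R"):
        url = result.findtext("U")
        rk = result.findtext("RK")
        if url is not None and rk is not None:
            ranks[url] = int(rk)
    return ranks

sample = """<GSP>
  <RES>
    <R><U>http://www.webmasterworld.com/</U><RK>8</RK></R>
    <R><U>http://www.webmasterworld.com/google/</U><RK>6</RK></R>
  </RES>
</GSP>"""

print(rk_by_url(sample))
```

If <RK> really is PR, a run like this against two different datacenters would let you diff the values side by side.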
Big Daddy has been live for weeks.
Search results will continue to fluctuate every day, particularly as long as any datacenters remain non-Big Daddy.
BigDaddy has not been live for weeks. We've been monitoring all the data centers and while it has sporadically popped in and out of 2 or 3 datacenters at a time it has only fully rolled out today.
I have noticed a problem in the BD test data. Google, over a period of time, have started mapping major corporations' full names to their short names. I guess it was intelligent on their part to do so, but it is creating wrong results in the SERPs.
I'll use a real-life example to illustrate this, because it may not be possible to explain without one.
Say I am searching for 'General Motors Widgets'. Google is automatically assuming that General Motors is also called GM, so it maps the query to results for 'G M Widgets'.
In BD search positions 2, 3, 4 & 5, results are appearing for a term referring to 'Genetically Modified Widgets', also called GM Widgets.
Note that the original search was for the term 'General Motors Widgets' and not 'G M Widgets', but the results shooting towards the top slots are for 'Genetically Modified Widgets'.
This seems to defeat the objective of intelligent name mapping, if I may call it that.
I could not see this in any of the earlier updates, and results were OK, but I am noticing it in the BD DCs now. I have tried a few acronyms and the results are far from satisfactory.
I don't think any conclusions can be drawn from the current mess on the datacentres, and we should all wait and see.
I have some data centres showing:
Same PR, backlinks, ranking and pages in index as usual.
Some with everything the same, but different PR.
Some with nothing at all on my site (it's 4 months old, so I figure those data centres are showing older results).
Some showing no backlinks but the same PR.
It's a mess, no pattern.
Big Daddy has been served for weeks. It isn't "fully rolled out" today. It's just on more datacenters than before, but it has been live for some time. It will simply be served more often now, presuming these datacenters continue to show Big Daddy.
Google has dozens of datacenters. The results differ on them. Nothing has changed about that basic fact.
That seems to be a recurring theme in many of the posts here. Is it possible that Google's site: command may be going the same direction as the link: command and Google Page Rank?
In other words, it is possible that what we have been confusedly witnessing over the past several months (since pre-Jagger) is the transition into the site: command becoming a useless tool (or at least an unreliable tool).
a. Page Rank still exists, but it is no longer transparent, and we can no longer keep up with our "real" score, since it includes hidden data and parameters we can no longer check.
b. All of our IBLs (in-bound links or backlinks) still exist, but we can no longer check them, and we are only served an apparently random, ever changing sample, which doesn't seem to be anything more than a lottery selection.
c. The site: command was the last useful thing we had available to check if our domain is "healthy" in the G index. i.e. which pages are actually "there". However, for the past year, we have all been panicked by the Supplemental Results, and now we are maybe seeing some of them go away randomly. Hmm. It is not likely G will EVER do away with Supplemental Results! They need them as part of the "historical" record of the domain, part of the G patent, right? However, the Supps have caused great furor (so did Page Rank and Backlinks once upon a time), so why should we think that G will continue to show us our own underwear? It will always be there, but they don't have to show it to us.
Many people here have been posting that the number of indexed pages for their site: command has been fluctuating all over the place. (Since pre-Jagger and now it seems to be getting even more "random" in Big Daddy). While some people say theirs is great, others are shocked at what is missing, or how G has thousands when there should be hundreds, etc.
I am afraid we may be witnessing the demise of our dependence on the site: tool.
>>That seems to be a recurring theme in many of the posts here. Is it possible that Google's site: command may be going the same direction as the link: command...
Huh, that's hilarious; I was just thinking the same thing: they've broken the site: command too. However, I'm only seeing it on one 'dynamic' (mod_rewrite) site. It doesn't appear to be my fault: MSN, Y! and the non-BD datacenters apparently have it right.
added: I've been checking Google site: on my other mod_rewrite sites after noticing this craziness and they show realistic page counts, and I haven't noticed any flux on site: queries for these other sites.
IMO Google are trying to release their SERPs from the control of the SEOs. In the future it might mean that the only way to get high listings is to undertake no SEO work at all. I imagine that Google would see this outcome as quite liberating.
I just get the feeling that Google treats SEO techniques a bit like parasites ... feeding from a resource that they wish to control completely themselves. Let's face it, if Google become the only ones who can sell top listings ... there's a lot of money to be made.
Sorry for another paranoid rant ;-)
All the Best
I've got PR on all new pages (<4 months old) the same as it was 4 months ago.
On older pages that had not changed in that time, pr is the same.
I'm getting backlinks as they were 4 months+ ago.
I don't think anything new is being rolled out; it's all old.
Watch Out for Update Allegra-II
Good morning Folks
Life on WebmasterWorld has always been exciting. There's always something happening to keep your blood pressure high and fluctuating. And you might have noticed that watching Google datacenters is a serious business not designed for the weak souls :-)
BigDaddy is spreading, but its quality status remains the same. We should be looking soon for improvements or lack of the same on the new infrastructure of Google index, not only on how and when BigDaddy is spreading.
Our friend at the plex, Kentuckian Matt, had an 8 a.m. meeting and a busy day yesterday. And that might be the reason why he hasn't had much time to post a BigDaddy weather report.
As you know, we are approaching very critical days when many things are expected to happen: a PR update, a backlink update and, most important, the algo update Allegra-II sometime next month (probably the first week of February).
Therefore we should focus on algo and rank changes too, not only BigDaddy spreading.
Unfortunately, it seems BigDaddy hasn't yet brought good news to our fellow members whose sites have been suffering from canonical and supplemental issues. Sorry folks.
Wish you all a great Google Datacenters Watch day.
God bless WebmasterWorld community.
Ahhhhh yes the exciting time of PR and backlink updates, binary roll outs and ranking fluctuations - All of this sounds almost like poetry to me after a time!
OK guys...time for action now!
What is all the confusion about? Why not just go with what Matt Cutts said?
"In fact, Bigdaddy is now visible at two data centers: 18.104.22.168 and 22.214.171.124."
Everything else is just guesses.
Actually, BD is on any DC where a search for sf giants has giants.mlb.com as result 1.
Anything ELSE is a guess - including assuming those IPs always have BD results ;-)
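The 'sf giants' litmus test from this thread is easy to mechanize. Below is a hedged sketch: it classifies a result list you've already scraped from a given DC, and assumes (as this thread does) that giants.mlb.com in the #1 slot means BD-style results. The actual SERP fetching is deliberately left out.

```python
# Sketch of the 'sf giants' Big Daddy check: a datacenter is assumed to
# be serving BD results if the #1 result for the query is on
# giants.mlb.com. The heuristic itself is from this thread; the helper
# name and input format are made up for illustration.
from urllib.parse import urlparse

def looks_like_bigdaddy(result_urls):
    """True if the first result's host is giants.mlb.com."""
    if not result_urls:
        return False
    host = urlparse(result_urls[0]).netloc.lower()
    return host == "giants.mlb.com"

print(looks_like_bigdaddy(["http://giants.mlb.com/index.jsp"]))  # BD-style DC
print(looks_like_bigdaddy(["http://www.sfgiants.com/"]))         # non-BD DC
```

Run this against each DC's top result for the query and you get a rough BD/non-BD map, with the usual caveat that DCs rotate in and out.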
|BD is on any DC where a search for sf giants has giants.mlb.com as result 1. |
If that is true, then my site has 41,000 pages listed on this BD DC, pages which were removed from my site last year.
The present search results show between 700 and 900.
So, is BD bringing dead (no longer existent) pages back into the index? I would become a top ranker again, but this time without the pages actually being on my site.
I doubt it.
Not necessarily. The BD results are simply a set with a specific canonical issue fix. Specifically, they represent the results where the search for 'sf giants' (no quotes) returns the giants.mlb.com result rather than the sfgiants.com result (which redirects to the mlb.com one).
|So - is BD bringing dead (non existant pages any longer ) pages into index again? |
added- What I meant to also say is that there have been changes in the indexes that contain bigdaddy results, so the canonical fix that represents bigdaddy is independent of some other changes that may cause your problems. It could also be a related and unintended result of the canonical fix.
Interesting that ansible brings up that tool/method that displays RK, which could be PageRank.
The XML page seems to show the top 10 links or so for that site - with normally the actual domain page being top.
For sites with Canonical/Hijack problems the domain does not seem to appear anywhere in the top 10 - let alone top.
Soooooo - either internal PR has not been recalculated for these hijacked/canonical-problem pages (I hope so) - or this is a penalty/problem that won't go away :(.
Starting to wonder if there is really any way back for these sites hit hard and long-term by Canonical/Hijack issues.
The underlying message for Googleguy and MC - you can take Googlebot to the homepage again (after redirect, canonical problems etc) but can you sort out the ranking penalty that hit them?
< Update? Anyone? >
Nope..everything looks pretty much the same from here!
Lots of DCs have rolled back today for me. First change for a while.
Hate to post in this thread but I do see a change..
I don't normally watch as the new algo's take over, but one of my main sites has had a steady increase of traffic day after day for about 10 days now. Is it possible that the increase is due to Big Daddy rolling out over more and more datacenters? (It's a pretty substantial site. About 130,000 pages and traffic is now up 70% from the norm.)
Does anybody care to guess what percentage of DC's are showing Big Daddy results now?
It's hard to tell. BD seems to rotate in and out on the majority of DC's. More out than in from what I've been seeing.
I am seeing fresher data and some good SERP shifts in the non-BD datacenters. My site's index in Google looks more fixed-up in the non-BigDaddy ones.
Are they going to flush the new data into Big Daddy at the next update?
One thing that I've noticed as I work with BigDaddy (It's live by me) is that I've got a big problem with supplementals outranking "good" pages. I see this when using the site: search.
For example, when I'm looking for:
I find that pages that lack a trailing slash and have since been 301'd to the correct version are outranking the correct page. The 301'd version shows as supplemental, but it is outranking the correct page.
I also removed pdf versions of my pages and have since 410'd those away, yet those old pdf pages (showing as supplemental) are outranking the html versions.
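For reference, the two clean-up moves described above (canonical 301s and deliberate 410s) can be expressed in Apache terms roughly as below. This is a hypothetical .htaccess sketch; the /pdfs/ path is made up, and the exact rules will depend on your URL scheme and server setup.

```apache
RewriteEngine On

# 301 the no-trailing-slash version of a directory URL to the canonical
# slash version, so only one URL serves the content.
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^(.+[^/])$ /$1/ [R=301,L]

# Return 410 Gone for the retired PDF copies instead of a 404, telling
# Googlebot they were removed on purpose.
RewriteRule ^pdfs/.*\.pdf$ - [G,L]
```

Whether supplemental copies of the old URLs actually drop out faster with a 410 than a 404 is exactly the open question in these posts.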
Anyone see this before?
I am observing the same thing. Furthermore, my pages have been 301'd for 4 months, and the correct pages are listed as URL-only. The old versions are still listed with title/description, but as supplemental.