For my top target term there are still some differences between the Caffeine sandbox and .com results. It doesn't affect me, and looking at them dispassionately, the Caffeine results (subjectively) serve the user better: more of my significant real-world competitors appear, and fewer of the buy-their-way-to-the-top jokers.
[edited by: tedster at 5:52 am (utc) on Sep. 7, 2009]
Just make it live G!
These circles we are going in are getting smaller. FWIW I don't think that what is in the sandbox is ever intended to go "live". It is just a plaything for Google engineers' trial and error.
I'm personally accepting the working hypothesis that the new infrastructure has already gone live, 2 months ago(ish). It explains a lot of what I have seen recently.
Anyway, the first SERP for the keyword I am monitoring is totally not worthy of the top position.
While rank is steady for the top 3 sites in both engines, it is jumping around for positions #4+ in old Google, whereas the top 10 in G.Caffeine hasn't changed at all in the last 4 days.
So I'm wondering if G.Caffeine has settled out.
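For anyone tracking the same thing, here's a minimal sketch of how daily top-10 snapshots could be compared to judge whether an engine has settled. All URLs and helper names are hypothetical; it assumes you already have the ranked lists from somewhere (no scraping shown).

```python
# Compare successive daily SERP snapshots (ranked URL lists) and report
# which positions changed. Data below is invented for illustration.

def rank_changes(day_a, day_b):
    """Return {url: (old_pos, new_pos)} for URLs whose position moved.

    Positions are 1-based; None means the URL was absent that day.
    """
    pos_a = {url: i + 1 for i, url in enumerate(day_a)}
    pos_b = {url: i + 1 for i, url in enumerate(day_b)}
    changes = {}
    for url in set(pos_a) | set(pos_b):
        if pos_a.get(url) != pos_b.get(url):
            changes[url] = (pos_a.get(url), pos_b.get(url))
    return changes

def is_settled(snapshots):
    """True if every consecutive pair of daily snapshots is identical."""
    return all(a == b for a, b in zip(snapshots, snapshots[1:]))
```

Four identical days of Caffeine top-10s would make `is_settled` return True, while old Google's shuffling below position 3 would show up as a non-empty `rank_changes` dict each day.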
[edited by: tedster at 4:34 pm (utc) on Oct. 5, 2009]
"Caffeine was primarily an infrastructural change"
2 points in the first sentence.
He uses the word "was", NOT "is". or "will be".
He also uses the word "primarily", not "exclusively" or "only".
It obviously is NOT just an infrastructure change.
He has continually used the word "PRIMARILY", indicating it is/was also more.
I highly recommend the entire thread and interview with MC.
[edited by: tedster at 4:36 pm (utc) on Oct. 5, 2009]
However, the old Google has sites jumping up and down by 10 or more positions.
edit: having said that a google.co.uk query now has that # in their URL. (I'm searching from Thailand).
Test something small and non-critical, or just watch some pages that are the most divergent within those datasets, and figure out what you can.
Soon you'll be coming up with your own ideas to test that prove or debunk your intuitive thoughts.
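If you want to quantify "most divergent" rather than eyeball it, a simple footrule-style score (sum of absolute position differences over the union of URLs) works. This is just one hypothetical way to rank queries worth watching, not anything Google publishes:

```python
# Score how far apart two engines' top-N lists are for the same query,
# so the most divergent pages can be picked out for watching.

def divergence(live, caffeine, absent_rank=None):
    """Sum of absolute position differences over the union of URLs.

    A URL present in only one list is scored as if ranked at
    absent_rank (defaults to one past the longer list).
    """
    if absent_rank is None:
        absent_rank = max(len(live), len(caffeine)) + 1
    pos_l = {u: i + 1 for i, u in enumerate(live)}
    pos_c = {u: i + 1 for i, u in enumerate(caffeine)}
    return sum(abs(pos_l.get(u, absent_rank) - pos_c.get(u, absent_rank))
               for u in set(pos_l) | set(pos_c))
```

A score of 0 means the two engines agree exactly; the higher the score, the more that query's datasets have diverged and the more interesting it is to track.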
Maybe Shaddows will come out of the shadows (sry, couldn't resist) and add his analysis.
There are a lot of levels which could explain the observations - or am I just overcomplicating it?
I myself am seeing very similar results to the sandbox DC; however, I seem to be the only one.
Not the only one. ;)
Been seeing it off and on for over a month now.
I just get tired of debating the board about stuff that's ultimately there to help them.
It's becoming more obvious now, so we'll just wait until others catch up.
- That it got some very good links
- That it got out of a penalty
- That some kind of "this-site-is-too-young" filter was released
Ok, now you just need to figure out a way to test these theories.
Each test leads to new insights until you narrow down some key aspect that will help your sites.
Google just rolled out a completely new dataset across all DCs in about 4-5 hours!
People in the west are going to wake up and wonder what happened.
It was nearly seamless, with the 3 datasets cangoou pointed out being woven into each other like a basketball drill.
That was the Caffeine infrastructure in effect and it was quick and painless.
I suspect there will be some more movement, as I still see some "ghost datasets" not yet incorporated,
but when they do weave those in, it will happen quickly!
Go to sleep or out to lunch and you'll miss it completely.
I've been trying to resolve the different interpretations posited in this thread, especially by HissingSid and cangoou. I think Sid had it right:
FWIW I don't think that what is in the sandbox is ever intended to go "live". It is just a plaything for Google engineers' trial and error.
Specifically, either or both of the base data and the ranking algo are different from the live version, although somewhat similar. The sandbox seems to be a test of the store, fetch and batching processes. Thus, there will never be a "roll out" or converging datasets.
I'm sceptical that the live version is on the same infrastructure, for two reasons. The first is that the invitation to the sandbox makes no sense if its raison d'etre has already been fulfilled. The second is that the physical infrastructure overhaul of the actual datacentres would be a massive project - there should be definitive (or at least anecdotal) evidence of this.
I have absolutely no knowledge of large-scale distributed file systems. Could the mechanisms used for load balancing be re-purposed to push all traffic to a few upgraded centres, to mimic a complete DC upgrade?
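To make the question concrete, here's a toy sketch of the idea, assuming a weighted routing table in front of the datacentres. Everything here (DC names, weights, the router itself) is invented for illustration and has nothing to do with Google's actual setup:

```python
import random

# Toy weighted router: a datacentre with weight 0 receives no traffic,
# so zeroing out the old DCs would, from the outside, look like a
# complete upgrade even though only a few centres actually changed.
WEIGHTS = {
    "dc-upgraded-1": 5,  # hypothetical upgraded datacentre
    "dc-upgraded-2": 5,  # hypothetical upgraded datacentre
    "dc-old-1": 0,       # drained: no traffic routed here
    "dc-old-2": 0,       # drained: no traffic routed here
}

def pick_dc(weights, rng=random):
    """Pick a datacentre for one request, proportionally to its weight."""
    names = list(weights)
    return rng.choices(names, weights=[weights[n] for n in names], k=1)[0]
```

With those weights, every request lands on an upgraded centre, which is exactly the "mimic a complete DC upgrade" effect described above - at the cost of concentrating all load on the few upgraded machines.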
Over the past four days I've been on a bit of a link building campaign for two sites. Both are a couple of years old, and about a year ago they both fell out of the number one spots for various keyterms. I let it go because I was just messing with them part time and had other stuff going on. Four days later, I'm already seeing some strange effects from this link building in Caffeine.
Caffeine - A few hours ago it moved to the third page for its keyterm from way back in the SERPs somewhere. Right now, it's moved again, all the way to number four on the first page. It's been nowhere near this position in a year.
Current Live Google - No change
This site had never before ranked for the keyterm I targeted four days ago.
Caffeine - Page 1, 8th position
Current Live Google - No change
Neither of these sites has been under a penalty before; they just lost a lot of links over time and suffered the effects.
Granted, I'm not as full-time into studying the search engines as many of you, but I've never seen sites rise that fast. I didn't pick up links of such quality that they'd bounce a site like that so quickly either, at least not in the live engine. At least I'm pretty sure I didn't.
Whatever's going on, here's hoping that the Caffeine results I'm seeing become live at some point.