Forum Moderators: Robert Charlton & goodroi
I received a message last night discussing a buzz going around that Google is about to shake up how it deals with backlinks; specifically, that Google will give less weight to less relevant links. The same source also heard, from another direction, that a big change is coming in the next few days.
Anyone know anything?
Goog has no way to figure out "relevance" outside of the words used in the anchor text.
I'm sorry, but I don't buy that, although allinanchor:[search term] is currently returning almost identical SERPs to [search term] for the term I covet most.
I guess I'm searching for an answer that I can believe in. In a way it doesn't really have to be the whole truth, just something I can use as a direction. Something has changed, perhaps just for commercial terms, and I want to do something about it.
Cheers
Sid
Things do still seem to be in flux. One of the oddities I've been seeing the past few days has been the complete DIS-appearance of some URLs that have VERY on-theme backlinks - definitely makes you go hmmm... Reminds me of the way home pages sometimes go missing while the data churns.
whitenight - why tuesday? are you privy to some inside information?
Quite honestly, my sources are infinitely more reliable than any possible sources inside the 'Plex.
You can follow my predictions from the Nov update for more insights on the hows and whys of my data - Nov Update [webmasterworld.com]
There didn't appear to be any data-processing or data-set rebuilding.
There wasn't any obvious site suppression.
There has been no re-folding of data as yet (unless anyone else has seen churned sites staying at the top).
"Quality filters" do appear to have been missing, BUT there has been precious little in common between the muck that rose to the top - so what filter(s)?
Sites that were affected were only caught on a few terms. Can we posit that it was a semantic 'issue' rather than a page-scoring issue? i.e. G was comparing how words related to each other, rather than pages.
Suggestions, observations and analysis welcome.
It's not down to anything intrinsic to the site, page, domain or other 'structural' information. So it's not raw link volume, trust, PR, navigation (mega-menu or otherwise), schema, branding, 'good neighbourhood', link trading, server issues, hosting, scripting, or dynamic/static pages.
It COULD be co-occurrence, anchor text (multi-generation, not single), on-page semantic analysis, off-page semantic analysis, cross-page SA, natural language analysis, anchor text variance, or any number of language-based criteria that elude me.
But frankly, I don't think it was any of that.
What it looked like to me was a new penalisation model or mechanism. Lots of stuff that should be penalised from a QA POV was floating round the SERPs. At the same time, some other rankings were severely depressed, "-950" style.
The bunch of non-similar penalties applied in May could have been used as a control group, to see how they performed within the test environment.
Possibly things will be clearer when normality returns, but the lack of discernible patterns is frustrating.
Anything slightly less cryptic to share?
so it seems google has a basic (or old) dataset which is used as a fallback for when they update their algorithm. then the updated algorithm (or ghost dataset as whitenight calls it) takes weeks to run before it is added to the basic / old dataset. how does that sound?
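To make that hypothesis concrete, here is a toy sketch of the mechanism being described: queries keep serving from a stable fallback dataset while a new "ghost" dataset is computed, and results switch over only as the new data is folded in. This is purely illustrative of the poster's speculation; the class and method names are invented and nothing here reflects how Google actually works.

```python
# Toy model of the "fallback dataset + ghost dataset" hypothesis
# from this thread. Entirely hypothetical; names are made up.

class SerpServer:
    def __init__(self, fallback):
        self.fallback = fallback  # old, stable rankings per query
        self.ghost = None         # new rankings, still being built

    def start_update(self):
        # Rebuild begins; queries continue to be served from fallback.
        self.ghost = {}

    def fold_in(self, query, ranking):
        # New results are merged in query by query over days/weeks.
        if self.ghost is not None:
            self.ghost[query] = ranking

    def results(self, query):
        # Serve ghost data once it exists for this query, else fall back.
        if self.ghost and query in self.ghost:
            return self.ghost[query]
        return self.fallback.get(query, [])

server = SerpServer({"widgets": ["a.com", "b.com"]})
server.start_update()
print(server.results("widgets"))   # still the old data: ['a.com', 'b.com']
server.fold_in("widgets", ["c.com", "a.com"])
print(server.results("widgets"))   # new data after fold-in: ['c.com', 'a.com']
```

On this model, the "returns from the void" reported later in the thread would just be queries whose ghost data hadn't been folded in yet reverting to (or switching away from) the fallback set.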
I guess they missed that quality data for home pages once again.
They didn't "miss" them last time, remember?
(otherwise, how could i have predicted it?)
That was MC playing a little cat-and-mouse game (read: part truth, part FUD) with me/us on how these new updates do their funky "roll-in" nowadays.
It's ugly, but as before, all the goodness of the algo is right there for all to see.
So now's (the next 48 hours) the time to be getting screen captures to see what gets moved, removed, and added back in... for further analysis ;)
so it seems google has a basic (or old) dataset which is used as a fallback for when they update their algorithm. then the updated algorithm (or ghost dataset as whitenight calls it) takes weeks to run before it is added to the basic / old dataset. how does that sound?
It's close enough for the purposes of "not stressing out" :)
Glad you read it and understood.
It started oh...so... slowly on Sunday, which is why there have been some reports of "returns" in this thread.
A couple of my rankings for two main keywords returned from the void over the weekend on one of my sites. :) Took me by surprise, so there was definitely something afoot! Unfortunately, I still have a handful of other rankings from a different site that did not get their original positions back. I'm nervous about this next impending update - it's either more good news or the next set of bad news.
In this case I'm happy with the SERP because it pushes some "criticism" pages down below the fold for the first time. But on some searches, seeing so much top-level page real estate going to Universal Search might be problematic.