I'm starting this thread because another member suggested it would be a good idea: the main Google update thread is cluttered with posts like "OMG, I've been dropped in the new index!" and "Yippee, I'm now #1 on a key SERP". This thread is ONLY for serious, generic discussion of changes you are observing with the new algo in this update, things like "Looks to me like PR is less important this month, and anchor text of inbound links counts more." How your site is doing is not relevant here unless you can explain why you think so in terms of a general algo update.
This one? [webmasterworld.com...]
Is it possible that -sj or -fi has gone live in Canada? I still find it odd that, last time I looked, -fi was running on AOL. Even if AOL uses just one source, you would think Google would have ensured it was not -sj or -fi until they were ready...
But if people bolt to other SEs because these results are flaky, I'll have to rethink how to get more traffic, and it would have to involve less focus on big G...
We've been comparing -fi and -sj for the last week across the categories we watch, and while -sj seems to be holding its own versus www, we sure don't see that for -fi. I see numerous examples where the obvious first choice has fallen well off the top spot, or where spam has come out ahead. Over time that has to loosen G's grip on the search market.
So far it's just a concern, not a panic (assuming GG is right about -sj being the lead horse). To hedge my bets, though, I'm looking at expanding our other fronts in the traffic wars. ;-)
allanp73,
all I can say is YIKES! And it's really WAY too early for happy hour too... nothing to do but go back to work.
There are several clear errors in the database. First, the site descriptions are three months old or more. Second, the PR0 penalty for expired domains has been dropped. Third, several new sites included in the last index have gone. Fourth, new sites which have been visited by Googlebot are still not in the new index despite being around for two cycles.
To me, it looks like the underlying data used for generating this is about three months old, combined with some newer data and freshbot results. It looks like someone has tried to fix the problem by overlaying the newer results on the out-of-date index and has come up with a monumental mess.
It does NOT take eleven days for the results to move onto www. This smacks of damage control... and the damage is out. I've seen AOL using the faulty results, and I understand that some Yahoo results appear to be from the new database. It appears that Google is trying to limit the damage.
However, let's not be harsh. It really looks to me like they're having major problems with the database, but they're managing to isolate the bad results and appear to have stopped the rollout. This, I guess, is why Google's procedure is to roll out new results the way it does. Every update is a huge IT project and they have a good success rate... this looks like a partial failure that they're trying to control. That happens all the time in IT projects. Imagine doing Y2K every month, and I think you'll understand how the Googleplex must work.
Until now - since G accounts for about 60% of our traffic - G was the #1 focus as far as marketing planning and work effort go (about 60% of our time).
Now, I think we need to reverse that focus...G 40% , and ALL OTHER 60%!
If by some miracle this does not adversely affect G, it will be easy enough to return to current MO. But like I always say, if you don't have a PLAN B, you have no plan at all! (actually, I stole that ;-) )
Ironically, our traffic is up... I wonder if that means our sites are cr*p...
My traffic is up about 22% with just a couple of datacenters having the new data. The keyphrases driving my traffic have shifted noticeably. Traffic is entering at deeper pages that have lower PageRank but more content. PageRank still matters, but content seems to matter more. Many of the entry pages are larger too.
Talking to other webmasters in my area, large information-heavy sites have seen a spike in the last few days. I'm talking sites with mostly unique content on every page, and lots of it.
Manufacturers are doing well if they have information-heavy sites.
Retailers seem to have taken a good hit if they do not have genuinely unique content. They still show up in a lot of the SERPs, but their traffic has dropped enough to notice.
GG has confirmed that -sj and -fi are of a "different nature"... what does this mean? Presumably more than just tweaking their algos. Is the index being built differently now? More emphasis on freshness? Seems like a 'yes' to us, based on our viewing of recent Gbot visits (and we're not the first to speculate about that).
In addition to a *possible* move towards more emphasis on freshness, we think Google is testing/evaluating at least the following key algo elements (a toy sketch of the first two follows the list):
Variations in Importance of Backlinks
From where we sit, strong evidence that -fi is dampening certain kinds of backlinks versus -sj (motivations: greater relevance, less spam?)
Communities and Hubs
We see evidence of greater emphasis on backlinks between highly related sites
Spaminator!
Some of the categories we watch are more spammy than others. We see greater variation in the SERPs in the spammy categories than in the non-spammy ones. Hmmm. What does that tell you?
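To make the first two bullets concrete, here's a toy scoring sketch of what "dampening unrelated backlinks" and "boosting community links" could look like. The function, thresholds, and numbers are all made up for illustration; this is emphatically NOT Google's algo, just the shape of the change we suspect.

```python
# Toy model only -- NOT Google's actual algorithm. It just illustrates
# the speculation: links from off-topic sites get dampened, links from
# the same topical community get boosted.

def link_score(links, dampening=0.5, community_boost=1.5):
    """links: list of (source_pr, anchor_relevance, topic_similarity)
    tuples, where topic_similarity in [0, 1] says how related the
    linking site is to the target site. All values are invented."""
    score = 0.0
    for pr, anchor, sim in links:
        weight = pr * anchor
        if sim < 0.3:                 # unrelated site: dampen the link
            weight *= dampening
        elif sim > 0.7:               # same community/hub: boost it
            weight *= community_boost
        score += weight
    return score

# Two strong on-topic links now beat three strong off-topic ones:
print(link_score([(4, 0.9, 0.9), (3, 0.8, 0.8)]))   # 9.0
print(link_score([(5, 0.9, 0.1)] * 3))              # 6.75
```

If -fi is running something along these lines with harsher dampening than -sj, it would also explain why the spammy categories show the most churn.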
Anyone else? Let's avoid "my site dropped from #3 to... #333" and instead focus on WHY "my site dropped from #3 to... #333"
----------
MAJOR CAVEAT: We're looking at a minute number of sites compared to the total index, and we also know how complex the G algo is likely to be. There could be MANY other elements causing the changes we see; it's quite possible that none of our conclusions are correct...
Wackmaster, I would concur with the conclusion that there is variation between the -sj and -fi algos as far as various KINDS of backlinks go. More specifically, I think they are testing a *further* reduction in the importance of backlinks from not-very-related sites versus what was already in the algorithm.
This might also relate to the conclusion about communities. The sites we own that have fewer total links but more links from highly related sites seem to be doing better in both -sj and -fi.
I would add that I'm not at all convinced that 'freshness' (whatever that means) is having much of an impact on rankings...at least where I'm looking.
Actually, I understand the concept of freshness, freshie, everflux, etc...
I should have been more specific. What I meant was that I've been reading some speculation about this index putting more emphasis on freshness than in past updates. Well, what's freshness? A date line that updates every day? 5% new content each month? New sites?
G already seems to reward sites that update more frequently than those that don't.
My comment was meant to note that at least in my lone category on the Web, I see no evidence that site rankings have been impacted in any way by "freshness" other than what was already the case. But I have no clue what G might be doing in this update on that front that I'm not seeing or understanding.
I'll be more specific next time! ~~~ :-)
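To show what I mean about the term being fuzzy, here's one toy way to pin "freshness" down: a boost that decays with time since a page last changed. This is purely to make the question concrete; nobody outside the Googleplex knows what, if anything, G actually computes here, and the half-life number is invented.

```python
import math

# One hypothetical reading of "freshness" -- illustrative only,
# not a claim about what G does in this update.
def freshness_boost(days_since_update, half_life=30.0):
    """Starts at 1.0 for a page updated today and halves for every
    `half_life` days of staleness."""
    return math.exp(-math.log(2) * days_since_update / half_life)

print(freshness_boost(0))    # 1.0   -- updated today
print(freshness_boost(30))   # 0.5   -- a month stale
print(freshness_boost(90))   # 0.125 -- three months stale
```

Under a model like that, "a date line that updates every day" and "5% new content each month" would score very differently, which is exactly why the term needs pinning down before we argue about it.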
Since whatever is going on is not done, and the SERPs are in no way stable, how can we learn anything just yet?
We all have a bunch of theories, but there are still huge chunks of data that have not been factored in, so to draw any conclusions would be premature in my opinion.
I think Google is working on a rolling update instead of the monthly dance that we have all come to know and love.
They're just working out the bugs in how to calculate and merge Freshbot crawls into the database so that Freshbot acts more like a deepbot crawler. It would not surprise me to see backlinks and indexed pages start appearing on a weekly basis. Of course, that is just my opinion, but I also believe I read a post from GG somewhere where he said he would like to see a rolling dance as well.
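For what it's worth, here's the principle I have in mind, sketched in Python. This is a guess at the shape of a rolling merge, not a description of Google's pipeline, and the field names ('crawled', 'title') are invented.

```python
# Guess at how a rolling update could work: freshbot results overlay
# the last full deepbot index instead of waiting for a monthly rebuild.

def merge_fresh(deep_index, fresh_crawl):
    """Overlay fresh-crawl records onto the full index; newer data
    wins, and never-before-seen URLs get added immediately."""
    merged = dict(deep_index)
    for url, doc in fresh_crawl.items():
        old = merged.get(url)
        # ISO date strings compare correctly as plain strings.
        if old is None or doc["crawled"] > old["crawled"]:
            merged[url] = doc
    return merged

deep  = {"a.com/": {"crawled": "2003-04-01", "title": "old copy"}}
fresh = {"a.com/": {"crawled": "2003-05-05", "title": "fresh copy"},
         "b.com/": {"crawled": "2003-05-05", "title": "brand new"}}
print(merge_fresh(deep, fresh))
```

Run that weekly instead of monthly and you get the rolling dance: new pages and updated copies trickling in continuously rather than arriving in one big monthly update.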
Right now, we get to see differences between G's data centers. We THINK we know why some of the SERPs are different in our categories across the various data centers. If Sites A, B and C do much better than Sites D, E and F in -sj versus -fi, and we know what elements are common to A, B and C that are not present in D, E and F, then we have an idea of what they are testing.
If Sites A, B and C then rise in the SERPs once G truly updates, and the others drop, we can draw conclusions about where G is going. Works for us anyway.
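If it helps anyone replicate the method, here's the bookkeeping part in Python. The ranked lists are hand-entered from whatever you observe at each datacenter (the site names below are placeholders); nothing here queries Google itself.

```python
# Compare the same query's rankings at two datacenters (e.g. -sj vs
# -fi). Input lists are typed in by hand from the SERPs you observe.

def compare_datacenters(serp_a, serp_b):
    """Return {url: (position_in_a, position_in_b)}; None means the
    url did not appear in that datacenter's results."""
    pos_a = {url: i for i, url in enumerate(serp_a, start=1)}
    pos_b = {url: i for i, url in enumerate(serp_b, start=1)}
    return {url: (pos_a.get(url), pos_b.get(url))
            for url in set(pos_a) | set(pos_b)}

sj = ["siteA.com", "siteB.com", "siteC.com"]   # observed at -sj
fi = ["siteD.com", "siteA.com", "siteE.com"]   # observed at -fi
for url, (a, b) in sorted(compare_datacenters(sj, fi).items()):
    print(f"{url}: -sj #{a} -> -fi #{b}")
```

Once the real update rolls to www, rerun the comparison against the live SERPs and see which group of sites the final algo favored.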
Many of us old timers are lucky in that we have sites that have been around for a long time and are widely recognized. The links are established and the sites have not been badly affected by the loss of more recent links.
However, that is certainly not uniform and some important sites have fallen to varying degrees.
GG Quote >>Phrases like "gradually" and "over time" are cues that some types of data definitely can't be brought in overnight<<.
Fair enough, but some indication of timescale beyond that would certainly help many of the newbies significantly. Days? Weeks? Or longer? It would certainly help keep temperatures down a little if at least some sort of projection was possible. I do understand that might be difficult though.
On a more theoretical level, I assume we are largely talking about the links picked up on the last full crawl (and maybe slightly earlier ones)? Talking of which... what is the future of the crawl? OK, maybe you won't answer that (I don't really expect you to), but clearly the Fresh v. FullCrawl relationship is changing. That is probably the most interesting feature of this exercise.
I'm only a newbie, so my opinion probably isn't worth much, but that seems to correlate with my experience.
One of my two sites is suffering now, and that's despite more links (in the new index) and improved PageRank. What the site doesn't have is much text content. Two thirds of the pages on the site are photo pages with no text at all, and Google seems to have dropped them completely. (Or is that normal?)
I don't know if I'm right, but I'm stuffing text in anyway....