Forum Moderators: Robert Charlton & goodroi
Yes, this is exactly what we saw too, the same issues. It started with Bourbon; we fixed the Bourbon stuff and everything was fine. Then things started moving pre-Jagger, and the movement intensified: most sites dropped out, especially the spammy stuff. To me it was, and is, very clear that what dropped sites was not on-page factors, because they dropped in groups. Also, as many have pointed out, lots of hidden text and other on-page SEO survived fine.
However, that's just for the sites I'm watching. For other types of sites, on-page factors could very well be primary. I don't see that on any of mine, though.
This is why I don't see Jagger 1, 2, and 3 as unrelated to the two events between Jagger and Bourbon, and I tend also not to see Bourbon as unrelated, but it's too hard to connect the dots right now.
Having three updates one after another is fairly difficult to decipher, especially when stability was already damaged beforehand. Normally you can test ideas like you did, see the result, then move on to the next.
However, I do not believe Google can maintain this; it is too risky long term in terms of losing users to the competition. SERPs have to stay in some recognizable form for average users, or no loyalty will build.
In our situation, one clear difference I can see between the sites that dropped and the sites that didn't is very simple: more and better backlinks, by a factor of between 2 and 5.
This was obvious before Jagger, on Sept 22 for example, and those sites are the ones that survived Jagger.
One interesting thing though that I think is worth looking into: a certain type of site, this has been confirmed by many here and elsewhere, has seen no impact at all in its serp positions from before bourbon to now. This fairly clearly indicates a distinct line that the algo is not crossing in all these updates.
I would like to know where the defining edge of that line lies. How far into competitive serp land that is.
Yippee:
"At least most of the time :p"
I doubt even 'some of the time' would be close to the mark; doesn't keep me from trying, though.
Yes. It would certainly appear so!
CToS - "certain type of site" is way too long to keep repeating ;)
It would be excellent and very beneficial to define a CToS. The trouble is, as you previously mentioned, they appear to differ in their areas of dominance.
I don't think it's relevant to discuss age, registration term, etc., as it goes without saying that these sites are long-term across the board.
Although one could argue that the following figures are far from accurate as they were taken from a single stat source and therefore only represent a small portion of the actual totals, the same source was used for all sites, so there was at least this constant -- just covering my behind ;)
The top 10 sites I checked ranged from less than 2,000 to over 3 million backlinks, and unique IPs varied between 4,600 and 7,500. However, all these CToSes (have to work on the plural) remained as solid as a rock from Sept 22 onward, with perhaps the only exception being a very slight shuffle between them during Jagger2.
Of particular interest to me is that our own site exceeded some of these sites in these areas, yet we were sent packing. So, would this relate to the infamous "Trust Rank" debate, discussed in the initial part of this thread? If so, perhaps it would be worthwhile trying to identify and define this?
This fits me. My best and oldest site got the strongest hit (1/10th of before; it has existed since 1996/97 and has pre-Google backlinks, very funny indeed, with naturally grown backlinks from PR0 to PR7 until recently). Another PR6 site has taken a 50% hit with J3 [not as strong in English, but also pre-Google backlinks] :\. I don't really have content sites that started post-2000.
Can anyone confirm the German source, i.e. that G is softening the algo?
The German source also says Wikipedia, being fairly new, has therefore risen even more.
This is fine on the one hand [as in giving more people a shot], but can these new AdSense publishers be sure they won't be cut down when the next update comes, or the one after that?
This is of course a hypothesis but I think you have to see Google updates as updates that have to make more money, besides removing obvious spam.
It's absolutely true... I can't understand what was the purpose of making this update at all...
If the algo is softer, older web pages are not as stable anymore, and more sites get more traffic. These sites can then make more money. See it as: 100 sites x 100,000 PI is less than 100,000 sites x 1,000 PI. If these potential new publishers get hope of making money with AdSense, G has a bigger market than before. The bigger sites might have maxed out or run away to YPN. This way the update makes more money for G.
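The back-of-envelope comparison above can be checked directly. A minimal sketch, assuming "PI" means page impressions per site (my reading of the AdSense context):

```python
# Few large publishers vs. a long tail of small ones,
# using the numbers from the post above.
few_big = 100 * 100_000        # 100 sites at 100,000 PI each
many_small = 100_000 * 1_000   # 100,000 sites at 1,000 PI each

print(few_big)                 # 10,000,000 total impressions
print(many_small)              # 100,000,000 total impressions
print(many_small // few_big)   # 10: the long tail is 10x the ad inventory
```

So even though no individual small publisher earns much, the aggregate ad platform is an order of magnitude larger, which is the hypothesis being argued.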
This is a hypothesis. It's a discussion forum after all.
But I was trying to understand this money-wise. G is all about finding more platforms for ads. The above hypothesis generates a bigger platform for ads if smaller publishers get hope and are not always on page 50 in Google. Of course, none of these smaller publishers will ever make enough money for it to really be worth it, but as with Amazon, you see that doesn't really matter and the accumulator [Google] makes money. It worked for Amazon [Google has, or will soon have, product referral] and it tries to get away from MSN [Firefox?].
That's my guess. Happy to be convinced otherwise. :)
May I also add that Yahoo and MSN will probably do likewise in the future, so it's not really meant against G. Just thoughts on how to survive with them in the long run. :\ Short-term SEO can't really be the answer.
Sorry, I'm being lazy; there are so many posts in this thread and not all are really about the update, only whining or general comments (both of which are fine/legitimate from my perspective, though) :)
I said: " The quality sites/pages faring worse than ever."
Wrong. The intent of the comment was: SOME quality sites/pages faring worse than ever.
In many respects this is a good update. Many quality sites previously on page two or three of SERP's have risen.
My sole issue is that some sites are needlessly suffering, seemingly because of too many pages in Supps, wrongly assigned. This is presumably either because G has got it wrong on some legit sites where, for a variety of reasons, too many pages have seemed too similar, or because the external page issues are killing some pages/sites.
Both issues seem to be causing problems. And despite all of the many qualities of G, the other SEs are not suffering from these maladies.
The intent of the comment was: SOME quality sites/pages faring worse than ever.
Oh, that changes it a bit, much more interesting. Same question in a way as legalalien has:
However, all these CToSes (have to work on the plural) remained as solid as a rock from Sept 22 onward, with perhaps the only exception being a very slight shuffle between them during Jagger2. Of particular interest to me is that our own site exceeded some of these sites in these areas, yet we were sent packing. So, would this relate to the infamous "Trust Rank" debate, discussed in the initial part of this thread? If so, perhaps it would be worthwhile trying to identify and define this?
Much the same here. The main problem with trying to solve that riddle is that while many say their sites that didn't drop are 'white hat', 'non-SEOed', and 'quality, like other sites', to me, anyone posting in these forums has ideas and does things. You need to know for a fact ALL of the things done to a site that dropped before you can state that a 'quality site' dropped for no reason.
As we unravel one mess, I get more and more depressed hearing all those 'other things' that have been done. The drop I've seen is no mystery to me at all. The only mystery is which specific thing triggered the drop.
I haven't yet seen such a thing, a 'for no reason' drop, that is. There was another thread about this supposed case, but I couldn't see the site in question, so again it wasn't, and isn't, possible to say anything conclusive. So just what is the difference?
RE plural: I believe, though I can't find dictionary confirmation, that the plural of CToS is CToS, like fish and fish.
other SEs are not suffering from these maladies.
What was the name of the guy who wrote the Google song about something-or-other on his mobile phone? Perhaps he can help me write a song about being alone in the Google forum on a Friday night -- how sad is that!
Good morning LegalAlien and all
His name is Matt Waddell, Google Mobile Team
and the name of the song is:
Get lost and found on your phone
You can see on Matt's blog a link to the song
[mattcutts.com...]
Enjoy :-)
G has got it wrong on some legit sites, where for a variety of reasons, too many pages have seemed too similar
A variation on this theme that appeared to start pre-Jagger but persists: I have a site where G indexed thousands of pages via the IP address rather than the site name, and then appeared to apply a duplicate filter or other penalty to the real site, killing G traffic. It's not a great site and I'd experimented with other things, so I can't be sure this was the problem, but it's a strong candidate.
Why not have a site review process?
Google is now at that stage in a company's life that it is so big that it does not take individual complaints and requests for help seriously. They do not have to. They can just ignore you and you will either go away or go on using their service.
We are all just like public transportation and elevators. If they miss one of us there will be another right behind.
With time they, like all companies do, will pass. So make the best of what they offer now. Life goes on.
Can you imagine the effort involved with a site review process, and the arbitrary decisions needed to make that work?
Of course, in a sense it's all arbitrary. But as long as the big looming image of something nearly omnipotent exists, questioning the algo is harder. Once the man behind the curtain is revealed, it all seems much more fallible. ;-)
Also, another quick question: if I use allinanchor:, allintitle:, and/or allintext: in Google, my site shows roughly where it used to be pre-Jagger. Why, then, am I now a few pages down on many keywords? Someone mentioned yesterday I could be sandboxed (but I have never been before), or I could have been penalised (and I couldn't find a reason for that!)
E.g., GG and MC would probably say it was finished when all the code is out on a DC (and I assume that this is the case on the 66.102.9.104 DC).
However, from our point of view the update is over when all that code has had an effect on our websites. Now, if that code has led to a change in the indexing, then IMO it needs a (few) crawl(s) based on the new code for the new index to develop.
Which may be why there is normally a bit of flux after updates, and why people come back or drop even a while after an update and wonder why there is no update thread at WebmasterWorld.
That is me being very optimistic, though. The real me thinks what is being shown on 66.102.9.104 will pretty much be what we get.
>>That is me being very optimistic, though. The real me thinks what is being shown on 66.102.9.104 will pretty much be what we get.<<
I guess your very optimistic side is still valid. Just keeping in mind the flux that followed Bourbon and the changes which took place during it, should keep you and the affected fellow members on the optimistic side :-)