Something is now becoming clear, and I think it's time we put away the name "950 Penalty". The first people to notice this were the heaviest hit, and 950 was a descriptive name for those instances.
But thanks to the community here, the many examples shared in our monthly "Google SERP Changes" threads and in the "950 Penalty" threads themselves, we now can see a clearer pattern. The demotion can be by almost any amount, small or large -- or it might even mean removal from the SERP altogether.
It's not exactly an "OOP" and it's not the "End of Results" penalty. From the examples I've seen, it's definitely not an "MSSA Penalty" -- as humorous as that idea is. (Please use Google to find that acronym's definition.)
It's also not just a Local Rank phenomenon, although there are definitely some similarities. What it seems to be is some kind of "Phrase-Based Reranking" - possibly related (we're still probing here) to the Spam Detection Patent [webmasterworld.com] invented by Googler Anna Lynn Patterson.
So let's continue scrutinizing this new critter - we may not yet have it nailed, but I'm pretty sure we're closer. The discussion continues:
[edited by: tedster at 9:18 pm (utc) on Feb. 27, 2008]
So...I have no idea if this is something I've done, or something Google's done. I'm anxious to see how everyone else does with their sites now.
The big question, of course, is will it last or is this Google messing with our minds?
How's it going?
[edited by: tedster at 9:06 am (utc) on Feb. 8, 2007]
My site got hit with the 950 Penalty about the same time as everyone else, I think it was sometime in November-December? (I'm not where I can look back to see right now.) There are mentions of it in older threads here at WebmasterWorld.
I haven't made any major changes to my site; I've just been going over it looking for anything that could possibly be questionable. I did remove keywords from outgoing links to other sites, and I tried to reduce the number of keywords in the text. But I didn't change much, because the copy reads fine as it is, and shuffling words around just to cut keyword counts would read strangely to visitors, so I left most of it alone.
I can't really explain it, but I really feel the issues are not so much on the pages that were hit as on the site as a whole -- and there's no obvious pattern to which pages/search terms get hit.
This could be another bubble like we had last month, where all rankings were restored, and then they hit bottom again 48 hours later. If things seem to stabilize for a couple of weeks, I'll feel better. But I'm still going to go through my site looking for anything questionable and fix it. I'm sick of being at the bottom, below all the garbage.
My site is now back in the main index since Feb 1. I deleted approx. 20 internal links (90,000 total) from each deep page (.com/#*$!/#*$!/#*$!.htm) that linked between them, and only kept the links on those pages that point to the page one level up (.com/#*$!/#*$!.htm).
Imagine if all webmasters could do this on all their deep pages -- it would be much easier for the robots! And I don't see the relevance of all these links for the visitors anyway.
And I guess I don't understand how a link from an unrelated site could be beneficial. Didn't Google just roll out anti-Googlebombing to put a stop to this? So if you have a totally unrelated site about hair rollers, and it links to your Widget site, assuming a Widget has nothing to do with hair rollers, how could that be helpful?
This is very interesting to me because I think one of my sites has a lot of backlinks from theme-related pages but on unrelated sites. If Google applies this Local Rank by taking the top 1000 results and then only counting the backlinks that occur within those results and then applies the anti-googlebombing algo which may also discount links from unrelated sites then a lot of our links would be discounted.
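The Local Rank idea being discussed -- take the top results for a query, then count only the backlinks that come from other pages within that same result set -- can be sketched in a few lines. This is purely an illustration of the thread's description, not Google's actual algorithm; the function and data names are all hypothetical.

```python
# Hypothetical sketch of "Local Rank" as described in this thread:
# score each result by counting only inbound links whose source is
# also inside the top-N result set, then rerank by that local score.

def local_rerank(top_results, inbound_links, keep=1000):
    """Rerank results using only links from within the result set itself.

    top_results   -- list of URLs, ordered by the original ranking
    inbound_links -- dict mapping URL -> set of URLs that link to it
    """
    local_set = set(top_results[:keep])

    def local_score(url):
        # Count only backlinks whose source is also in the result set;
        # links from unrelated, off-set sites are simply ignored.
        return len(inbound_links.get(url, set()) & local_set)

    # Stable sort: ties keep their original ranking order.
    return sorted(top_results[:keep], key=local_score, reverse=True)


results = ["a.com", "b.com", "c.com"]
links = {
    "a.com": {"offsite.com"},     # backlink from outside the set: ignored
    "b.com": {"a.com", "c.com"},  # two links from within the set
    "c.com": {"b.com"},           # one link from within the set
}
print(local_rerank(results, links))  # → ['b.com', 'c.com', 'a.com']
```

Under this sketch you can see the worry raised above: a page whose backlinks all come from theme-related pages on sites *outside* the top 1000 scores zero locally, no matter how many links it has.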
Our rankings dropped for all the search terms we ranked high for. Some terms we weren't ranked high for did rise a bit, but not anywhere near enough to bring traffic. Earlier we had a drop that showed on only some datacenters and then came back, but now it's across all datacenters, so I'm worried it's a real penalty.
Another question about local rank: In a competitive industry, would it not be unusual to have competitors linking to each other? So if you take the first 1,000 results on a search term, let's say in a very competitive industry, if those pages are all competing companies, it would be unusual to find a lot of linking to competition going on. After all, if you're trying to sell insurance, you don't want to refer visitors to another insurance site. So, how would local rank be applied in this case? It would seem very few peer sites would be linking to each other.
I've heard many times that linking between competitors is not natural and to stay away from it. On the other hand, there will be a lot of related sites within those top 1000 results that aren't competitors -- informational sites and trusted directories, for instance. Those are important places to have a link from, because you want links from sites that talk about your industry, and they should come up somewhere in the top 1000.
My site's rankings all dropped this past Monday Feb 5th across all datacenters. Did anybody else see this? It seems like everyone else was hit before.
My gut is telling me that this isn't really a penalty, it's an interactive effect of the way the Google dials have been turned in their existing algo components. It's like getting a poor health symptom in one area of your body from not having enough of some important nutrient -- even though you've got plenty of others and plenty of good health in many ways.
And that symptom is bad Local Rank?
If it's Local Rank then why would people be just fine for certain SERPs and dead in the water for others? I don't see why we would have more Local Rank inbounds for Red Widgets than we would for Green Widgets.
And even then, wouldn't we just slide down 50 spots, as opposed to 500 spots? I mean, that's a huge drop.
[edited by: ALbino at 11:24 pm (utc) on Feb. 7, 2007]
In one case I know of, the signs of this problem disappeared with one solid new inbound link from a very different domain, with the problematic phrase used as the anchor text. By "very different" I mean the linking domain was not in the 1,000 for the given search.
So, not less "SEO" fixed it, but more. The purely OOP assumptions don't sit right with me, given this anecdotal result. Now it's only one case, so it's certainly not "proof" of anything, but the fix has now been stable for several data refreshes at Google, so it is a decent data point to keep in mind.