Something is now becoming clear, and I think it's time we put away the name "950 Penalty". The first people to notice this were the heaviest hit, and 950 was a descriptive name for those instances.
But thanks to the community here, the many examples shared in our monthly "Google SERP Changes" threads and in the "950 Penalty" threads themselves, we now can see a clearer pattern. The demotion can be by almost any amount, small or large -- or it might even mean removal from the SERP altogether.
It's not exactly an "OOP" and it's not the "End of Results" penalty. From the examples I've seen, it's definitely not an "MSSA Penalty" -- as humorous as that idea is. (Please use Google to find that acronym's definition.)
It's also not just a Local Rank phenomenon, although there are definitely some similarities. What it seems to be is some kind of "Phrase-Based Reranking" - possibly related (we're still probing here) to the Spam Detection Patent [webmasterworld.com] invented by Googler Anna Lynn Patterson.
So let's continue scrutinizing this new critter - we may not yet have it nailed, but I'm pretty sure we're closer. The discussion continues:
[edited by: tedster at 9:18 pm (utc) on Feb. 27, 2008]
So...I have no idea if this is something I've done, or something Google's done. I'm anxious to see how everyone else does with their sites now.
The big question, of course, is will it last or is this Google messing with our minds?
How's it going?
[edited by: tedster at 9:06 am (utc) on Feb. 8, 2007]
My site got hit with the 950 Penalty about the same time as everyone else, I think it was sometime in November-December? (I'm not where I can look back to see right now.) There are mentions of it in older threads here at WebmasterWorld.
I haven't made any major changes to my site, just gone over it looking for anything that could possibly be questionable. I did remove keywords from outgoing links to other sites and tried to reduce the number of keywords in the text, but I didn't change much, because the text reads fine as it is and rewording it would seem strange to site visitors, so I left most of it alone.
I can't really explain it, but I really feel the issues are not so much with the pages that got hit as with the site as a whole, and there's no explanation as to which pages/search terms get hit.
This could be another bubble like we had last month, where all rankings were restored, and then they hit bottom again 48 hours later. If things seem to stabilize for a couple of weeks, I'll feel better. But I'm still going to go through my site looking for anything questionable and fix it. I'm sick of being at the bottom, below all the garbage.
My site is now back in the main index. Since Feb 1, I deleted approx. 20 internal links (total of 90,000) from each deep page (.com/#*$!/#*$!/#*$!.htm) that linked between them, and only kept the links on those pages that point up one level (.com/#*$!/#*$!.htm).
Imagine if all webmasters could do this on all their deep pages; it would be so much easier for the robots! And I do not see the relevance of all these links for visitors.
AndyA said:
And I guess I don't understand how a link from an unrelated site could be beneficial. Didn't Google just roll out anti-Googlebombing to put a stop to this? So if you have a totally unrelated site about hair rollers, and it links to your Widget site, assuming a Widget has nothing to do with hair rollers, how could that be helpful?
This is very interesting to me, because I think one of my sites has a lot of backlinks from theme-related pages that sit on unrelated sites. If Google applies this Local Rank by taking the top 1000 results and counting only the backlinks that occur within those results, and then applies the anti-Googlebombing algo, which may also discount links from unrelated sites, then a lot of our links would be discounted.
Our rankings dropped for all the search terms we ranked high for; some terms we weren't ranked high for did rise a bit, but still nowhere near enough to bring traffic. Earlier we had a drop that was only on some datacenters and then came back, but now it's across datacenters, so I'm worried it's a real penalty.
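To make that Local Rank idea concrete, here is a minimal Python sketch of counting only the backlinks that come from other results in the same top-1000 set and re-sorting on that count. The data shapes and the `backlinks_of` lookup are illustrative assumptions, not anything Google has published.

```python
# Hypothetical sketch of the "Local Rank" idea discussed above: only backlinks
# coming from domains that already appear in the top-N result set count toward
# the re-ranking. All names and data structures here are illustrative.

def local_rank_score(page, top_results, backlinks_of):
    """Count inbound links to `page` that originate from other results
    in the same top-N set."""
    top_domains = {r["domain"] for r in top_results if r["url"] != page["url"]}
    inbound = backlinks_of(page["url"])  # all known backlinks to the page
    return sum(1 for b in inbound if b["source_domain"] in top_domains)

def rerank_by_local_score(top_results, backlinks_of):
    # Pages whose links come from inside the result set float up; pages whose
    # links all come from unrelated, outside sites gain nothing here.
    return sorted(
        top_results,
        key=lambda r: local_rank_score(r, top_results, backlinks_of),
        reverse=True,
    )
```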
Another question about local rank: in a competitive industry, wouldn't it be unusual to have competitors linking to each other? If you take the first 1,000 results on a search term in a very competitive industry, and those pages are all competing companies, it would be unusual to find a lot of linking to the competition going on. After all, if you're trying to sell insurance, you don't want to refer visitors to another insurance site. So how would local rank be applied in this case? It would seem very few peer sites would be linking to each other.
I've heard many times that linking between competitors is not natural and to stay away from it. On the other hand, there will be a lot of related sites within those top 1000 results that aren't competitors, like informational sites and trusted directories, and those are important to have links from, because you want links from places that talk about your industry, and they should come up somewhere in the top 1000.
My site's rankings all dropped this past Monday Feb 5th across all datacenters. Did anybody else see this? It seems like everyone else was hit before.
My gut is telling me that this isn't really a penalty, it's an interactive effect of the way the Google dials have been turned in their existing algo components. It's like getting a poor health symptom in one area of your body from not having enough of some important nutrient -- even though you've got plenty of others and plenty of good health in many ways.
And that symptom is bad Local Rank?
If it's Local Rank then why would people be just fine for certain SERPs and dead in the water for others? I don't see why we would have more Local Rank inbounds for Red Widgets than we would for Green Widgets.
And even then, wouldn't we just slide down 50 spots, as opposed to 500 spots? I mean, that's a huge drop.
[edited by: ALbino at 11:24 pm (utc) on Feb. 7, 2007]
In one case I know of, the signs of this problem disappeared with one solid new inbound link from a very different domain, with the problematic phrase used as the anchor text. By "very different" I mean the linking domain was not in the 1,000 for the given search.
So, not less "SEO" fixed it, but more. The purely OOP assumptions don't sit right with me, given this anecdotal result. Now it's only one case, so it's certainly not "proof" of anything, but the fix has now been stable for several data refreshes at Google, so it is a decent data point to keep in mind.
Getting links from reputable and relevant sites is usually a good idea, right? It proved to work out well in this one case.
Well it certainly can't hurt, but we're talking about trying to find thousands of inbounds from related sites. That's no easy task.
Could this be related to over-optimizing inbound anchor text?
I thought it couldn’t be on the missing pages as they have very few inbound links other than scraper links. Most of their PR comes from a very strong homepage that is inbound linked widely in the field.
It has finally occurred to me that with few inbounds and so many scraper links, it may look like over-optimized inbound anchor text. In other words, could the scraper sites be causing an over-optimization situation on these pages? They usually have identical anchor text, as they just take the page title.
Nick, I’m so glad I re-read what you said today. You may be on to something here. Google has no way of knowing we didn’t solicit those scraper links. So pages with few inbounds could be hurt by several scraper links.
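One quick way to check whether scraper links are skewing a page's inbound anchor profile is to look at how concentrated the anchors are. A rough Python sketch, assuming you have a backlink export with one anchor text per line (the file name and the 80% flag level are just illustrations):

```python
# Rough check of anchor-text concentration for a page's inbound links.
# Assumes a backlink export with one anchor text per line; the 80% threshold
# is an arbitrary illustration of "almost all anchors are identical".
from collections import Counter

def anchor_concentration(anchors):
    counts = Counter(a.strip().lower() for a in anchors if a.strip())
    if not counts:
        return "", 0.0
    top_anchor, top_count = counts.most_common(1)[0]
    return top_anchor, top_count / sum(counts.values())

anchors = open("backlink-anchors.txt", encoding="utf-8").read().splitlines()
anchor, share = anchor_concentration(anchors)
print(f"Most common anchor: {anchor!r} ({share:.0%} of all inbound anchors)")
if share > 0.8:
    print("Anchors are heavily concentrated; scrapers reusing the page title "
          "could look like over-optimized inbound anchor text.")
```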
No, and again generalizing isn't helpful here. Some pages don't rank for anything. Some do rank for some terms but not others.
In all cases, though, a page is penalized. The way that page is penalized can vary (either total death or only partial death).
Those are NOT merchants selling products, they are not even decent affiliate sites with original "pre-sell" content. They are automated, cranked out pages with OPS. Where's the value added?
Will someone please tell me what the "compelling content" there is on those sites?
Here's some general advice:
A) Use the new sitemaps link tool to identify if one particular page is being linked to from a ton of other pages on one single site. This can cause an inbound anchor text OOP.
B) Look at title, meta desc, h1's and body content - run checks on keyword density - you may well find that mixing things up, removing mentions of the terms or separating the keyword combinations, combined with rewriting the title and meta desc, will do the trick.
Granted, I tried this just a few days ago on one other page which has been hit for about 6-7 months, so I'll let you know how that one goes, and this page that has just been released may not stick around. But for now, it seems to have done the trick. I've had success with one other page in the past with similar principles: general deoptimisation of onpage factors. It hasn't been ranking as high as it used to, but it's better than having zero traffic to it at all.
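For point B above, a crude keyword-density check is easy to script. A Python sketch, where the 5% flag level and the sample file name are purely illustrative, not any known Google threshold:

```python
# Crude keyword-density check of the sort mentioned in point B above.
# The 5% flag level is an arbitrary illustration, not a known Google limit.
import re

def keyword_density(html, phrase):
    text = re.sub(r"<[^>]+>", " ", html)            # strip tags
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return hits * n / len(words)

page = open("widget-page.html", encoding="utf-8").read()  # hypothetical page
for kw in ("red widgets", "blue widgets"):
    d = keyword_density(page, kw)
    note = "  <-- consider rewording" if d > 0.05 else ""
    print(f"{kw}: {d:.1%}{note}")
```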
Here's some general advice:
A) Use the new sitemaps link tool to identify if one particular page is being linked to from a ton of other pages on one single site. This can cause an inbound anchor text OOP.
I think you mean internal anchor text OOP.
Anyways, I had a site drop to 950 for 5 days or so, then it came back with nothing changed.
Another site had a page drop, with all the other important rankings unchanged, then we removed it from the navigation menu and it came back almost overnight. Put it back in navigation and it dropped out almost overnight as well.
Third case scenario: we created a new sub-domain. Mostly unrelated, but for the demographic it makes sense. The 40 or so pages did great out of the box. Four days later it went 950+. I'm not going to make any changes just yet, but will get 20-30 inbound links (low to medium quality) over the next 15 days to see how it goes.
Oh, and of note on that third case, it dropped about 1.5 days after we added a script that linked to one of the 40 pages from every 3rd blog post on the main domain. That was about 900 links or so. It used the same anchor text as the anchor text used on the sub-domain's navigation. Now, we've switched it to use each page's H1 as anchor text.
We'll see.
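For anyone wanting to try the same anchor-text change described above (using each target page's H1 instead of one fixed anchor), here is a minimal Python sketch. The file paths and markup are made up for illustration:

```python
# Minimal sketch of the anchor-text change described above: instead of reusing
# one fixed anchor for every generated link, pull each target page's <h1> and
# use that as the anchor text. Paths and markup are illustrative only.
import re

def h1_anchor(target_path, target_url, fallback="Read more"):
    html = open(target_path, encoding="utf-8").read()
    m = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.IGNORECASE | re.DOTALL)
    text = re.sub(r"<[^>]+>", "", m.group(1)).strip() if m else fallback
    return f'<a href="{target_url}">{text}</a>'

# e.g. the link dropped into every third blog post now varies per target page
print(h1_anchor("subdomain/widget-guide.html",
                "http://sub.example.com/widget-guide.html"))
```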
[edited by: tedster at 9:12 am (utc) on Feb. 8, 2007]
The inbound link OOP is probably more common and the whole "vary your anchor text" rule is very important to avoid this.
I'd say the best bet for most people is to completely deoptimize pages in terms of onpage SEO. If and when your ranking comes back, you can always go and slowly start tweaking it again.
[edited by: Nick0r at 1:52 am (utc) on Feb. 8, 2007]
Some kind of last step that re-ranks the original results.
And could this last step be related to the patent on detecting spam through phrases, as discussed here? [webmasterworld.com...]
Will someone please tell me what the "compelling content" there is on those sites?
The hopeful part of this happening is that maybe Google will see it isn’t working and change the algo balance or whatever it is that is causing this problem.
I pulled out a few key sections.
(Note: "good phrase" here means topically predictive)[0218] ... If there are a minimum number of good phrases which have an excessive number of related phrases present in the document, then the document is deemed to a spam document.
[0223] If the document is included in the SPAM_TABLE, then the document's relevance score is down weighted by predetermined factor. For example, the relevance score can be divided by factor (e.g., 5). Alternatively, the document can simply be removed from the result set entirely.
[0224] The search result set is then resorted by relevance score and provided back to the client.
In addition, the inbound anchor text scoring described in the patent could boost the document out of the danger zone if even a single new IBL shows up.
You can get your page included in the spam table by having too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy and anchor text that might be worth an experiment.
The threshold for "too many" can also be fine-tuned in this algorithm, which would create the "popping in and out of the penalty" quality that some people report.
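Putting those excerpts together, here is a simplified Python sketch of the re-ranking step described in [0218], [0223] and [0224]. The divide-by-5 factor comes from the patent's own example; the phrase-count thresholds and data shapes are assumptions for illustration, and tuning those thresholds is exactly what could produce the popping in and out that people report.

```python
# Simplified sketch of the re-ranking described in [0218]/[0223]/[0224]:
# a document with an excessive number of related phrases for enough "good
# phrases" goes into the spam table; its relevance score is divided by a
# factor (the patent's example uses 5) or it is dropped entirely, and the
# result set is then resorted. Thresholds and data shapes are assumptions.

RELATED_PHRASE_LIMIT = 20   # "excessive number of related phrases" (tunable)
GOOD_PHRASE_MINIMUM = 3     # "minimum number of good phrases" (tunable)
DOWNWEIGHT_FACTOR = 5       # example factor from [0223]

def is_spam_doc(doc):
    # doc["related_phrase_counts"]: good phrase -> number of its related
    # phrases found in the document
    flagged = sum(1 for count in doc["related_phrase_counts"].values()
                  if count > RELATED_PHRASE_LIMIT)
    return flagged >= GOOD_PHRASE_MINIMUM

def rerank(results, remove_entirely=False):
    kept = []
    for doc in results:
        if is_spam_doc(doc):
            if remove_entirely:
                continue                     # drop from the result set
            doc = dict(doc, relevance=doc["relevance"] / DOWNWEIGHT_FACTOR)
        kept.append(doc)
    return sorted(kept, key=lambda d: d["relevance"], reverse=True)
```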
[edited by: tedster at 8:18 am (utc) on Feb. 8, 2007]
The hopeful part of this happening is that maybe Google will see it isn't working and change the algo balance or whatever it is that is causing this problem.
I'm not sure Google's criteria for "working" will be the same as the individual website owners'. They're more likely to look first at measures of end user satisfaction based on what URLs are included in the top ten, not at what is excluded.
However, part of the Google team does look to minimize false positives and the like, so there certainly is hope.
[0223] If the document is included in the SPAM_TABLE, then the document's relevance score is down weighted by predetermined factor. For example, the relevance score can be divided by factor (e.g., 5). Alternatively, the document can simply be removed from the result set entirely.
[0224] The search result set is then resorted by relevance score and provided back to the client.
In addition, the inbound anchor text scoring described in the patent could boost the document out of the danger zone if even a single new IBL shows up.
It's obviously phrase based, and while there may be more factors involved, that still remains a likely suspect. It takes analyzing and isolating the particulars, which is no easy task, because it's complex - no doubt by design.
What is of major concern, imho, is what looks to be the possibility of something being tripped by usage of anchor text/headings/titles - not only because of the "normal" scraper pages, but because some are doing even more than just duplicating strings of text in anchors, and are now playing around with swiping full pages.
[edited by: Marcia at 6:55 am (utc) on Feb. 8, 2007]
Out of thousands of pages on our site, one example page that's been hit has only 7 internal links pointing to it from the entire site. I imagine this is true for most of the pages that have been hit (5-15 internal links to those pages).
What am I missing? Outside scraper links I suppose?
Alternatively, the document can simply be removed from the result set entirely... The search result set is then resorted by relevance score and provided back to the client.
I wonder if this might also be related to the growing number of result sets I've been seeing that stop short of what the number of results promises.
The 950 penalty for example is clearly a penalty of a very specific, easily identifiable type.
It is also quite clearly applied deliberately/correctly, and also very often mistakenly (and toggles on and off).
The 950 demotion is not "almost any amount" obviously. It's to a specific point at the end of the results, where all pages affected by it are grouped together, so let's not wave our arms out of a desire to make one penalty cover everything.
I'm willing to back out of the "threadjacking" as you call it, if evidence accumulates that there really is a 120 penalty, and a 348 penalty, and a 950 penalty, and they're all different. We're not publishing a book here after all, we're having an ongoing discussion among many people. Right now it really looks to many like all these ranking demotions are just the many faces of one basic phenomenon, and the old label just didn't cover the discussion properly.
If someone wanted to throw the -31 penalty into the pile, I'd say no, that one's clearly different.