| 4:54 am on May 14, 2007 (gmt 0)|
After a lot of thought, I don't think I'd be willing to make too many changes all at once. It might be just one very small, very correctible thing that's triggering a thumbs down on a page, and if multiple changes are made all at once there's no way to analyze the results or consequences.
No, there is no clear indication of exactly what's happening, but there can be no doubt that there's more than just one thing happening. It's very unusual that ONE thing all by itself can have such profound effects on such a broad level, affecting so many different types of sites of all different sizes across so many verticals.
In particular, I wouldn't take a chance on monkeying around with outbound links unless it's removing the ones that go to bad neighborhoods or very inappropriate sites.
Try putting a nofollow attribute on your recip links page and wait a week to see if it has an effect.
90% of the sites that are in the -950 penalty either were involved in link exchanges or have recip links pages. That is the common factor.
That takes the grand prize as being the most irresponsible advice that's been given since this whole 950+ thing started. Doing that could have VERY serious negative consequences down the road, not for any single page or phrase but for entire sites.
If a site has been penalized (site, not page) for being link spammers, it's an entirely different issue, and it's a small minority of sites in the final results pages; they should have been caught long ago. Maybe they were, but nobody else noticed them MIA for what they were ranking for.
I challenge anyone to show us Walmart's reciprocal links page. Or Target's. Or JCPenney's. Or Amazon's link page. Or the Google Directory's link page. Or Bizrate, or Priceline, or eBay. Or the dozens of independent sites that have ONLY one-way inbound links, do not link out at all, and have tons of other pages ranking well (including many top-ten or #1 rankings), yet still have SOME of their pages in the 400's and 900's: redundant pages that should be clustered out of top rankings.
And the top ten results of a HUGE number of searches across multiple verticals most certainly DO have reciprocal links - many of them. And bought links too, but that's another issue. ;)
I would seriously suggest reading up on what those papers have to say about the primary and supplemental index and how they're being partitioned. There might just be a grain of truth in there about what's actually happening, since the infrastructure of the secondary index did, in fact, undergo a change, and so did the procedures for indexing.
Check into things further before cheating legitimate, on-topic link partners by removing quality outbound links, and risking losing the quality recips that were helping the site rank well once those partners find out they've been cheated. Then dig into your pockets and go find your friendly neighborhood link broker.
| 9:00 am on May 14, 2007 (gmt 0)|
Just as a side note: I installed round-robin DNS about two weeks before the 950, going from two Squids to three. The third Squid was not on the same IP subnet as the previous two. Since I use Germany's biggest provider, this could have collided with backlinks being devalued when they're hosted with the same ISP, which many are. It also sped things up considerably, which probably means shorter response times as well; user retention went up, as did pages per visitor. I have now thrown the third Squid out and use it as a database server. Garglebot is back pulling pages as if there's no tomorrow.
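For readers unfamiliar with the setup described above: round-robin DNS is just multiple A records for the same hostname, which resolvers rotate through. A minimal zone-file sketch (hostnames, TTLs, and IPs are all hypothetical, using documentation address ranges) might look like:

```
; hypothetical zone fragment: three Squid front-ends served round-robin
www   300   IN   A   192.0.2.10     ; squid 1
www   300   IN   A   192.0.2.11     ; squid 2 (same subnet)
www   300   IN   A   198.51.100.7   ; squid 3 (different subnet, as in the post)
```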
| 12:20 pm on May 14, 2007 (gmt 0)|
|I challenge anyone to show us Walmart's reciprocal links page. Or Target's. Or JCPenney's |
| 6:56 pm on May 14, 2007 (gmt 0)|
Interesting discussion. We had a large number of pages (50,000+) with internal links using similar, overly-optimized anchor text. We have been 950'd for some time. We removed the optimized internal anchor text months ago, but remained 950'd due to the large number of pages with the rogue phrases remaining in the index. Yesterday we submitted a URL removal request (i.e., showarticle.php), 301'd the old links to a new URL, showinfo.php, and today we are already out of the 950 filter. Of course it will take several weeks or months for the new showinfo.php URLs to be completely indexed, but this solution worked for us.
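The 301 step described above could be done with an Apache .htaccess rule; the showarticle.php/showinfo.php names come from the post, but how the article IDs were passed is an assumption:

```apache
# hypothetical: permanently redirect showarticle.php to showinfo.php;
# Apache carries the ?id=... query string over to the target automatically
RewriteEngine On
RewriteRule ^showarticle\.php$ /showinfo.php [R=301,L]
```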
| 8:43 pm on May 14, 2007 (gmt 0)|
Thanks for sharing your story in this post, it's very interesting.
I want to ask: when you did a site: search, were the pages with the anchors you removed in the RS?
| 10:33 pm on May 14, 2007 (gmt 0)|
Quick recovery from -950
I listed my -950 URL in a good paid directory, and after 15 days my -950 page is on the first page of Google for the same search.
Should I do the same for my other 200 URLs at -950?
| 11:36 pm on May 14, 2007 (gmt 0)|
We severely de-optimized a site that had been 950'ed for a couple of months now... the work was completed this morning, and the site is back already, which seems way too fast.
The latest change before that was in mid-March, but with the cache updating every 5 days, it seems unlikely those changes would be the reason. I expect it to go back to 950, but if it stays out even past the next cache update, then that presents a tough choice for webmasters.
| 12:05 am on May 15, 2007 (gmt 0)|
So, how do you "de-optimize" thousands of articles?
I don't even know what to do (and adding content didn't change anything).
| 12:25 am on May 15, 2007 (gmt 0)|
I came back today too. I think there was some kind of update.
| 2:10 am on May 15, 2007 (gmt 0)|
I'm thinking that too. Digging back into that patent: the "significantly exceeding" score that determines whether a page gets thrown into the spam_table must either have been tweaked to allow for more content (it's possible my changes just happened to get spidered right after we made them, but I doubt it), OR the co-occurrence levels changed with the exclusion of some stupid documents that probably shouldn't be in the localset.
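For what it's worth, the "significantly exceeding" test being discussed is usually read out of Google's phrase-based indexing patents roughly as: count how many of a phrase's statistically related phrases appear in a document, and flag the document when that count far exceeds what normal documents exhibit. Here is a hedged Python sketch; the counting method, the expected counts, and the threshold are all invented for illustration and reflect only one reading of the patents:

```python
# Hedged sketch of the "significantly exceeding" spam test, as commonly
# read out of the phrase-based indexing patents. Numbers are illustrative.

def related_phrases_on_page(page_text, related_phrases):
    """Count how many of a phrase's known related phrases appear on the page."""
    text = page_text.lower()
    return sum(1 for p in related_phrases if p.lower() in text)

def looks_like_phrase_spam(related_phrase_count, expected_count, threshold=5.0):
    """Flag a page when its related-phrase count 'significantly exceeds'
    the expected count for honest documents on the topic."""
    if expected_count <= 0:
        return False
    return related_phrase_count / expected_count > threshold
```

Under this reading, de-optimization helps only by pulling the actual count back under the threshold, while Google retuning the expected counts (the "localset" change speculated about above) could release pages with no webmaster action at all.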
|So, how do you "de-optimize" thousands of articles? I don't even know what to do (and adding content didn't change anything). |
Other than a snarky "Ctrl+F, Ctrl+R" comment... that is an incredibly tough task that I don't wish on anybody. If de-optimization were in fact the cure for this crazy situation, it would either have to be done on an article-by-article basis (starting, obviously, with the ones you consider most important), or you could hit just a few and let the rest age, hopefully attracting more relevant links to what I'd assume is well-written content that naturally attracts links.
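If someone really did attempt the Ctrl+F, Ctrl+R route in bulk, it might look something like the sketch below; the phrase list, the replacement wording, and the keep-one-mention heuristic are all hypothetical:

```python
import re

# Hypothetical over-optimized phrases mapped to toned-down replacements.
REPLACEMENTS = {
    "blue widgets": "our widgets",
    "cheap blue widgets": "these products",
}

def deoptimize(text, replacements):
    """Replace every occurrence of each phrase AFTER the first,
    so one natural mention survives on the page."""
    for phrase, substitute in replacements.items():
        pattern = re.compile(re.escape(phrase), re.IGNORECASE)
        matches = list(pattern.finditer(text))
        # Rewrite from the end so earlier match offsets stay valid.
        for m in reversed(matches[1:]):
            text = text[:m.start()] + substitute + text[m.end():]
    return text
```

A batch runner would simply walk the article files and rewrite each one, which is exactly why the posters above treat it as a last resort: it is indiscriminate by design.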
| 3:26 am on May 15, 2007 (gmt 0)|
It seems to me Google is definitely getting better at not mistakenly applying 950 penalties.
They still are making mistakes for sure, but I'm seeing a lot being fixed. (And it has little to do with webmasters doing anything at all.)
| 5:06 am on May 15, 2007 (gmt 0)|
The right de-optimization may well be the fix in a lot of cases, if you can zero in on the exact troublemaker area. But I certainly don't recommend a scattershot approach. What is "de-optimization," after all? I've read of people removing a keyword that occurred only once from the title or H1 element!
Many of the accounts of successful penalty removals also describe many actions taken at once. Some of those actions sound to me like they make the site worse for the visitor. And even though the -950 penalty went away in a few days, as steveb mentioned, Google may well have made the critical change, not you. Google is always tweaking away at their very complex algo.
Suppose you make 6 kinds of changes at once, and then the penalty vanishes -- but thousands of other sites also saw their penalties vanish in that same Google data set. Did you really help yourself? And if so, which of the 6 steps did it? Are you now afraid to do any of those 6 things? What if none of them really made the difference -- how can you tell?
If you can look at your page and see 100-plus occurrences of the same term, and they're almost all in your anchor text, then sure, "de-optimize" that. But don't make another change at the same time.
| 10:39 pm on May 15, 2007 (gmt 0)|
I have an interesting situation. I have a page that has the 950 penalty showing for ONE phrase on about 5 datacenters. It was on just 1 datacenter a few days ago so it appears to be spreading. This phrase appears ONCE in my title and only ONCE on the page (an H2 title heading).
I have no idea what is triggering this, but I am inclined to make no changes. Sometimes I view "de-optimization" as just another form of optimization.
| 10:52 pm on May 15, 2007 (gmt 0)|
>>This phrase appears ONCE in my title and only ONCE on the page (an H2 title heading).
That certainly isn't over-optimization. Is the page ranking for other phrases? And is the problem phrase by any chance an extension of a phrase that's OK, like word1-word2-word3 with word3 added on to word1-word2?
| 1:34 am on May 16, 2007 (gmt 0)|
|is the page ranking for other phrases? And is the problem phrase by any chance an extension of a phrase that's OK, like word1-word2-word3 with word3 added on to word1-word2? |
Yes, the page ranks very well for other competitive phrases. What I am seeing is actually the reverse of the scenario you asked about. In this case, the problem phrase is 2 words; longtail phrases that include the problem phrase are not penalized.
For example, word1-word2 is sent to the omitted results section (clicking omitted results brings it back to #1). Whereas, word1-word2-word3 ranks very well across the board.
This is the strangest thing I have ever seen in G. Again, I am only seeing it on 4 or 5 DCs. The only thing I can think of is that the phrase (word1-word2) is overused in IBLs, but I have no control over how people link to me, especially scrapers and people who link to me using the title of my site.
| 1:44 am on May 16, 2007 (gmt 0)|
|For example, word1-word2 is sent to position 700+ and actually included in the omitted results section (clicking omitted results brings it back to #1). |
That is totally weird. You mean for word1-word2 you're at 700+ for the phrase, and after you click to include omitted results that same page is at #1 for that very same phrase?
And are you seeing the same thing when your preferences are set at both 10 and 100 results per page?
| 1:56 am on May 16, 2007 (gmt 0)|
Marcia, I just looked at the search again, clicked on "Omitted Results," and I no longer see the page coming back at #1. But the fact that longer-tail phrases using the phrase still rank well seems odd to me, at least compared to other experiences and what I am reading here.
| 3:38 am on May 16, 2007 (gmt 0)|
I've had this happen as well. For the phrases (two, to be exact) that I ranked #1 and #3 for, every few days my ranking returns in the Canada and Australia data centers. It's as if the page wants to return, but a filter keeps pushing it away, and it never makes a complete comeback. The page I'm speaking of is the index page. I'm guilty now of making so many changes trying to get the page back that if it does come back, I won't know what caused it (if anything).
Since reading this thread I am sitting still and watching. Of the three hundred or so pages on the site, only three are still ranking, and those are pages with sites linking specifically to them. Many internal pages brought traffic to the site before the drop, usually via long-tail phrases. I believe two three-word phrases are causing the problem. Those phrases are in the meta title tag of the index page, and they have been scraped by many, along with my welcome paragraph, which is first on the page. The phrases also appear once in the welcome paragraph, and one internal link using them is on the page.
I took the phrases referred to above out of the meta title tag and renamed the links on the page. The problem was that the changes didn't help in Google, but the pages fell out of the SERPs in other search engines, so instead of losing most of my traffic, I lost all of it. I changed everything back a few days later, so we are back where we were. I just know that through this thread, or another like it, something is going to click that will help us all.
| 4:12 am on May 16, 2007 (gmt 0)|
I was under the impression that the specific problem is word1-word2 being 950'd and word1-word2-word3 working just fine. This is certainly how it works in our case, and how it was portrayed as occurring through most of this thread -- at least to my understanding.
Filtered/Penalized/Phrased Out: "Blue Widget"
Working Fine: "Big Blue Widgets"
Did I miss something where the consensus on this has changed?
| 6:14 am on May 16, 2007 (gmt 0)|
Albino: Yes, that is what I've seen. In the case I've struggled with, kw1 is -950 while kw1 + kw2 is OK. Although it has varied quite a bit, with different combinations popping in and out, the more specific terms have fared best.
| 1:54 pm on May 16, 2007 (gmt 0)|
You are correct. The consensus did not change. I am on a learning curve here and was posting my observations, which happen to be consistent with yours (word1-word2 penalized; longtail phrases that include word1-word2 are doing fine).
When I first started reading here, I was trying to understand why this is happening. It simply doesn't make sense to me. Basically, a page can be 950'd for "blue widgets" but not for "big blue widgets". What is the point?
[edited by: crobb305 at 2:05 pm (utc) on May 16, 2007]
| 1:56 pm on May 16, 2007 (gmt 0)|
ALbino, exactly what I see.
| 4:42 pm on May 16, 2007 (gmt 0)|
The reason "blue widgets" is penalized but not "big blue widgets" is the crux of this entire thing. Possibly "blue widgets" gets penalized for over-optimization or not enough IBLs with that as the anchor text, or several other reasons. The key point though is that "big blue widgets" is likely NOT penalized only because it's either A) not popular enough of a phrase to be affected by the phrase-based reshuffle or B) the other factors on your site are strong enough for you to rank for a more obscure keyword combo (PR for example). That might be why this seems to affect even established sites with high PR and high TrustRank, but not actually kill them all together.
Our site is living off "big blue widgets" right now, and "blue widgets" doesn't seem to be coming back any time soon.
| 5:13 pm on May 16, 2007 (gmt 0)|
> The reason "blue widgets" is penalized but not "big blue widgets" is the crux of this entire thing.
That's certainly what I suspect: it's the one- and two-keyword money terms (not three) that get filtered. As you may recall, the Florida update did this too; fortunately, it just didn't last as long!
| 5:17 pm on May 16, 2007 (gmt 0)|
|Basically, a page can be 950'd for "blue widgets" but not for "big blue widgets". |
This is one pattern people have been seeing but sometimes adding words to the search phrase doesn't help. Sometimes one word will do it but not another. So don't get too fixed on that exact pattern.
| 6:11 pm on May 16, 2007 (gmt 0)|
But like I said above, the phrase that's getting hit was used just once on the page and once in the title. That is what's most interesting about this (as it affects my page). The single on-page occurrence was in an H1 tag whose appearance was reformatted using CSS. I removed the H1 yesterday to see if it makes a difference.
| 8:02 pm on May 16, 2007 (gmt 0)|
crobb305, I see the same with my sites. They are not over-optimized (1 x keyword in title, 1 x in body), but index.html and all subfolders get hit with the 950 all the time.
| 8:18 pm on May 16, 2007 (gmt 0)|
Look at earlier 950 threads. I think it was Miamacs who said it took only one instance of a phrase to trigger the filter.
Here is what I'm thinking: the mere occurrence of a suspect phrase is not enough to send a page to 950 land, but the phrase brings a higher standard of what is allowed in terms of optimization. So it may be that reducing keyword repetition, or any number of de-optimizations, might keep the page from losing ranking (or get it back into good graces).
| 8:54 pm on May 16, 2007 (gmt 0)|
There's a difference between a penalty for a phrase and simply not ranking for the phrase, and for all practical purposes 950+ is no worse than ranking #60; neither one will pull traffic anyway.
One or two instances of a word (in a phrase or as a phrase extension) isn't enough to trigger a penalty. So: is there enough richness of vocabulary on the page or site to substantiate that the page (or site) is relevant for the phrase?
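As one crude way to put a number on "richness of vocabulary" (my own illustration; nothing here is a confirmed Google metric), a type-token ratio distinguishes varied writing from keyword-stuffed repetition:

```python
def vocabulary_richness(text):
    """Crude type-token ratio: distinct words / total words.
    Closer to 1.0 means more varied vocabulary; keyword-stuffed
    pages repeat the same terms and score lower."""
    words = [w.lower().strip(".,!?;:\"'()") for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    return len(set(words)) / len(words)
```

Real systems would be far more sophisticated (stemming, topic models, phrase co-occurrence), but even this toy measure separates a page that discusses a topic from one that merely repeats its money term.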
| 10:39 pm on May 16, 2007 (gmt 0)|
|So it may be that reducing repetition of a key word or any number of de-optimizations.. |
or even splitting the phrase words with in/of/and/etc or reversing the phrase
| 11:17 pm on May 16, 2007 (gmt 0)|
|is there enough richness of vocabulary on the page or site to substantiate that the page (or site) is relevant for the phrase? |
So are we back to taking the top 1000 results and then reparsing them using each other as what's now deemed "relevant" for a particular keyword combo and then displaying them in that new order?
| This 195 message thread spans 7 pages |