That's because we are shooting in the dark
We really aren't shooting in the dark. We are shooting at dusk. We can see a fuzzy image of what is out there. Sometimes when we shoot we hit the target, and other times we miss no matter how hard we try.
But we waste our time when we start shooting at theories like
- Google is doing this so more people will pay for AdWords
- Google only hits commercial sites
- If you have Google Analytics, you will be sorry
- Only sites doing something illegal are hit by -950
- It's because you have AdSense on your site
- Scraper sites are doing this to us
It goes on and on.
Is it because the phrase-based theories don't offer an easy answer? It does take a lot of work to figure out why you might have been 950ed, and sometimes you just can't find the answer. But I still believe that most 950ed pages have been caught in an imperfect phrase-based filter.
No, there is no clear indication of exactly what's happening, but there can be no doubt that there's more than just one thing happening. It's very unusual that ONE thing all by itself can have such profound effects on such a broad level, affecting so many different types of sites of all different sizes across so many verticals.
In particular, I wouldn't take a chance on monkeying around with outbound links unless it's removing the ones that go to bad neighborhoods or very inappropriate sites.
Try putting a nofollow attribute on your recip links page and wait a week to see if it has an effect.
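If you want to try that without hand-editing every link, here is a minimal sketch (Python with BeautifulSoup; the file name is just a placeholder for your own links page) that stamps rel="nofollow" on every anchor:

```python
# Sketch: add rel="nofollow" to every anchor on a reciprocal-links page.
# Requires beautifulsoup4; "links.html" is a placeholder for your own page.
from bs4 import BeautifulSoup

with open("links.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

for a in soup.find_all("a", href=True):
    a["rel"] = "nofollow"  # renders as <a href="..." rel="nofollow">

with open("links.html", "w", encoding="utf-8") as f:
    f.write(str(soup))
```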
90% of the sites that are in the -950 penalty either were involved in link exchanges or have recip links pages. That is the common factor.
If a site has been penalized (site, not page) for being link spammers, it's an entirely different issue - and it's a small minority of sites in the final results page, and they should have been caught long ago. Maybe they were, but nobody else noticed them MIA for what they were ranking for.
I challenge anyone to show us Walmart's reciprocal links page. Or Target's. Or JCPenney's. Or Amazon's link page. Or the Google Directory's link page. Or Bizrate, or Priceline, or Ebay, or the dozens of independent sites that have ONLY one-way inbound links and do not link out at all, have tons of other pages ranking well, including many top ten or #1 rankings, and yet have only SOME of their pages in the 400's and 900's. Redundant pages that should be clustered out of top rankings.
And the top ten results of a HUGE number of searches across multiple verticals most certainly DO have reciprocal links - many of them. And bought links too, but that's another issue. ;)
I would seriously suggest reading up on what those papers have to say about the primary and supplemental index and how they're being partitioned. There might just be a grain of truth in there about what's actually happening, since the infrastructure of the secondary index did, in fact, undergo a change, and so did the procedures for indexing.
Check into things further before cheating legitimate, on-topic link partners and removing quality outbound links, and before risking the loss of those quality recips that were helping the site rank well, once those partners find out they've been cheated. Then dig into your pockets and go find your friendly neighborhood link broker.
The latest change before that was in mid-March, but with an updated cache every 5 days, it seems unlikely that those changes would be the reason -- I expect it to go back to 950, but if it stays out even past the next cache, then that seems to be the tough choice for webmasters.
I'm thinking that too. Digging back into that patent: the "significantly exceeding" score for throwing a page into the spam_table must either have been tweaked to allow for more content (it is possible that my changes just happened to get spidered right after we made them, but I doubt it) OR the co-occurrence levels changed with the exclusion of some stupid documents that probably shouldn't be in the localset.
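For anyone who hasn't read the phrase-based indexing patents: the idea, as I read it, is that a page gets flagged when its count of related phrases "significantly exceeds" what the statistical model expects for honest documents. A toy sketch, where every name and number is invented by me purely to illustrate the threshold test, not anything Google has published:

```python
# Toy illustration of the patents' "significantly exceeds" test. NOT Google's
# code: the function name, the multiplier and the counts are all invented.

def is_spam_candidate(related_phrase_count: int,
                      expected_count: float,
                      multiplier: float = 5.0) -> bool:
    """Flag a page whose related-phrase count far exceeds the expected value."""
    return related_phrase_count > expected_count * multiplier

# If the model expects ~8 related phrases for an honest page on this topic,
# a page stuffed with 60 of them gets thrown into the (hypothetical) spam table.
print(is_spam_candidate(60, expected_count=8.0))  # True
print(is_spam_candidate(10, expected_count=8.0))  # False
```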
So, how do you "de-optimize" thousands of articles?
I don't even know what to do (and adding content didn't change anything).
Other than a snarky suggestion of Ctrl+F, Ctrl+R... that is an incredibly tough task that I don't wish on anybody. If de-optimization were in fact the cure to this crazy situation, then it'd either have to be done on an article-by-article basis (starting, obviously, with the ones you view as most important), or you could hit just a few and let the rest age (hopefully attracting some more relevant links to what I'd assume is well-written content that naturally attracts links).
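If I had to tackle thousands of articles, I'd at least triage mechanically first. A rough sketch (the path, the stopword list and the 5% cutoff are all placeholders of mine) that ranks files by how top-heavy their single most repeated term is, so you know which ones to hand-edit first:

```python
# Sketch: rank article files by the density of their single most repeated word,
# to decide which ones to hand-edit first. Path, stopwords and the 5% cutoff
# are placeholders, not anything Google has published.
import glob
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is",
             "for", "on", "it", "that", "with", "as", "are", "at", "by"}

results = []
for path in glob.glob("articles/*.html"):
    with open(path, encoding="utf-8") as f:
        text = re.sub(r"<[^>]+>", " ", f.read())  # crude tag strip
    words = [w for w in re.findall(r"[a-z]+", text.lower())
             if w not in STOPWORDS]
    if not words:
        continue
    term, count = Counter(words).most_common(1)[0]
    results.append((count / len(words), term, path))

for density, term, path in sorted(results, reverse=True):
    if density > 0.05:  # arbitrary cutoff: one term is over 5% of all words
        print(f"{path}: '{term}' is {density:.1%} of the text")
```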
Many of the accounts of successful penalty removals also describe many actions taken at once. Some of those actions sound to me like they make the site worse for the visitor. And even though the -950 penalty went away in a few days, as steveb mentioned, Google may well have made the critical change, not you. Google is always tweaking away at their very complex algo.
Suppose you make 6 kinds of changes at once, and then the penalty vanishes -- but thousands of other sites also saw their penalties vanish in that same Google data set. Did you really help yourself? And if so, which of the 6 steps did it? Are you now afraid to do any of those 6 things? What if none of them really made the difference -- how can you tell?
If you can look at your page and see 100-plus occurrences of the same term, and they're almost all in your anchor text -- well, sure, "de-optimize" that. But don't make another change at the same time.
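A quick way to check that anchor-text concentration on a single page (BeautifulSoup again; the file name, the term and the 80% cutoff are placeholders of mine):

```python
# Sketch: measure what share of a term's occurrences sit inside anchor text.
# "page.html", "widgets" and the 80% cutoff are placeholders for your own page.
from bs4 import BeautifulSoup

with open("page.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

term = "widgets"
total = soup.get_text(" ").lower().count(term)
in_anchors = sum(a.get_text(" ").lower().count(term)
                 for a in soup.find_all("a"))

print(f"'{term}': {total} occurrences, {in_anchors} in anchor text")
if total > 100 and in_anchors > 0.8 * total:
    print("Heavily anchor-concentrated: a de-optimization candidate")
```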
I have no idea what is triggering this, but I am inclined to make no changes. Sometimes I view "de-optimization" as just another form of optimization.
That certainly isn't over-optimization. Is the page ranking for other phrases? And is the problem phrase by any chance an extension of a phrase that's OK, like word1-word2-word3 with word3 added on to word1-word2?
is the page ranking for other phrases? And is the problem phrase by any chance an extension of a phrase that's OK, like word1-word2-word3 with word3 added on to word1-word2?
Yes, the page ranks very well for other competitive phrases. What I am seeing is actually the reverse of the scenario you asked about. In this case, the problem phrase is 2 words; longtail phrases that include the problem phrase are not penalized.
For example, word1-word2 is sent to the omitted results section (clicking omitted results brings it back to #1). Whereas, word1-word2-word3 ranks very well across the board.
This is the strangest thing I have ever seen in G. Again, I am only seeing it on 4 or 5 DCs. The only thing I can think of is that the phrase (word1-word2) is overused in IBLs, but I have no control over how people link to me, especially scrapers and people who link to me using the title of my site.
For example, word1-word2 is sent to position 700+ and is actually included in the omitted results section (clicking omitted results brings it back to #1).
And are you seeing the same thing when your preferences are set at both 10 and 100 results per page?
Since reading this thread I am sitting still and watching. Of the three hundred or so pages on the site, only three are still ranking, and those are pages with sites linking specifically to those pages. Many internal pages brought traffic to the site before the drop, usually via long-tail phrases. I believe two three-word phrases are causing the problem. Those phrases are in the meta title tag of the index page, and they have been scraped by many, along with my welcome paragraph, which is first on the page. The phrases are also included once in the welcome paragraph, and one internal link containing them is on the page.
I took the phrases referred to above out of the meta title tag and renamed the links on the page. Problem was, the changes didn't help in Google, but the pages fell out of the SERPs in other search engines, so instead of losing most of my traffic, I lost all of my traffic. I changed everything back a few days later. So we are back to where we were. I just know that through this thread or another like it, something is going to click that will help us all.
Filtered/Penalized/Phrased Out: "Blue Widget"
Working Fine: "Big Blue Widgets"
Did I miss something where the consensus on this has changed?
You are correct. The consensus did not change. I am on a learning curve here, and was posting my observations that happen to be consistent with yours (word1-word2 penalized; longtail phrases that include word1-word2 are doing fine).
When I first started reading here, I was trying to understand why this was happening. It simply doesn't make sense to me. Basically, a page can be 950'd for "blue widgets" but not for "big blue widgets". What is the point?
Our site is living off "big blue widgets" right now, and "blue widgets" doesn't seem to be coming back any time soon.
That's certainly what I suspect - it's the one- and two-keyword money terms (not three) that get filtered. As you may recall, the Florida update did this too - fortunately, it just didn't last as long!
Basically, a page can be 950'd for "blue widgets" but not for "big blue widgets".
This is one pattern people have been seeing but sometimes adding words to the search phrase doesn't help. Sometimes one word will do it but not another. So don't get too fixed on that exact pattern.
Here is what I'm thinking. Just the occurrence of a suspect phrase is not enough to send a page to 950 land. But the phrase brings about a higher standard of what is allowed in terms of optimization. So it may be that reducing repetition of a key word or any number of de-optimizations might keep the page from losing ranking. (or get it back in good graces)
One or two instances of a word (in a phrase or as a phrase extension) isn't enough to trigger a penalty, SO - is there enough richness of vocabulary on the page or site to substantiate that the page (or site) is relevant for the phrase?
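One crude way to put a number on "richness of vocabulary" is a type-token ratio: unique words divided by total words. A toy sketch, where the sample strings and the interpretation are mine, purely for illustration:

```python
# Sketch: type-token ratio as a crude proxy for vocabulary richness.
# The sample strings and thresholds are invented, purely for illustration.
import re

def type_token_ratio(text: str) -> float:
    words = re.findall(r"[a-z]+", text.lower())
    return len(set(words)) / len(words) if words else 0.0

stuffed = "blue widgets blue widgets cheap blue widgets buy blue widgets"
natural = ("our handmade widgets come in cobalt, navy and teal, with a "
           "lifetime warranty and free shipping on every order")

print(f"stuffed page: {type_token_ratio(stuffed):.2f}")  # around 0.40
print(f"natural page: {type_token_ratio(natural):.2f}")  # around 0.95
```

A page that leans on one money term over and over scores low, while a page with genuinely varied wording scores high, which is roughly the "richness" question being asked above.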