It's total nonsense for me to worry about TBPR when I've been hit hard by a -950 penalty (look! my PR rose on almost all pages... but as I said, who cares?).
Would all -950ers please come join this thread so we can group possible causes.
Here are mine:
1) E-mail to the AdSense team about creating an account with my domain name
2) Too many AdSense boxes
3) Mildly over-optimized pages
4) Too similar titles
5) Some directory links (as almost all my competitors have, though)
I'll add that no big changes were made in recent months!
Join -950ers power :-)
[edited by: tedster at 9:08 pm (utc) on Feb. 27, 2008]
Title: "My Site Name - Small Widget"
Main Body Text:
Browse ¦ S ¦ Small Widget
Small Widget is made up of blah, blah, blah, blah
Small Widget with brown whozawhatzits from ThisCompany."
And that's it. I mean, there's a navbar at the bottom and some text info boxes, but nothing that in any way relates to the -950'd phrase "Small Widget" -- they're just site information. So is repeating a two-word combo four times on the page and once in the page title really strong enough to trigger a -950 penalty? I just can't imagine keyword repetition is the cause... at least in this case.
[edited by: ALbino at 4:45 am (utc) on Dec. 8, 2007]
This -950 critter just loves URLs with a lot of links. It eats the links and punishes the URLs, so I think fewer links per page is a better solution; it also means less text per page, which might reduce co-occurrence.
Also, the links are all what I'd call spammy. They look like this:
"Product - The best www.example.com site on the planet"
Nearly every link on the homepage includes the phrase "The best www.example.com site on the planet"; it's repeated in at least 60% of the links. You'd have to see it to believe how miserable it looks, but Google loves it. The site ranks #1-#3 for every keyword.
I noticed some seriously hit sites that had just that, and once I'd seen enough of them I even narrowed it down to a rough percentage threshold. Sure, there are other factors in the "illness," but that's the most compelling symptom I saw on a good number of sites.
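The "repeated in at least 60% of the links" observation can be approximated with a quick script. This is only a sketch of how one might measure the symptom; the anchor texts and phrase below are made-up stand-ins for the pattern described, not anything from a real site or from Google:

```python
def repeated_phrase_ratio(anchor_texts, phrase):
    """Fraction of links whose anchor text contains the given phrase."""
    if not anchor_texts:
        return 0.0
    hits = sum(1 for text in anchor_texts if phrase.lower() in text.lower())
    return hits / len(anchor_texts)

# Hypothetical homepage anchors, echoing the spammy pattern described above
anchors = [
    "Product A - The best www.example.com site on the planet",
    "Product B - The best www.example.com site on the planet",
    "Product C - The best www.example.com site on the planet",
    "About us",
    "Contact",
]

ratio = repeated_phrase_ratio(anchors, "The best www.example.com site on the planet")
print(f"{ratio:.0%} of links repeat the phrase")  # -> 60% of links repeat the phrase
```

Running something like this over your own navigation at least gives you a number to watch while you thin links, instead of eyeballing it.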
I removed a good chunk of our footer about 10 days ago, and noticed a handful of slightly competitive and semantically related terms show up in the top 10. Upon seeing this I took out the shotgun and completely removed the footer, and lo and behold everything is back!
What this tells me, which hopefully may apply to some of your situations as well:
1) Removing secondary navigation can help - this confirms what others have theorized regarding "link weight".
2) Changes to your rankings occur right after the next crawl of your site.
3) The filter is phrase-based - if you make a change and notice some pages coming back, those were likely on the edge of the filter and you just did something right.
Thanks so much to everyone on this forum for sharing your thoughts and ideas!
I completely removed the footer.
Did it have only links to internal pages? That would seem a bit drastic if so. I thinned mine, removing several internal links, including ones with synonym keywords or keywords repeated from the header navigation links, but I'm not removing all of them. Users should be able to find a link to the home page, if nothing else, in a footer, like this site and many/most others. I removed the keyword from the footer home link and replaced it with the neutral text "home" to avoid potential hassles from suspected "overoptimization."
P.S. I've suspected for a while the 950 lifts after regularly scheduled crawls, not reinclusion requests, etc.
Over the weekend, Google gave me the option to increase my crawl rate to "faster" - does this help at all with getting pages back? Is this a good thing? I chose it, of course, because Yahoo hosts my site and it's not my bandwidth Google is sucking :)
So if this is the case for others, then this may be a clue. I have a site with two separate and unrelated sets of keyword phrases, and this December's Google whack only affected one set of keywords; the other set still ranks great.
Anyone else see this?
I think many of us are targeting a very wide variety of two-word phrases that are only vaguely related, of which some are affected and some aren't.
Matt Cutts says the 950 is an "overoptimization" penalty. So you could assume that's the primary issue. Keywords and keyword phrases obviously are the traditional targets of optimization.
The conclusion of many here is that overoptimization of keywords/phrases in internal linking, i.e., navigation keyword stuffing/navigation spamming causes the 950 penalty.
If you target many related phrases, and thereby use repetition of the keywords or phrases in your internal links, you can certainly expect to get hit with the 950.
Although it must be noted that some sites doing that still haven't been squarely hit. That could change at any time, though, with the turn of a dial, or if Google finds some other questionable aspect of your site.
I was breaking the unwritten rule for months and then finally got the 950.
P.S. The only two- and three-word phrases that survived the 950, with few exceptions, are the ones for which I was #1.
I'd also add that different sets of semantically related phrases seem to have different thresholds - which may be why one set of phrases is 950'd and one set isn't. The threshold seems to be based on the competitiveness of the phrase (in my experience at least).
Good post. That would make sense b/c the most competitive phrases are most likely to get overoptimized.
I wish we had better insight into the phrase-based SPAM-detection patent.
Despite everything I've read here, I still have no idea which phrases Google will see as positive and which as negative; which are predicted to be found, and which will be seen as spam.
In the past, there was a keyword density issue we had to focus on. Now do we have to worry about keyword-phrase density?
Google, in recent months, seems to have divided its algo into at least two--one for single keywords and one for the rest.
Perhaps it will divide (or already is dividing) again, with one algo for keywords, one for keyword phrases, and one for all the rest.
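If keyword-phrase density ever matters the way keyword density once did, it could be measured roughly like this. Purely illustrative: the formula (phrase words matched per total words) and the sample text are my own invention, not anything Google has published:

```python
import re

def phrase_density(text, phrase):
    """Rough proxy: words consumed by exact phrase matches, per total words."""
    words = re.findall(r"[\w']+", text.lower())
    if not words:
        return 0.0
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase_words)
    return hits * n / len(words)

# Invented sample body, echoing the "Small Widget" example earlier in the thread
body = "small widget is made of parts. buy a small widget today. small widget info."
print(f"{phrase_density(body, 'small widget'):.1%}")  # -> 42.9%
```

Nobody knows what number, if any, would trip a filter; the point is only that a phrase-level density is just as easy to compute as the old single-keyword kind.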
It's always puzzling and interesting to read a lot of experiences here. I almost never see any SITES get 950ed anymore, so obviously keyword repetition in navigation has nothing to do with penalties to single pages, or several pages on a domain.
How about excessive number of occurrences of keywords in navigation links?
Especially when the target pages are very thin. What's the justification to an SE for many internal links full of keywords to virtually non-existent pages?
If nobody else backs the page (inbound links), it's unjustifiable and primed for a 950 hit.
From my count, I dropped for exactly 60 days. Hard to tell cause and effect, of course, but I felt mine was partly duplicate content (I had content stolen from me) and maybe over-optimization of the navigation (though I doubt it; it was never that bad).
Yesterday I was looking at the source code, and my base href tag (or whatever it's called) had been missing since our redesign. I put it back in yesterday and today I'm back - makes me wonder: if I had noticed it earlier, would I not have been gone 60 days? :/
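For anyone checking their own source, the base element mentioned above sits in the document head and looks like this (a minimal sketch; the URL and title are placeholders):

```html
<!-- Hypothetical <head>; https://www.example.com/ is a placeholder -->
<head>
  <base href="https://www.example.com/">
  <title>My Site Name - Small Widget</title>
</head>
```

Relative links on the page resolve against the base URL, so if the tag silently disappears in a redesign, every relative URL starts resolving against the current page's address instead, which can quietly change what crawlers see.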
On the plus side, it gave me a chance to get my PPC campaigns profitable so if it happens in the future, it's not a wipe out.
I put it back in yesterday and today I'm back
Do your logs show Google crawled your site in the last 24 hours? I'm not sure that issue/fix would lift the 950. Don't know that anyone else had a similar experience reported here. But fixes of all kinds are good for the Web.
Do your logs show Google crawled your site in the last 24 hours?
What's interesting is that, now, my WMT shows:
Pages from your site are included in Google's index. See Index stats.
Pages in your Sitemap(s) are included in Google's index. See Sitemaps overview.
Before, the second thing (pages from your sitemap....) wasn't there. It just showed up this morning.
<a href=...>winter's fury</a>
<a href=...>tiger's fury</a>
<a href=...>hell's fury</a>
<a href=...>fury of vengeance</a>
<a href=...>sound and fury</a>
<a href=...>full of fury</a>
Don't ask why. The site does have issues (it was built years ago, pre-SEO self-education) and does bounce occasionally for weeks/months on end.
Its PR5 may help push it back higher. Who knows. I'd noticed about a week ago, when the home page was #81, that one page had been 950'd.
The PR juice is mostly from inbound links, which have been known to lift 950 penalties, so that could help explain the comeback.
I was getting ready to fix its spammy internal links, etc., but now may not bother if it holds.
In other Google Search News, I just saw that an article site has been 950'd! Now let's hope this is the start of the long-overdue Google Article Link Farm Spam Elimination Campaign. Instant experts be gone! 8->
I'm wondering, therefore, whether, like many Google filters, the level of competition is a factor... and whether phrases with relatively low competition levels fall below the filter's threshold. A very sloppy measure of competition level on this particular set of phrases puts pages returned containing exact phrase matches at between 50K and 150K, so not super competitive.
Anyone else relating this apparent anchor text filter to the competition levels? I'm guessing it might kick in at higher page counts. Google counts have been so all over the place, though, that it's hard to trust the numbers.
1. The re-ranking is triggered by crossing a threshold.
2. The threshold can be different for different search terms.
3. The threshold can be different for different markets or website taxonomies.
4. The threshold is set by measuring and combining many different types of mark-up and grammatical factors, and not by absolutely measuring any one factor.
5. The threshold is NOT set absolutely across all web documents. So phrases in the travel space can be held to a different measure than, say, phrases in jewelry e-commerce.
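Points 1-5 above can be caricatured as a toy re-ranking model. Everything here is invented for illustration: the factor names, the simple averaging, the per-market thresholds, and the +950 demotion are all guesses at the shape of the mechanism, not its actual workings:

```python
# Speculative toy model of points 1-5: combine several 0..1 factor scores,
# compare against a per-market threshold, demote the page only if it's crossed.
MARKET_THRESHOLDS = {"travel": 0.6, "jewelry": 0.8}  # invented numbers

def combined_score(factors):
    """Average several hypothetical 'overoptimization' factor scores."""
    return sum(factors.values()) / len(factors)

def rerank_position(original_pos, factors, market):
    threshold = MARKET_THRESHOLDS.get(market, 0.7)  # invented default
    if combined_score(factors) > threshold:
        return original_pos + 950  # pushed to the end of the results
    return original_pos

factors = {"anchor_repetition": 0.9, "title_similarity": 0.8, "markup": 0.4}
print(rerank_position(3, factors, "travel"))   # -> 953
print(rerank_position(3, factors, "jewelry"))  # -> 3 (higher threshold, no demotion)
```

The same page score trips the filter in one "market" and not the other, which matches the observation that one set of a site's phrases can be 950'd while an unrelated set still ranks.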
As has been discussed, it's becoming increasingly important to anticipate these levels or triggers, whatever they might be.
[edited by: Robert_Charlton at 7:43 am (utc) on Dec. 18, 2007]
You don't know:
1. If/when you'll be 950d.
2. If/when you'll be un-950d.
3. If/when you'll remain un-950d.
All the while you have to guess what you've done wrong, what corrections will work. It gets really silly. How do you know that your old site will be back to normal by the time you've got a new domain and site up and running?
I'll keep the old one up, but if the Google Spam Gestapo refuses to explain what it's doing, how much more time does it expect me to waste p*ssing around, fiddling, revising, whatever.
What great plans does Google have for 2008? Ten new 950s? Who's the knucklehead who came up with the 950 in the first place? He or she will have to hope karma doesn't bite.
It is knuckleheaded because its implementation is incompetent.
"Let's penalize pages that deserve it" is not bad.
"Let's make a penalty that we totally suck at implementing" is bad.
Yeah, the unpredictable bouncing into and out of the 950 is making me lean towards a brand new domain and new site.
If it is, indeed, something onsite, then the problem is still there.
To answer your previous question about whether the logs show Google hit my site around the recovery on the 14th: WMT eventually showed that Google crawled my site on the 14th (the recovery date) and again on the 18th (the date of the most recent demise).