I have two URLs that were 950'd on Friday. Unfortunately they are my two most requested URLs (other than the home page) - they'd been #1 in the SERPs for at least three years, and now only show up on the last page for the most common search phrase. They both showed PR previously, but TBPR has been greyed out since the last update. However, in certain permutations of the search phrase, they still rank #1. The search string usually comprises the city name and the event, and often includes the year.
Example:
city event - 950'd
city event 2007 - 950'd
event city - #1
city state event - #1
city state event 2007 - #1
As far as I can tell, it is ONLY two URLs, out of around 500, that have fallen into this (so far, anyway).
What it all means, I have no idea.
Those sites that came back are higher than they used to be. In general, top 3 or 4 for search terms with a results pool of 100 million or less.
This might just be Google temporarily smiling on my endeavors, but the ranks have held for about 10 days. Keeping fingers crossed.
The changes: removed ALL cross-page nav links except one back to the top-level menu (think of it as an index). These were good on-subject links; no mixing of themes or topics.
Can you give us an idea how many links you had on a page before you removed these, and how many you had after?
Also, to break it down a different way, how many pages were you linking to on an average before you made the change, and how many after?
Can you give us an idea how many links you had on a page before you removed these, and how many you had after?
The linking scheme was an all-encompassing single level: each page linked to all the others. The "index" was on the same level as all other pages. The index "WIDGETS" pointed to ABOUT WIDGETS, WIDGETS FOR CHILDREN, MAPS TO WIDGET DEALERS, HOW WIDGETS ARE MADE, etc for about 15-20 titles.
Each page had a nav menu which listed all the other pages, using link text that contained the keyword 'widget' in about 66% of the links. As I said, the nav menu was used at least 2X on each page: a side nav bar, and a footer.
After removal, the only link on each subject page was to the index WIDGETS. That one page continued to hold links to all the rest, of course.
On average, I had about 20 internal links before, and only one after. I suspect it was not the raw number of links, but instead the saturation level of links: it was pretty much 100%, everything linked to everything else.
Lest someone think I was mixing off-topic links: I wasn't. There were no links to any pages that were not 100% dedicated to the WIDGET topic.
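To put a number on that saturation idea, here's a toy sketch (my own illustration with made-up page names, not anything Google publishes) that measures what fraction of the possible internal links actually exist:

```python
# Toy measure of internal-link "saturation": the fraction of possible
# page-to-page links that a nav scheme actually creates.

def link_saturation(links):
    """links maps each page to the set of pages it links to."""
    pages = set(links)
    possible = len(pages) * (len(pages) - 1)  # every page to every other page
    actual = sum(len(targets & (pages - {page})) for page, targets in links.items())
    return actual / possible if possible else 0.0

pages = ["index", "about", "children", "dealers", "how-made"]

# Before: every page linked to all the others -- saturation is 100%.
flat = {p: set(pages) - {p} for p in pages}
print(link_saturation(flat))    # 1.0

# After: each subject page links only back to the index.
pruned = {p: {"index"} for p in pages if p != "index"}
pruned["index"] = set(pages) - {"index"}
print(link_saturation(pruned))  # 0.4
```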
This 950 penalty, in my case, was very much centered on directories. It can easily target a specific directory while leaving the rest of the site alone.
Hope this helps a little.
Since then I changed the navigation (sketched below):
Silo structure, index points to main level categories
Each category points to sub category articles and home, but not across to other categories.
Fixed any url issues (there were not many)
Added new content slowly.
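To illustrate the silo rule (category and page names invented for the example), the nav on any page only links up and down its own branch:

```python
# Sketch of the silo rule: hypothetical two-category site.
SILOS = {
    "widgets": ["about-widgets", "widget-history"],
    "gadgets": ["about-gadgets", "gadget-care"],
}

def nav_links(page):
    """Links allowed on a page: up and down its own silo, never sideways."""
    if page == "home":
        return list(SILOS)                 # home -> each category index
    if page in SILOS:
        return ["home"] + SILOS[page]      # category -> home + its own articles
    for category, articles in SILOS.items():
        if page in articles:
            return ["home", category]      # article -> home + its category only
    raise ValueError("unknown page: " + page)

print(nav_links("about-widgets"))  # ['home', 'widgets'] -- no cross-category links
```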
The site is now up at position 30 consistently, which is not where it was previously, but it's workable.
My plan is to ask people to remove links to my site and do nothing to the actual content of the page. Hopefully it works.
Also, paring down internal linking so that each page doesn't have 20 keyword-heavy links comes under the heading of de-optimization, and over-optimization has been touted as a possible cause of this penalty.
I've split my old content over umpteen new domains, and heavily deoptimized it; smash it up and start again. The difference being, I know a lot more about SEO than I did when I started out years ago, so _hopefully_ recovery should occur in the long term.
I am still penalized, therefore take my advice with a pinch of salt.
And the two that are 950'd both used to be PR2, and are now showing page rank not available. Dunno if that means anything either. All I can do is throw stuff out there.
Disclaimer: this is unfounded and not backed by anything concrete, but I seriously suspect that there's some shifting and shuffling going on regarding what's in the main index and what's in the Supplemental index, and which pages get PR and which don't.
Also covered by the "disclaimer": it's starting to look more and more like the "partitioning" mentioned in the patents is worth a closer look.
Well, it's now July and we've experienced our first 950 penalty, and thanks to this forum topic I've learned so much about how everyone has been dealing with it. Thanks for all your posts. Coming from a n00b on this topic.
This is our experience with a semi-niche electronics e-commerce site. We noticed the drop in rankings on June 28th without knowing why. I saw a complete loss of Google visits in Analytics, then confirmed it with the ranking report the owner does each week. Originally I thought Google had changed the algo and went scouring the forums for similar experiences. Nothing... Then I found someone complaining on the Google Webmaster Groups about rankings and joined in the co-misery. Finally, over the weekend, I found out about 950 penalties and started reading this post. It's July 3rd and I finally got through it. Really... the continuation of this topic is amazing and important to any e-biz.
So finally, on Monday, we realized that our secondary, less-visited site, which is branded differently, was possibly getting penalized for duplicate content. We use the same e-commerce platform on different servers, with a similar product catalog and the same product write-ups. We've had this setup for almost two years with no problems. So we believe it's possible we were reported by one of our competitors.
The intent was not to spam. Everything we've done was white hat to a T. We do not over-SEO our site: we use title, description, alt, and header tags properly, with original content. This has kept us in the top rankings for years with no problems.
Business was good until... the 950 penalty!
We have now completely removed all listings of our secondary site via Google Webmaster Tools, robots.txt, and meta robots. The secondary site is removed from the index but still functional. We also resubmitted our site with a big apology and explanation.
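For anyone who needs to do the same, the de-listing side looked something like this (example.com stands in for our secondary domain). One caveat worth noting: a robots.txt block stops Googlebot from crawling, so it never sees a meta noindex on the blocked pages; the Webmaster Tools removal request is what actually pulls the listings.

```
# robots.txt at the root of the secondary site: block all crawling
User-agent: *
Disallow: /
```

```html
<!-- meta robots on each page (only seen by Googlebot if the page
     is NOT already blocked in robots.txt) -->
<meta name="robots" content="noindex, nofollow">
```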
The big question now is: how long before we regain our original listings? Has anyone experienced a similar sequence of events? If so, I would love to hear your story.
Thanks again to everyone attached to this topic. You've been a real help!
What are the key signs of a 950 penalty?
My site tanked to oblivion for a specific search term, but retains 'some' rankings for a few internal pages on different terms.
It feels like I've been slapped for the search term I was trying to target.
What you describe (retaining some rankings for other searches) is the common experience.
After a 301 redirect of the old domain to the new, rankings returned entirely a few days after the redirect was cached. Hopefully the same for you.
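For reference, assuming an Apache server (the domain name here is a placeholder), the old-to-new redirect can be one line in the old domain's .htaccess:

```
# .htaccess on the old domain (mod_alias): 301 everything to the
# new domain, preserving the rest of each URL's path
Redirect 301 / http://www.example-new.com/
```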
Here's some more data for anyone passing through.
The term I was trying to target the most was 'Widget Boots' (gonna get weird since I can't give away specifics here), which I can't find myself in the SERPs for anymore.
However, I DO rank for...
WidgetBoots
Monster Widget Boots
and a couple of others, I think.
The 'monster' part refers to something else I'm well known for, but obviously a much smaller search term (maybe 10 a month)
Notice cramming them together ranks me too... #4 I believe.
So, I'm sure it IS a penalty, or some sort of filter that I've triggered for said term.
I'll be honest, I was out there getting links with that anchor text for the search term: directory submissions, blogs, etc. However, nowhere near the quantity you'd think it would take... like your typical spammer. Which makes me think this filter can be tripped very easily, and it would do people some good to start focusing on their site instead of their links.
For changes: after the doo-doo hit the fan, I put nofollows on all the links to my utility pages: Contact, About Us, FAQ, Affiliate Program, etc. So far, nothing. Maybe I'll see something in a month.
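Concretely, the change was along these lines (paths are just placeholders):

```html
<!-- utility links after the change: each now carries rel="nofollow" -->
<a href="/contact.html" rel="nofollow">Contact</a>
<a href="/about-us.html" rel="nofollow">About Us</a>
<a href="/faq.html" rel="nofollow">FAQ</a>
<a href="/affiliates.html" rel="nofollow">Affiliate Program</a>
```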
I'm still curious about the affiliate program that my site offers. Usually the user uses an image (banner) link. Of course the banner has alt text attached to it, which also happens to contain my target term (actually... it's a full sentence). I wonder if the spam filters somehow see that as a problem, and 950'd me for the term I was after.
Those affiliate links have been up on other websites for quite some time, though.
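For what it's worth, the embed code every affiliate uses looks roughly like this (domain and wording invented), with the target phrase sitting in the alt text:

```html
<!-- hypothetical affiliate banner embed: the alt text is a full
     sentence containing the exact phrase I was targeting -->
<a href="http://www.example.com/?aff=1234">
  <img src="http://www.example.com/banners/widget-boots.gif"
       alt="Buy quality widget boots for any occasion at Example.com">
</a>
```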
Thoughts?
A page is listed within the last few dozen results for a query, usually directly next to several other results that would seem likely to merit top 20 ranks for that query.
If you aren't there, you don't have a 950 penalty, you have some other issue.
(An exception would be if you have two other pages from your domain ranking for a query instead of the "right" one, in which case the 950 result won't show since only two pages from a domain get ranked.)
It's the easiest penalty to see. Just look at the end of the serps.
It's the easiest penalty to see. Just look at the end of the serps.
I'm sure it strikes everyone differently, but this is what makes it so bizarre: sites just plunked at the total end of the line. It's a penalty for sure, but it's interesting that they make it so obvious; why not position 1001? It's like they want you, and all your friends, to know you have been put in time out.
You end up at the end of the SERPs for many phrases, usually the more competitive ones that you're optimized for.
Slightly different phrases you can still be #1 for. Very difficult to find any pattern or reason. Perhaps some kind of random factor is applied to drive SEOs mad and make them go find another profession.
You have to go to the last page and click on the "repeat the search with the omitted results included" line at the bottom.
Then it will probably be on page 10, although sometimes it will show on the first page of the "omitted" results.
I seem to find them in the "omitted" section when they are first hit, or if you've just had changes to the page cached. After a few days, it will probably show on the last page of normal results.
As I see it, the only way these sites in my area can beat the penalty is to reduce the number of times they use the keywords they target. What's unique in these areas, though, is that there are actually no keyword substitutes available in a thesaurus. I mean zero. For the five main keywords used in my area, a thesaurus will just point back to the other words. As an experiment I asked half a dozen people to suggest keywords or phrases to substitute; all referred back to the same keywords that Google is likely targeting for over-optimization. Personally, I learned years ago that with some subjects it was almost impossible to write articles that didn't have some highly repetitive keywords. Apparently, though, Google engineers weren't particularly concerned with the collateral damage something like this could cause.
Bottom line is this penalty is utterly ridiculous in some areas. It’s just driving authority sites and many domains that barely mention the keyword to the top.
no keyword substitutes available in a thesaurus
I've seen the same kind of pages, and I suspect the -950 trigger, at least in some cases, may have more to do with co-occurrence of terms than it has to do with synonyms. Co-occurrence measures what other words would naturally "co-occur" in documents that contain the search term. If almost no naturally co-occurrent terms are present, or conversely if too many are present, that could be a flag. It's a linguistic sign that the content is potentially unnatural. Scraped content and auto-generated content can both have this type of footprint. Stub pages can too.
As a purely theoretical example, a page about "doctor's office" might be expected to have at least some terms like "blood pressure", "nurse receptionist", "prescription pad" and whatever. The exact phrases and their frequency of appearance can be measured across a large number of documents in each case, and standard deviations can be calculated. In a large collection of documents, this calculation would only need to be run periodically, not continually.
The absence of all (or almost all) expected co-occurring terms might be used to flag suspected attempts to manipulate Google on one particular keyword phrase -- unnatural stuffing, in other words. Or perhaps such pages would not hold enough peripheral information to be widely useful to end users in a search result.
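As a back-of-the-envelope sketch of that idea (my own toy model; the expected-term list and the cutoffs are invented, and a real system would derive them statistically from a large corpus):

```python
# Toy co-occurrence check: does a page contain a "natural" share of the
# terms that usually appear alongside its topic phrase?
EXPECTED = {"blood pressure", "nurse", "receptionist", "prescription",
            "appointment", "waiting room"}  # invented list for "doctor's office"

def cooccurrence_share(text, expected=EXPECTED):
    """Fraction of expected co-occurring terms found in the text."""
    lowered = text.lower()
    return sum(term in lowered for term in expected) / len(expected)

def looks_unnatural(text, low=0.1, high=0.9):
    """Flag pages with almost none, or implausibly many, of the expected
    terms -- the cutoffs stand in for corpus-derived standard deviations."""
    share = cooccurrence_share(text)
    return share < low or share > high

stuffed_stub = "doctor's office " * 50  # keyword-stuffed stub page
print(cooccurrence_share(stuffed_stub))  # 0.0
print(looks_unnatural(stuffed_stub))     # True
```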
As I said, this is only my suspicion. But I do see the same kind of phenomenon that outland88 is describing. And I've also helped pages improve their rankings by losing the intense SEO focus on targeted keywords in the copywriting, and allowing the content to breathe a bit more naturally.
I haven't had to work on any -950 pages with this approach so far (knock on wood here). But I have helped pages jump +40 places or so by giving them this kind of attention, and that makes me suspect that co-occurrence is a factor in play in the algorithm today. In linguistic and semantic study -- as well as in IR (information retrieval), the granddaddy of search engine technology -- co-occurrence is not exactly cutting edge. It's been knocking around for quite a few years.
Thanks again for all the posts in helping us understand the monster we refer to as G.
[webmasterworld.com...]
Be sure to read the Forbes article linked to in that post.
In reviewing previous comments in the -950 threads, it sounds like these are some of the key reasons why people are seeing problems:
- Over-optimized title/meta tags.
- Over-optimized anchor text in internal links.
- Over-optimized anchor text in external links (I have seen sites get as little as 10 links and get a sitewide -950 penalty).
- Duplicate content across different domains.
Are people seeing other reasons? I believe that even a handful of external links using optimized anchor text can push your penalty points over the threshold for the -950 filter if you have a young/small website (the site I killed today was from 2005 with under 500 pages). To get below the threshold, either remove the external links, or de-optimize your site enough so that it doesn't look like you are spamming a particular keyword or phrase.
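To make my mental model explicit (pure speculation, not a documented algorithm; every signal, weight, and threshold below is invented for illustration), I picture something like points accumulating against a trust-dependent threshold:

```python
# Speculative toy model of "penalty points vs. threshold".

def penalty_points(optimized_title, optimized_internal_anchors,
                   optimized_external_links, duplicate_domains):
    points = 0
    points += 3 if optimized_title else 0
    points += 3 if optimized_internal_anchors else 0
    points += optimized_external_links   # even ~10 such links adds up fast
    points += 5 * duplicate_domains
    return points

THRESHOLD_SMALL_SITE = 12  # a young/small site has little trust to absorb points
THRESHOLD_LARGE_SITE = 30  # an established site can tolerate more

score = penalty_points(True, True, optimized_external_links=10, duplicate_domains=0)
print(score, score > THRESHOLD_SMALL_SITE)  # 16 True -> hypothetically filtered
print(score > THRESHOLD_LARGE_SITE)         # False -> a big site might escape
```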
errorsamac, were you working with links to the home page here?
Another thing: after the external links started to appear for this latest site, Google sent the whole site to -950. The gap between when the links started appearing and when the site was penalized was 6 to 12 hours. During that time, Googlebot did hit some of the pages of the site (including the main www.example.com page), but all pages on the domain (even ones Googlebot had not visited recently) were sent to -950.