
Google SEO News and Discussion Forum

The New 950 Penalty?
bloard
msg:4508156 - 3:14 pm on Oct 15, 2012 (gmt 0)

I have posted within a few other topics that there seems to be an increased use of the "end of results" penalty by Google in recent months.

I had a couple of sites that suffered from the 950 several years ago, when it was a hot topic around here. Eventually they recovered, I think due to receiving a few high-quality links.

It looks to me like a lot of the recently hit EMDs are at the end of results, but so are a lot of other sites that appear to be of some quality, with decent content and a pretty good link profile. I'm seeing obvious spam sites down there as well.

I have two sites presently at the end of results for a bunch of keywords. This pre-dated the EMD update. I have spent a lot of time over the past few months studying what these pages have in common. I can best describe it as follows:

If a site would ordinarily rank in roughly the 40th position or better for a given search, then results that would have appeared within the top 40 are sent to the end of results.

This also creates a false impression that pages which aren't at 950 are stronger than those that are. In fact, the pages at the end of the results are the strongest; the penalty overlay just sends them to the back.
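
To make that observed pattern concrete, here is a minimal Python sketch of the demotion rule as described in this thread. The URLs, the flagged set, and the threshold of roughly 40 are hypothetical illustrations of the observation, not anything Google has confirmed.

# Minimal sketch of the demotion pattern described above (hypothetical data).
# Flagged pages that would otherwise rank inside roughly the top 40 get moved
# to the very end of the result set; everything else keeps its preliminary order.
def apply_end_of_results(results, flagged, threshold=40):
    """results: list of URLs in preliminary rank order (best first).
    flagged: set of URLs the filter applies to."""
    kept, demoted = [], []
    for rank, url in enumerate(results, start=1):
        if url in flagged and rank <= threshold:
            demoted.append(url)  # sent to the back of the room
        else:
            kept.append(url)
    return kept + demoted

# Hypothetical example: a flagged page that would have ranked 12th
# ends up at the very end of the visible results.
serp = ["site%d.example.com" % n for n in range(1, 101)]
reordered = apply_end_of_results(serp, flagged={"site12.example.com"})
print(reordered.index("site12.example.com") + 1)  # prints 100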

I would like to again begin a discussion of what these sites have in common, or what options exist for recovery.

For those of you new to this penalty... set your search preferences to show 100 results. Then go to the last page of displayed results (page 8, 9 or 10 usually) and look at the pages that are there.

I am starting to believe that this is more of a catch-all penalty for sites that Google just doesn't want to rank, even though the algo would otherwise rank them. One example: for several years, Google Books results have sat at the end of results in my niche... they probably rank well per the algo, given the authority of the Google domain, but they aren't at all what the user is looking for.

 

bwnbwn
msg:4508208 - 5:54 pm on Oct 15, 2012 (gmt 0)

A 950 is a death sentence and you're better off moving on.

conroy
msg:4508210 - 6:03 pm on Oct 15, 2012 (gmt 0)

Yes, I've got some sites in this exact position. One ranked at the top for nearly 10 years. It's still the best site in its area. I don't know what the problem is: there's no manual penalty on it, and it wasn't touched by Panda.

MrSavage
msg:4508211 - 6:11 pm on Oct 15, 2012 (gmt 0)

I'll be completely honest here, without shame of sounding like a total newb (which I am not). Until last week I never once considered going to the end of the results to see what's there. Little did I know that's where a couple of my sites have ended up for the terms (oddly enough) that I was ranking for in the past. Needless to say, this enlightenment has proved valuable to me. It offers some clarity on what to do next: dumb down the SEO on that site and/or submit a reconsideration request in the future. I still don't know whether a site that's there MUST have a reconsideration request filed to bounce back, or whether, after some work, the site can return to where it once was (or at least close).

If this is indeed a death sentence, then please back that up with something. I personally don't give up on anything unless we're talking about an absolute pile of S that I put no effort into. If people are defeatist about this situation, then explain what affords you the expertise to get us to buy into dumping our work and moving on. In general, people who fail at something say it can't be done, while people who succeed say anything is possible. I'm in the second camp.

gehrlekrona
msg:4508219 - 6:38 pm on Oct 15, 2012 (gmt 0)

My main site, not an EMD, was hit at around the same time with a 950 penalty. My best guess is that it's some combination of a duplicate content penalty, a link penalty and an over-optimization penalty. I have been trying to get out of it, but so far nothing has helped, so I am looking at anchor text, page duplication, thin content and so on... I requested reconsideration, but they say I have no manual penalty, so it's got to be an on-page/on-site issue.

bwnbwn
msg:4508221 - 6:47 pm on Oct 15, 2012 (gmt 0)

The 950 has been around since 2006-7, maybe before. Years ago there was a great deal of discussion about whether it was a filter or a penalty. If you have the time, by all means go for it. It's usually either an over-optimized site or one the algo has tagged as spam. I posted that I would move on because it would more than likely take more time than I could afford to work out of this one.

MrSavage
msg:4508223 - 6:54 pm on Oct 15, 2012 (gmt 0)

Fair enough. I've probably had a site sitting there for the better part of a year without realizing it. I've done a lot to this site lately, so I'll report here if there is movement on that one. Call it my test site!

bwnbwn
msg:4508228 - 7:03 pm on Oct 15, 2012 (gmt 0)

I can give some insight into this from 5-7 years ago; it has been so long that I really don't remember the exact timing. I had a site hit with this, and it was a manual review that got the site knocked. I am 100% sure of this.
It took 2 years to come out of it back then. With the new changes in Google, I'm afraid to guess how long it would take now.
MrSavage, it would be a very good test to keep tabs on. All the new algo changes might just speed up the recovery; I really have no idea, I'm just reporting how long it took me to get out from under it.
Back then the site went from 950 to 60 to 30 and then out.

lucy24
msg:4508230 - 7:11 pm on Oct 15, 2012 (gmt 0)

OK, let me ask the real newbie question:

950? What kind of petty neener-neenerness is that? If they're mad at you, why don't they simply kick you right out of the top 1000 and promote someone from the other 999,000?

tedster
msg:4508240 - 7:36 pm on Oct 15, 2012 (gmt 0)

The so-called "minus 950" may not be a subtraction at all - not like a minus 30, for instance. Rather it may be a multiplier that is less than 1, applied to the original rank calculations. As such, it is merely a re-ranking mechanism that can be applied in many cases rather than just one "penalty".

This kind of re-ranking is a method that Google has mentioned in patents going back to at least 2003. It takes a preliminary set of results and re-ranks just that subset (1,000 or fewer) of all the SERP candidates, either by adding/subtracting a set value or by applying a multiplier (see New Google Patent - About reranking results [webmasterworld.com]).

In the early days of the -950 phenomenon, many saw their URLs drop to the very end of the results, while others noticed much smaller drops - though still enough to effectively remove all traffic.

This is a very useful method for Google and I assume it's matured somewhat. Here's how I see this mechanism. Many common queries already have an established set of candidates for the final SERP. This is done to speed the results to the final SERP, since computing everything "on the fly" would take a lot of overhead. I'd bet that any query that appears in Google Suggestions, for instance, already has prefigured ranking candidates and their preliminary scores all neatly stored away somewhere.

-----

Now those candidates get re-ranked for the final SERP, and this is where the subtraction/addition/multiplication of the preliminary ranking happens... at the last minute. By doing things this way, the original computational work of combining all 200-plus factors is not lost or obscured; it just gets modified right before the results are served.

Now when one of these re-ranking factors gets changed, it can be removed altogether, or it can remain in place but be made more or less severe. This could clearly be done by an algorithm or by a manual action.

So what we have, IMO, is not a "new -950 penalty" per se but an old re-ranking method being applied in many contexts, some of them rather new. The re-ranking factor can be removed all at once or modified gradually. So it's certainly not necessary for a URL to climb back up the ranking positions one step at a time.

There may even be many re-ranking factors in place for a single URL. Adding and multiplying are computationally very fast, so folding in a bunch of last-minute re-ranking would not be a big deal. And for very high-volume queries, even the re-ranked results could be cached and recalculated on a schedule (hourly or whatever) to save even more computation overhead.
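
A rough Python sketch of the kind of last-minute re-ranking described above. The scores, factor values and URL names are entirely hypothetical, and this assumes nothing about how Google actually implements it; the point is only that cheap offsets or multipliers can be layered onto cached preliminary scores, and that removing a factor restores the old order instantly.

# Rough sketch of last-minute re-ranking (all data hypothetical).
def rerank(preliminary_scores, adjustments):
    """preliminary_scores: {url: score} from the expensive ranking pass.
    adjustments: {url: [(multiplier, offset), ...]} re-ranking factors."""
    final = {}
    for url, score in preliminary_scores.items():
        for multiplier, offset in adjustments.get(url, []):
            score = score * multiplier + offset
        final[url] = score
    return sorted(final, key=final.get, reverse=True)  # highest adjusted score first

# Hypothetical example: a 0.1 multiplier drops a strong page to the bottom
# of the candidate set without discarding its preliminary score.
prelim = {"strong-page.example": 9.2, "average-page.example": 5.1, "weak-page.example": 2.3}
demotions = {"strong-page.example": [(0.1, 0.0)]}
print(rerank(prelim, demotions))  # strong page now ranks last
print(rerank(prelim, {}))         # remove the factor and it is back on top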

-----

My main point is this. No matter how low you are currently ranking, it is not a cause for hopelessness - especially if you used to rank well. The fact that your URL is still in the result set anywhere at all is a positive sign in a very negative situation.

bloard
msg:4508248 - 7:52 pm on Oct 15, 2012 (gmt 0)

I very much agree with your point that being at the end of the result set isn't "all bad". In fact, my observation - that for targeted keywords on my sites none of those pages can rank better than 40ish without being sent to the back of the room, no matter how specific the query - is itself a sign of hope. My highest-ranking pages are at the end of the results. If the re-ranking overlay can be improved, there is light at the end of the tunnel.

I've watched keywords climb from the 60s to the 50s, and right when they get to the verge of breaking into the top 40... they drop to the end. That tells me that without the re-ranking they would be somewhere near the top.

SevenCubed
msg:4508256 - 8:14 pm on Oct 15, 2012 (gmt 0)

My main point is this. No matter how low you are currently ranking, it is not a cause for hopelessness - especially if you used to rank well. The fact that your URL is still in the result set anywhere at all is a positive sign in a very negative situation.


I have to agree with that. I've said before that for my own personal site I don't want it ranking high, because I wouldn't be able to handle the added attention. Seriously, it's been over a year since I checked the results for my site's over-optimized term. So for the sake of this thread I caved in and had a look. For a primary, highly competitive term that the majority of my 500+ competitors want to rank for, I'm at position -450.

Yet if Google considered my site to be poison overall, they probably wouldn't rank me on page 1 for 3 secondary terms that are as highly competitive as the one for which I'm pushed down. Those 3 terms are not over-optimized.

I have to agree with tedster on this one. Don't look at it as being a hopeless case. I'm sure it can be recovered. In my case I simply don't want to recover it or have it on page one for the suppressed search term.

mhansen
msg:4508286 - 9:19 pm on Oct 15, 2012 (gmt 0)

The fact that your URL is still in the result set anywhere at all is a positive sign in a very negative situation.


Don't look at it as being a hopeless case. I'm sure it can be recovered.


We have a site that was moderately impacted by the April 24th Penguin visit and has shown no signs of coming back to life for those same queries as of yet. We see the same -950, or end-of-results, effect on certain sections of the site.

Background: Site (180 pages total, several years old) still gets OK G* traffic, just not the "core pages" (aka main menu). Adjusting for seasonality, we're down about 35-40% from the April 24th effect, year over year. We had 5-7 main sections (core pages) that lost ranking during the update in April.

User metrics for this site are very good in my opinion, and once we get the visitors to the site, they seem to stick around and find the info they were looking for.

- 3.73 pages per visit
- 3.16 Average time on site
- 39.29% bounce rate
- 28.80% returning visits
- +750,000 pageviews in 2011 (added for scope)

The pages that were hit the hardest - also the pages that were our most linked, most visited, etc. - are generally found on the last page of results, after you click the link that says "in order to show the most relevant results, we have omitted some of the results... click here for all results".

The site is not heavily backlinked, and a majority of the backlinks are from scraper info-sites like eHow, DIYer sites, and similar how-to sites. I feel our biggest opportunity is in higher-value backlinks, but building authority backlinks is also the weakest part of our business, and truthfully we're kind of gun-shy right now - being kicked in the teeth leaves an impression.

We've done quite a lot to try and recover those pages (onsite changes, eliminated what we considered low quality backlinks), so it's no shock to me that we may have tripped other filters, and those core pages that used to drive traffic are still back in the last page results.

In a related observation, other pages on our site DO rank for the same phrases our older content does NOT rank for any longer. Different pages can be found in the top 30.

Any time we publish new content, it can be found in the G* index on our sitemap page for 2-3 days before the actual page ever shows in the SERPs.

We've not given up yet, but it's hard to devote resources to the sections of our site that made it so popular (1 million+ visits in 2011) yet no longer deliver anything.

MH

SevenCubed
msg:4508288 - 9:28 pm on Oct 15, 2012 (gmt 0)

My site's -450 is due to onsite over-optimization -- I have very few backlinks, but the few that I do have are also anchored with the over-optimized "keyword1 keyword2".

Donna
msg:4508293 - 9:51 pm on Oct 15, 2012 (gmt 0)

I had one site at -950 for a long time until I de-optimized the on-page SEO; it recovered in less than a month afterwards.

conroy
msg:4508317 - 11:13 pm on Oct 15, 2012 (gmt 0)

Donna, that's helpful, thank you.

I think a lot of SEO people, especially those of us who have been at this for a long time, are having a hard time recognizing on-page over-optimization. A lot of the normal on-page SEO things we did for so long were the "correct" things to do, and it's hard to break a 10+ year-old way of thinking.

I recently heavily deoptimized one -950 site and we'll see what happens.

Did you deoptimize the entire site or just the home page or what percent of pages?

ronalds8
msg:4508327 - 12:33 am on Oct 16, 2012 (gmt 0)

+1 to Donna, it is possible to recover!

tedster
msg:4508337 - 3:01 am on Oct 16, 2012 (gmt 0)

In a related observation, other pages on our site DO rank for the same phrases our older content does NOT rank for any longer. Different pages can be found in the top 30.

Thanks for that observation, @mhansen. So at least in your case, the problem is page-specific and not keyword-specific.

Do you think the demoted URLs being a "main menu" page is part of the issue? If so, does your main menu use text links that are loaded with strong keywords? Also, how many links are part of the menu?

SevenCubed
msg:4508338 - 3:14 am on Oct 16, 2012 (gmt 0)

There's a chance we might even be able to break this into 2 parts. Maybe you get -450 for onsite over-optimization and another -450 for offsite over-optimization. If you are overextended in both cases, you go wheeeeeeeeee all the way to the end of the line?

tedster
msg:4508353 - 4:42 am on Oct 16, 2012 (gmt 0)

I'd say there are probably many potential components and the amounts of demotion probably fall across a wide spectrum, not just two pieces.

snorkeler
msg:4508371 - 7:09 am on Oct 16, 2012 (gmt 0)

This happened to us last year, but we didn't have a name for it and had no clue what to look for, so we had to start at the beginning. It turned out to be duplicate content (bad coding) and took 3 months to recover from.

We got hit again this week, and the timing could not be worse. We haven't lost traffic, thanks to long-tail searches, but we lost rank on the top keywords where we were at the top. It could be over-optimization, duplicate content or anchor text? Who knows - but we're more experienced at this than last year.

mhansen
msg:4508413 - 12:16 pm on Oct 16, 2012 (gmt 0)

Do you think the demoted URLs being a "main menu" page is part of the issue? If so, does your main menu use text links that are loaded with strong keywords? Also, how many links are part of the menu?


We did think that in the beginning Ted.

We have two top-level, hierarchical, text-based menus on the site: the "main" menu, stretching horizontally across the top of the content (between the header section and the content body), and a second "sidebar menu" (just our name for it) near the bottom of the sidebar. The two menus are similar, but not identical, and are rarely on the same screen together.

The main menu is 8 links total, pointing to the main site sections. The original text links were along the lines of: home, widget types, widget brands, widget buying guides, widget rebates and special offers, widget reviews, widget FAQ, etc.

The sidebar menu contains links to those main sections, as well as to additional support sections of the site: our forum, customer support pages, office location information, etc.

Over-optimization was the first thing we tackled in April. We made several changes to the menus first, then approached it at the page level, looking for ways to scale back the content optimization without hurting the user experience.

On the menus, we removed the repeated occurrences of "widgets", on the basis that the site is 100% about those widgets and "widgets" is part of the domain name. Thus we went from "widget brands" to just "brands", "widget types" to just "types", etc. However, we did keep the link title attributes at the full name (widget brands), unchanged from the original.

We also changed the URLs to remove the repeated occurrences of "widgets". Originally we 301'd all the old URLs to the new pages, and after we reached out and asked webmasters to update their links, we eventually dropped the 301s completely. We monitored the logs (and Webmaster Tools) for 404s and, when we spotted them, updated those links immediately.
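
For anyone doing a similar URL cleanup, here is a small sketch of how the redirect/404 monitoring step might be automated. The old/new URL mapping is hypothetical (this is not mhansen's actual tooling), and it assumes the third-party requests package is installed.

# Sketch of automated redirect/404 monitoring during a URL cleanup (hypothetical URLs).
import requests

url_map = {
    "https://www.example.com/widget-brands": "https://www.example.com/brands",
    "https://www.example.com/widget-types": "https://www.example.com/types",
}

for old_url, new_url in url_map.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    if resp.status_code == 301 and resp.headers.get("Location") == new_url:
        print("OK   %s -> %s" % (old_url, new_url))
    elif resp.status_code == 404:
        print("404  %s (update inbound links or restore the 301)" % old_url)
    else:
        print("WARN %s returned %s" % (old_url, resp.status_code))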

A few months after we made the majority of changes to the site, you (Tedster) posted about the G* spam detections and rank modifying patent (link [webmasterworld.com]), and we think we may have been a great candidate to prove that theory for G*. :/

Other Notes -

The pages that are penalized to the end of the results still have PageRank, and the PR still changes with toolbar updates.

In alternative custom search engines like Foxstart (sites that use G* custom search), the pages rank in the top positions, similar to their original rankings, and the snippets reflect the current changes. These are the same pages that are affected by the -950 in G*.

Another observation: new content that was published just yesterday ranked almost instantly in Foxstart, whereas it's still not found in Google's main engine (a partial title search in G* shows the sitemap page, not the actual page with that title).

In mid-May, after cleaning up what we felt **might** be hurting the site, we submitted a detailed reconsideration request to Google and were told that there were no manual penalties against the site.

MH

lzr0
msg:4508880 - 1:47 pm on Oct 17, 2012 (gmt 0)

I had one site at -950 for a long time until I de-optimized the on-page SEO; it recovered in less than a month afterwards.


Hi Donna,
Would you mind providing some details? What exactly did you de-optimize (keywords in the title, density, etc.)?
Thanks

Gumtee
msg:4509221 - 8:33 am on Oct 18, 2012 (gmt 0)

Thanks to this thread I think I'm closer to knowing what happened to one of my sites... After a rebuild on a more modern Magento e-commerce system, the site dropped from 1st to about 680 out of 700 for its main keywords. The site sells only one type of curtain, so most of the text and links refer to these 2 keywords, e.g. folding curtain and folding curtains (direct translation from Dutch).

In the last 6 months I have de-optimized, improved speed, fixed duplicate meta tags (the whole Webmaster Tools tour), removed a hidden-text help system (the user clicks on a question and sees the answer), etc., to no avail. I requested reconsideration and got the standard message that the site is not being manually penalized... It's just a standard e-commerce site that sells custom (folding) curtains: no affiliate links, no spam, and the domain is 4 years old.

If anyone has recovered from this, please share how you did it.

Donna
msg:4509671 - 4:25 am on Oct 19, 2012 (gmt 0)

@lzr0, all I did was the home page.

What I did:

1. Reduced my outbound links (OBL) from 160 to 80

2. Replaced all my image tags with 1-2 word, very basic descriptions

3. Dropped my main KW anchor-text density to under 4% from 7% (sometimes the KW % doesn't matter, but this time it might have - I have seen websites rank well with over 14% KW density; see the density sketch after this post)

4. Watered down the page with about 2,000 extra words of still very relevant, good quality text

5. Made my URL (domain.com) the beginning of my topic

6. Resolved a lot of canonical issues across the entire website

7. Removed a lot of bold text

8. I had a dynamic website promotional panel in the top left corner; I removed it altogether

What I did not do during more than a year at -950:

1. Did not build a single link
2. Did not file a reconsideration request
3. Did not add new content
4. Did not panic!

How fast I recovered:

Less than a month after those changes I started to move: -500, -300, -150... and then I settled at pages 1-2 for multiple KWs. It took about 3 months altogether to reach the 1st page, and then I started to build links again :) - I still keep my Rank Tracker history and snapshots; it was pretty dramatic.

Most of the time it's probably either a canonical issue or overuse of anchor-text KWs on your page and in the links pointing to it.
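
For readers trying to quantify step 3 above, here is a minimal sketch of one simple way to measure a keyword phrase's density as a plain word-count ratio. The sample text is hypothetical, and the 4% / 7% figures quoted in the post are one site's observations, not official thresholds.

# Minimal sketch of a plain keyword-density check (hypothetical sample text).
import re

def keyword_density(text, phrase):
    """Percentage of the page's words taken up by occurrences of the phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words:
        return 0.0
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return 100.0 * hits * len(phrase_words) / len(words)

sample = "Blue widgets on sale. Our blue widgets beat other blue widgets."
print(round(keyword_density(sample, "blue widgets"), 1))  # 54.5 for this toy text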

gouri
msg:4509960 - 7:15 pm on Oct 19, 2012 (gmt 0)

@Donna,

Great post. I was hoping that I could ask you a few questions.

Most of the time it's probably either a canonical issue or overuse of anchor-text KWs on your page and in the links pointing to it.

Can you tell me if "anchor text KWs" means the keywords appearing in the body text of the page - "blue widgets", for example?

Also, does "links pointing to it" mean internal links going to the page, or inbound links? And were these links using, as anchor text, keywords (e.g. blue widgets) that the page is trying to rank for?

2. Replaced all my image tags with 1-2 word, very basic descriptions

By image tags, do you mean the alt text for the images?

7. Removed a lot of bold text

Did you have keyword phrases and related keyword phrases in bold? Or were entire paragraphs in bold that contained some of your keyword phrases?

[edited by: gouri at 7:42 pm (utc) on Oct 19, 2012]

lzr0
msg:4509963 - 7:31 pm on Oct 19, 2012 (gmt 0)

@Donna
All I did was the home page...


Thanks. All you described makes sense.

Donna
msg:4509974 - 8:06 pm on Oct 19, 2012 (gmt 0)

@gouri :

A. KWs in your page body and in the external links pointing to that page. Try to water that ratio down if possible.

B. Alt image tags

C. The phrases in bold were something like pointers on my page and were not there intentionally for ranking purposes, but there were about 15-20 repeats of the same phrase in bold. It was a dynamically driven PHP page.

Nichita
msg:4510000 - 9:55 pm on Oct 19, 2012 (gmt 0)

I was thinking I was the only one in this situation. I have multiple sites affected by this problem.

I suspect a few things:

1. Possibly excessive internal anchor links (in the case of one of the affected sites);
2. I have used Rank Tracker very frequently to check the sites' positions automatically;
3. I've logged in to my Google account via proxy IPs (possibly also used by spammers);
4. Possible duplicate content (a lot of scrapers have copied articles from my sites);
5. A high number of articles and a low number of backlinks.

The penalty is sitewide and affects 90% of my sites. A few of them have never been monetized and have just a few links / unique articles published.

In my case, the penalty is 100% not determined by link quality. Clean sites are affected.

I just suspect that somehow Google believes I am a spammer (automatic position checking / account access from different IPs, etc.).

Another weird thing about this penalty (in my case): the Facebook / Twitter / Google Plus pages are gone too when I perform a search for "website name" (as two words, not "websitename").

Those pages usually rank on the first page of Google for that term. It's as if Google simply wants to remove all mentions of these sites from the first pages of its results.

Nichita
msg:4510433 - 5:13 pm on Oct 21, 2012 (gmt 0)

Do you have any experience with this type of penalty?

1. Have you used any automatic position-checking software? Yes / No.
2. Have you logged in to your account from different IPs at the same time? Yes / No.
3. How many articles do you post per day?

I ask because I believe the affected sites have something in common. Maybe we can find the reason for this problem.
