
Google's 950 Penalty - Part 11

   
4:22 am on Jul 23, 2007 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



< continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

Just saw one 950+ and it does my heart good to see it.

User-agent: *
Disallow: /theirlinkpage.htm

No, I'm not saying that's necessarily why, but it would serve them right if it was, for playing dirty like that on a page that's supposed to have reciprocal links with people exchanging fair and square in good faith.
======================================================
Added:

And another 950+, the last site in the pack. Flash only (not even nice), with some stuff in H1 and H2 elements and one outbound link, all marked class="visible":

<style>
.visible{
visibility:hidden;
}
</style>
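For illustration only - a minimal sketch, and certainly not Google's actual check - here's how crude a hidden-heading detector can be: find CSS classes that hide content, then flag any H1/H2 that uses them. The class name and markup are just the ones quoted above.

import re

def hidden_classes(css):
    """Class names whose rules set visibility:hidden or display:none."""
    hidden = set()
    for name, body in re.findall(r'\.([\w-]+)\s*\{([^}]*)\}', css):
        if re.search(r'visibility\s*:\s*hidden|display\s*:\s*none', body):
            hidden.add(name)
    return hidden

def flagged_headings(html, css):
    """Text of H1/H2 elements carrying a hiding class."""
    bad = hidden_classes(css)
    pattern = r'<h[12][^>]*class="([^"]*)"[^>]*>(.*?)</h[12]>'
    return [text.strip()
            for cls, text in re.findall(pattern, html, re.I | re.S)
            if set(cls.split()) & bad]

css = '.visible { visibility: hidden; }'
html = '<h1 class="visible">keyword keyword keyword</h1>'
print(flagged_headings(html, css))   # ['keyword keyword keyword']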
=========================================================
Another one, way down at the bottom, is an interior site page that's 302'd to from the homepage and isn't at all relevant for the search term - it must have IBLs with the anchor text (not worth the time to check).

Yet another must also have anchor text IBLs (also not worth the time checking) and simply isn't anywhere near properly optimized for the phrase.

So that's four:

1. Sneaky
2. Spam
3. Sloppy webmastering
4. Substandard SEO

No mysteries in those 4, nothing cryptic or complicated like some of the other 950+ phenomena, but it's interesting to see that there are "ordinary" reasons for sites/pages to be 950+ that simple "good practices" and easy fixes could take care of.

The question does arise, though, whether the first two are hand penalties or whether something's been picked up algorithmically on them - in one case unnatural linking, and in the other, CSS spamming.

[edited by: Marcia at 4:46 am (utc) on July 23, 2007]

[edited by: tedster at 9:13 pm (utc) on Feb. 27, 2008]

5:53 am on Aug 4, 2007 (gmt 0)

5+ Year Member



Our site 1 has tags (we're thinking about removing them). Site 2 doesn't, so it doesn't make a lot of sense to blame "over-optimization", because we never used to care about that.

How long will we last? I do not know but I doubt that we can stand this for another 6 months.

11:01 am on Aug 4, 2007 (gmt 0)

10+ Year Member



My impression is that this is an attack on old-style SEO. Old-style webmasters are being affected by it.

I had to gut my site, leaving the keywords only in the <TITLE> tag and page name, and mentioned occasionally in the text. All other instances (alt tags, <H1>/<H2>, title attributes, keyword anchor text) had to be removed or altered where possible. A naive 21st Century webmaster wouldn't use these on his site, I think.
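For what it's worth, here's a rough way to audit where a phrase is concentrated before pruning it - a minimal sketch, assuming beautifulsoup4 is installed; "page.html" and "blue widgets" are placeholders, and comparing raw counts is just my idea, not any known Google threshold:

from bs4 import BeautifulSoup

def keyword_hotspots(html, phrase):
    soup = BeautifulSoup(html, "html.parser")
    phrase = phrase.lower()
    count = lambda texts: sum(t.lower().count(phrase) for t in texts)
    return {
        "title":    count([soup.title.get_text()] if soup.title else []),
        "headings": count(h.get_text() for h in soup.find_all(["h1", "h2"])),
        "alt_text": count(img.get("alt", "") for img in soup.find_all("img")),
        "anchors":  count(a.get_text() for a in soup.find_all("a")),
    }

with open("page.html") as f:
    print(keyword_hotspots(f.read(), "blue widgets"))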

The aim of SEO is to mimic a naturally popular and relevant site; old-style SEO doesn't do this any more. I think Google has figured out some over-optimisation/irrelevancy flags, and coded them into its algo.

I think Miamacs made a good point when he said that if your keywords aren't supported by inbound links, you may have a problem. They also should have variations of the keyword in the anchor text, as a naturally popular site would.

If you've got enough good, relevant links, you're immune, as these are the basis of the Google algo.

9:20 pm on Aug 4, 2007 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



"The aim of SEO is to mimic a naturally popular and relevant site;"

Maybe that is how it works on the planet Crapsite.

But here, the problem with 950 penalties is that they catch spammy crap sites, as they should, but they also catch some of the genuine pages the crap sites are trying to pretend to be.

The solution for crap sites with 950 penalties is basically "don't build a crap site". The solution for quality sites is totally, completely different.

10:24 pm on Aug 4, 2007 (gmt 0)

5+ Year Member



An important page is at 950. Two other pages for the same search are between #90 and #110. Has anyone had success with de-optimizing other pages for the sake of a more important page?

BTW, the results for this search (4 competitive words) look heavily over-filtered; total results are 2,500,000+ but at #550 it says "we have omitted some entries very similar to the 550 already displayed". Repeat: Some Entries!

1:35 am on Aug 5, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've now changed many sites I manage to get around the 950 penalty, and I've got to say, I see it as a pretty #*$!ty penalty. My experience is that it's a penalty for unnatural overuse of a word or phrase, especially in anchors, but also if a word is simply repeated more times in text than is natural.

My problem is, there is no way to enforce such a penalty, without lots of collateral damage.

Example.

I manage one site about alternative therapies - 50 different types: yoga, acupuncture, etc. The site details who the practitioners are, where they are, and so on.

So my site map goes like this

yoga town name 1
yoga town name 2
yoga town name 3
etc etc
acupuncture town name 1
acupuncture town name 2
acupuncture town name 3
etc

etc

There are no "Sorry, we are very interested in yoga town name but don’t seem to have any yoga town name practitioners perhaps if you are etc etc... target pages in the site map.

All the pages the site map links to have unique content, yet the site got hammered with an almost sitewide 950 penalty.

The problem, as far as I can see, is the overuse of the practitioner type in anchors. I can't leave it out, as it's too important to the user, and I would otherwise have duplicate anchors - the town name is the only difference between them.

I modified the site map so that the order of the towns and practitioner types is now random, as is the order of the words, e.g.

Yoga town name 1
Town Name 2 Acupuncture
Massage Town Name 3
Etc etc

with no more than 50 links per site map page.

A week after implementing this change, the 950 penalty disappeared - no other changes.
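For anyone curious, here's roughly what that reshuffle looks like in code - a minimal sketch, not the poster's actual script; the names and URLs are made up:

import random

def build_sitemap_pages(entries, per_page=50):
    """entries: list of (practice, town, url) tuples -> list of link pages."""
    links = []
    for practice, town, url in entries:
        # Randomly put the practice before or after the town name
        words = [practice, town]
        random.shuffle(words)
        links.append('<a href="%s">%s</a>' % (url, " ".join(words)))
    random.shuffle(links)  # randomize the order of the links themselves
    # Split into site map pages of no more than per_page links each
    return [links[i:i + per_page] for i in range(0, len(links), per_page)]

pages = build_sitemap_pages([
    ("Yoga", "Town Name 1", "/yoga/town-1.htm"),
    ("Acupuncture", "Town Name 2", "/acupuncture/town-2.htm"),
    ("Massage", "Town Name 3", "/massage/town-3.htm"),
])
for page in pages:
    print("\n".join(page))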

So instead of keeping my correct, descriptive site map, I have to forget about giving a priority weighting in it to the more popular towns and practitioner types, and settle for a rubbish sitemap.

It's rare for me to criticize Google, but in this case I have to. Collateral damage is always going to be an issue with filters, but in this case there are so many instances where the 950 penalty hits legitimate sites that repeat keywords in anchors that I've got to say I think they have got it wrong. I can get around it, but it's still wrong.

I am told Google reads all posts on this forum, I hope it is true. Google, you have got it wrong, too much collateral damage.

1:01 pm on Aug 5, 2007 (gmt 0)

10+ Year Member



SteveB, the Google algorithm isn't intelligent. A 'crapsite' can look just the same as a 7-year-old labour of love: not enough good inbounds, excessive keywords, unnatural internal linking.

Indeed, one could argue that the glory of the internet is that a 'crapsite' can compete with a PLC, and beat it.

Otherwise, it'd just be a computerised Yellow Pages; the company with the most money getting the biggest exposure.

8:53 pm on Aug 5, 2007 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



For every crapsite that looks like a genuine site there are 1000 crap sites that fail at fooling Google.

"the company with the most money getting the biggest exposure."

The companies with the best SEO, best promotion, best content, most crawlable designs, best business plan, etc., are the ones who get the most traffic.

10:03 pm on Aug 5, 2007 (gmt 0)

5+ Year Member



Gee, I didn't realize it was so simple. A site is either a "crap site" or a quality site, with nothing in between?
10:17 pm on Aug 5, 2007 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Where'd you get that idea?

It's true, though, that with the 950 penalty you've either got it or you don't. It's a drop off a cliff.

12:59 am on Aug 6, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



And for those who aren't clear on how steep that cliff can be, I'll give you one example...

I have one site that is a Google News source site. It gets articles into GNews with approximately a 10 minute delay --I publish an article, and within 10 minutes it's showing up in GNews.

Now, after the -950 has hit, when I do a text-snippet search for text in older articles, they are ALL filtered behind the "repeat the search with the omitted results included" thing. Clicking "repeat the search with..." shows the articles usually listed behind numerous PR-zero splog scrapers which have ripped off the articles, including one in particular I noticed last night where the entire right column contained nothing but spam links with the anchor text of "[N]ude [C]elebs" repeated 30+ times down the right margin.

So, in other words, a Google human thinks this site is good enough to be a GNews source site, but the algo thinks it's 950 trash.

THAT is how steep the cliff is.

2:03 am on Aug 6, 2007 (gmt 0)

10+ Year Member



So my site map goes like this

yoga town name 1
yoga town name 2
yoga town name 3

acupuncture town name 1
acupuncture town name 2
acupuncture town name 3

...The problem as far as I can see is the overuse of the practitioner type in anchors. I can't leave it out, as it's too important to the user and I will otherwise have duplicate anchors; the town name is the only difference between the anchors.

You can eliminate much of the redundancy without reducing usability if you group your navigation and use labels above each group:

yoga
town name 1
town name 2
town name 3
etc etc

acupuncture
town name 1
town name 2
town name 3

Of course, that would eliminate yoga and acupuncture from the anchor text, but you could reintroduce those terms on a more selective basis with links that aren't hardwired onto every page as part of the core navigation.
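A minimal sketch of that grouping, purely illustrative - the names and URLs are made up:

from collections import defaultdict

def grouped_nav(entries):
    """entries: list of (practice, town, url) tuples -> HTML nav string."""
    groups = defaultdict(list)
    for practice, town, url in entries:
        groups[practice].append((town, url))
    parts = []
    for practice, towns in groups.items():
        parts.append("<h3>%s</h3>" % practice)   # label carries the practice type
        parts.append("<ul>")
        for town, url in towns:
            parts.append('  <li><a href="%s">%s</a></li>' % (url, town))  # town-only anchor
        parts.append("</ul>")
    return "\n".join(parts)

print(grouped_nav([
    ("Yoga", "Town Name 1", "/yoga/town-1.htm"),
    ("Yoga", "Town Name 2", "/yoga/town-2.htm"),
    ("Acupuncture", "Town Name 1", "/acupuncture/town-1.htm"),
]))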

11:37 am on Aug 8, 2007 (gmt 0)

5+ Year Member



Hi guys,

I haven't posted for a while because I thought I was banging my head against a wall and that my positions would never change.

Then all of a sudden, 2 weeks ago, BANG - all of our pages were back in the SERPs like pre-Jan 2007.

Unfortunately I woke up this morning to find that we have been pushed back to pages 5 and 9 again.

Can't work out what's going on. Has anybody else seen a 2-week boost, only to be pushed back again?

What's it all mean?

Will we ever have a stable Google again?

12:05 pm on Aug 8, 2007 (gmt 0)

5+ Year Member



Hi jk3210,

I have exactly the same scenario as you. We have been in GNews since 2004, but in most cases scraper sites beat my positions in the natural SERPs.

5:54 pm on Aug 8, 2007 (gmt 0)

5+ Year Member



c41lum -

Sorry to read about your pain. A site I've been working on had suffered the -950 since December, and it also came back to life 2 weeks ago, on July 23rd.

It has risen in a series of steps since then.

Today saw a big step up, so they probably rolled out some changes that pulled you back down. Hope you get back out soon - it's a miserable feeling.

6:56 pm on Aug 8, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



c41lum-

Yeah, it's a lot of fun to watch someone else profit from your efforts, isn't it?

I don't think being scraped (even full-content scraping, which is the up and coming thing in the world of weasels) is the cause of the 950 penalty, but the 950 penalty will certainly magnify the damage scraping causes by 100-times over.

Additionally, it's hard to believe that 2,000-4,000 word articles written by professional journalists who don't know squat about writing SEO'd, keyword-driven copy would trip some "Phrase Based" filter, but that explanation is no better or worse than any other explanation I've heard --none of which really makes any sense.

8:02 pm on Aug 8, 2007 (gmt 0)

5+ Year Member



Argh, 950 again. I actually came back July 31 but now I'm down again. Watching sites or individual pages go down 950 then up again, then down, makes me wonder what G is trying to achieve here?
10:53 pm on Aug 8, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



econman,

On using labels, it's not as simple as that. On the same site I have location information, so a link to a town name needs to be a link to info about that town.

I am convinced that large sites that have to use duplicate anchors, offset by labelling, cannot compete with smaller sites set up more precisely - I've never seen an example where this was not the case. Of course, if anyone wishes to sticky me an example...

1:25 am on Aug 9, 2007 (gmt 0)

10+ Year Member



Hey Guys....

I was reading this thread from the beginning, and Tedster and SteveB have made a lot of wonderful contributions... but the one post from NetMeg at the very beginning of the thread - back on May 28, I think - seems to shed the most light on this "update".

I'll try to summarize NetMeg....

Two-word search terms on NetMeg's site seemed to have been -950'd.
However, reversing the order of the two words, the site still showed up at the top of the SERPs.

In my case... I have a KW that can be written as 1 word or split into 2 (that's the English language for you).
I reverse the split words when I type in the search... and show up at number 2.

I have optimized my page for the 1-word keyword, which most people use, with a secondary optimization on the 2-word keyword, which SOME people use - but not quite as often.

I type in the KW as 1 word... and I show up (sorry NetMeg) at number 8.

So... is this update - or the way to correct for this update - actually a "semantics" update?

Are we to think that a little tweaking of the 1-word search term that knocks us out can bring it back?

Does anyone have a variant of this that they can share?

Can anyone verify this anomaly with their own sites and experience?

Thanks
ARC

9:45 pm on Aug 14, 2007 (gmt 0)

10+ Year Member



I had a site come back after removing duplicate content. I have another, with more pressing duplicate content issues (competitors syndicated the content to article sites), that has not come back.

I was given absolutely no support from Google on either site, despite asking for help on one of them. DMCA complaints did not resolve the problem.

Of course, the one site has also not had the content replaced, as the scope is much larger.

My personal opinion is this is a duplicate content filter for SOME sites.

Check your content by searching for snippets from multiple pages in quotes and see if there is an issue.
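One way to run that quoted-snippet check across a lot of pages - a minimal sketch; the file name is a placeholder, and you still paste the resulting URLs into a browser yourself:

import re
import urllib.parse

def quoted_search_url(page_text, min_words=8):
    """Build a quoted Google query from the first long-ish sentence."""
    for sentence in re.split(r'(?<=[.!?])\s+', page_text):
        if len(sentence.split()) >= min_words:
            query = urllib.parse.quote_plus('"%s"' % sentence.strip())
            return "https://www.google.com/search?q=" + query
    return None

with open("article.txt") as f:
    print(quoted_search_url(f.read()))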

7:57 am on Aug 25, 2007 (gmt 0)

5+ Year Member



My site has finally returned. I think we dropped around the 18th of July or something; it came back yesterday. There is no doubt in my mind this penalty exists. We were ranked around 980 - Webmaster Tools even told us this for our main keyword. Yesterday we jumped up to 4th; the highest position we have ever had is 3rd, and that was before the drop. So it looks like we are back.

As with everyone else, there is no way to know what exactly you did to change this, or if anything you have done had an effect. First thing I did was to de-optimise. Frankly we are now so de-optimised that we shouldn't even be ranking for our main keywords.

I also found that a site we were doing a link exchange with was banned, and we were still linking to them, so I removed this link. I also removed 2 sitewide links that were pointing to us with our main phrase in the link text.

It could also be the mysterious sandbox effect that you hear about. This site is only 2 months old, so I dunno what is going on. But at least it's back, for now.

9:35 am on Aug 25, 2007 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Also it could be the mysterious sandbox effect that you hear about. This site is only 2 months old

There's nothing mysterious about the sandbox. You're in it.
12:23 pm on Aug 25, 2007 (gmt 0)

5+ Year Member



2 subfolders of an established site have gone 950; I've now lost pretty much all my hand-made sites. How stupid of me to think white hat works - I'm going back to black.
12:50 pm on Aug 25, 2007 (gmt 0)

5+ Year Member



I even lost a forum site because deleting spam messages caused "thread not found" pages and therefore duplicate content. The ...Google algo decided that I was trying to play tricks with too many "thread not found" pages and penalized my site by dropping it to the last page for every search...

[edited by: Robert_Charlton at 5:59 pm (utc) on Aug. 25, 2007]
[edit reason] removed Google-rant [/edit]

7:54 pm on Aug 26, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have just had this happen, on 21 August, to a site which is around 6 years old.

It is site-wide. Some of the terms for which the site has ranked at no. 1 since its inception all those years back now rank at 900+.

The site has been pretty much abandoned for 2 months - no new additions. Before that, new stories used to get added at least 4-5 times a week. I got busy with other things, but this has been a steady site forever, so I thought I could trust Google. Apparently not!

Important facts:

All pages on the site link to all categories in the site on the navigation bar. They also link to 10 related stories from the same category. Also, all stories link to the homepage.

The homepage has had 100 links or more for several months now. To categories as well as top stories from within each category.

What I have done:

Removed half the links from the homepage.

What I did some six months back:

Submitted the site randomly to some 50 directories on the Net. (Normally external links should not be able to damage a site, or anyone could do it to anyone, and all that - so it couldn't do any damage is what I thought.)

What I plan to do:

1. The ten related stories each story links to are in a sidebar include. The anchor text for all of them is the full story headline. I'm planning to change that and link with only a phrase from each headline, so there will be a link and some related non-linked text next to it. That might reduce the intensive internal linking problem, if it exists (see the sketch after this list).

2. Write a few new stories, and get links to them from social networking sites.
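A minimal sketch of point 1 - linking only the first few words of each headline and leaving the rest as plain text; the 3-word cut-off and the URL are arbitrary placeholders:

def phrase_link(headline, url, link_words=3):
    """Link only a short phrase from the headline; the rest stays unlinked."""
    words = headline.split()
    linked = " ".join(words[:link_words])
    rest = " ".join(words[link_words:])
    return ('<a href="%s">%s</a> %s' % (url, linked, rest)).strip()

print(phrase_link("Widget prices fall sharply across the region", "/story-123.htm"))
# -> <a href="/story-123.htm">Widget prices fall</a> sharply across the region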

But first, waiting to see if there is any quick result to the link reduction from the homepage.

If I am doing something which I absolutely should not, please advise.

7:34 am on Aug 28, 2007 (gmt 0)

10+ Year Member



I was just reading this thread last night...
My site got kicked off the first page about a year or so ago. I figured I had over-optimized the links to the site.

I checked to see if my site was on the last page. Sure enough it was.
Last night I decided to take out as many keywords as I could whilst still trying to make sense.

24 hours later it's gone from the last page straight back to the first!

I just looked at about 6 other sites I run; they all seem to have the same penalty. I will fix them tonight.

9:53 am on Aug 28, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Overoptimized links TO the site or overoptimized internal links?

I thought it was the second which led to -950..

1:33 pm on Aug 28, 2007 (gmt 0)

5+ Year Member



Gavolar,

did you remove keywords from the links to your site
or did you remove keywords on your site?

2:42 pm on Aug 28, 2007 (gmt 0)

10+ Year Member



Overoptimized links TO the site or overoptimized internal links?


did you remove keywords from the links to your site
or did you remove keywords on your site?

I think it was about a year or so ago that Google started putting penalties on over-optimized incoming links.
I have changed my incoming links ever since.
I didn't attempt to change any of the links to the site this time; I just changed the links and content on the site (internal).

However, I am starting to suspect it may be a coincidence, as it happened so fast and the Google cache is still showing a version from 10 days ago.

8:23 pm on Aug 28, 2007 (gmt 0)

5+ Year Member



My site, and other sites I was following, came back from -950 yesterday. I think this is a consequence of a change at Google and not a change in any of these sites.

My site was at -950 for 11 months; I hope this change is permanent...

11:20 pm on Aug 28, 2007 (gmt 0)

10+ Year Member



TaLu:

By any chance did your site come back for just 1 day a few weeks ago?
Mine did, and maybe this is just a temporary thing.
