
Google SEO News and Discussion Forum

Google's 950 Penalty - Part 14
arubicus

Msg#: 3691419 posted 6:20 pm on Jun 26, 2008 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

If I remember correctly, one thing pointed out in the many hundreds of comments on this topic is that there are more than likely different variations of this penalty. Some see it BEFORE the filters and others AFTER the filters. If that's correct, the AFTER-the-filters variety tends to be more about over-optimization, while the before-the-filter variety seems to be something else, or something found during (re)indexing (trust, PR, incoming links, dupe content, etc.). The before-the-filter variety also seems to be site-wide rather than tied to specific terms. This is just theory, though, based on the experiences shared in this thread.

The excessive navigation and keyword/phrase stuffing... if you read back, I can show you many low-popularity, low-PR, weakly linked sites that stuff away and remain where they are. If you are worried about duplicate blocks of text, also keep in mind that you will find tons of blogs and sites with duplicated article summaries - again with low PR, few links, etc.

What I have and haven't done so far (on site stuff):

1) Page titles use a different keyphrase order than the links to the page (the whole phrase can be found in both the title and the link, but in a different order). Page titles that omit parts of the phrases used in the links. Page titles with similar but different words than the links.

2) Navigation is now as simple as possible. There is no stuffing here and never really has been. No cross-linking into unrelated sections. Also tried cross-linking sections that are related within the same category - saw no difference.

3) Meta keywords... existed and are now removed. No difference. Tired of messing with them anyway.

4) Meta descriptions... all unique (Webmaster Tools reports one short description and no duplicates). The descriptions do contain the main phrase. Tried reordering and reducing occurrences relative to what is found in the title and links. I am holding off on removing or rewriting them completely across the site - a few thousand descriptions is just a bit much right now. (A simple duplicate-title/description audit is sketched after this list.)

5) Title of the page matched the h1. I have tested using differing h1s and titles - again using the same phrase words, as well as mixing them or keeping them completely different.

6) I do have a related-articles section below each article. I haven't removed it, as it is keeping us alive with traffic flow and visitor retention. This related-articles section of course uses the title of the article, which is in the h1 tag of the article. Tried different variations, mixes, and similar but not exact words/phrases.

7) I have tested sections with article summaries, links only, and unique descriptions. No go.

8) I have links at the bottom of my pages to TOS, Privacy, and About Us, all nofollowed and blocked by robots.txt. I don't use any other footer links... never have.

9) H1 tags are used for the title of each page. Tried various other tags - h2, h3, and the like - but no go. I have yet to remove the tag completely.

10) Duplicate content. Never really had any complete duplicates. My PHP scripts work in such a way that only one URL variation is accepted and will generate a page; query strings aren't allowed, and a 404 is sent automatically (see the sketch after this list). Yes, this works properly. Took down, and still fight, those darn content and site thieves. Never-ending battle.

11) Duplicate content again. We do have re-published content as well as our own. Going to work the re-published stuff out in our next step. Our own content is -950'd; whether it is fresh or old, it just gets -950'd. There is interlinking between re-published and unique content through the related-articles section. Also replaced re-published content with unique, on-theme content using the same URL (links already established to that URL). No go. We tried new URLs for new content and established links over time - -950'd from the get-go.

12) The site is done in CSS and HTML 4.01 Strict, both of which validate. With CSS I don't get tricky or use anything that would be considered spam, cloaking, black hat, or unethical. With CSS turned off, the page reads like a book, with the main navigation at the bottom. That navigation may appear as footer links - another thing I could test.

13) I do not use any cookies or sessions, except in one area with a form that is blocked by robots.txt.

14) Have tried static HTML pages rather than PHP-generated pages (.html extensions). No go.

15) I know there are more things but can't think of any right now.
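
Since items 4 and 10 come up a lot, here is roughly what I mean by each - minimal sketches only, with hypothetical paths and URLs, not the actual site code.

A duplicate-title/description audit over a folder of generated pages (item 4):

<?php
// Sketch only: scan a folder of generated HTML pages and report duplicate
// <title> or meta description values, similar to what Webmaster Tools flags.
// The directory path and the simple regexes are illustrative assumptions.
$titles = array();
$descs  = array();
foreach ((array) glob('/var/www/site/articles/*.html') as $file) {  // hypothetical path
    $html = file_get_contents($file);
    if (preg_match('#<title>(.*?)</title>#si', $html, $m)) {
        $titles[trim($m[1])][] = $file;
    }
    if (preg_match('#<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']#si', $html, $m)) {
        $descs[trim($m[1])][] = $file;
    }
}
foreach (array('title' => $titles, 'description' => $descs) as $label => $map) {
    foreach ($map as $text => $files) {
        if (count($files) > 1) {
            echo "Duplicate $label: \"$text\"\n  " . implode("\n  ", $files) . "\n";
        }
    }
}

And the query-string check from item 10 is about as simple as it sounds - something along these lines at the top of the page script (a sketch of the idea, not the exact implementation):

<?php
// Any request that arrives with a query string gets a genuine 404 instead of
// a page, so only the one canonical, parameter-free URL ever generates content.
if (!empty($_SERVER['QUERY_STRING'])) {
    header('HTTP/1.1 404 Not Found');
    echo '<html><body><h1>404 Not Found</h1></body></html>';
    exit;
}
// ...normal page generation continues here...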

Anyway, this has been done over the course of 2 years. Test... wait... test... wait... test... wait... Insanity is setting in :)

Next to work on is off site stuff.

Thoughts appreciated.

[edited by: tedster at 2:20 am (utc) on July 6, 2008]

 

kcguitar

Msg#: 3691419 posted 6:48 pm on Jun 26, 2008 (gmt 0)

arubicus

Your site sounds similar to my penalized site: a combination of reprint and original articles broken into sections. This site has bounced in and out of penalties for several years. I've also gone through most of the site changes you mentioned (validating, unique titles & descriptions...). I'm in the process of rewriting and updating all reprint articles to eliminate any duplicate content. Maybe it's too much old and duplicate content site-wide.

steveb

Msg#: 3691419 posted 7:13 pm on Jun 26, 2008 (gmt 0)

"I think the -950 penalty is about content not links. are you all 100% sure the content is all right?"

Hardly anything causes a -950 penalty. That means that if 100 websites do the same things with the same types of pages, only a smallish number will get the penalty. I've never seen -950 penalized pages that had any menu issues, but apparently some folks think that is the boogeyman, based on their own sites. All I've ever seen is that when pages score too high in an anomalous way for a term, or for variations on a term, Google gets skeptical and hammers the page. There are a gajillion ways to create an anomaly, though, which is kind of the point of an anomaly.

arubicus

Msg#: 3691419 posted 7:40 pm on Jun 26, 2008 (gmt 0)

kcguitar,

Yeah, that's one of my theories.

Re-published articles that used to rank or pass weight internally have been shot down. They just don't spread any weight across the site internally, even with good external links pointing at those re-published articles. The link weight does not get shared.

The split between unique and re-published articles is pretty even on our site. The downfall is that our site is spread wide, so this thins out the weighting even more.

So if I have a unique article that links to a re-published article, an amount of weight from the unique article gets passed to the re-published article and then lost. Link to two re-published articles and even more is lost.

Think of a section that links to 2 articles, one re-published and one unique. Weight gets passed to both and lost to one. The unique article links to the re-published one, so more is lost. Any incoming link weight to the article or section pretty much gets cut in half. Depending on your site structure, you can lose a lot of weight or possibly orphan many pages. (A toy calculation of this is below.)
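
To make that concrete, here is a toy calculation of the theory - my own made-up numbers, nothing Google has confirmed:

<?php
// Toy model of the leakage theory above: a section page with weight 1.0
// links to one unique article and one re-published article.  If weight
// passed to re-published (penalized) pages simply evaporates, half of the
// section's outbound weight is lost, and more leaks at the next hop when
// the unique article links to another reprint.  Numbers are illustrative.
$sectionWeight = 1.0;
$perLink       = $sectionWeight / 2;   // two outbound links from the section
$unique        = $perLink;             // 0.5 reaches the unique article
$lost          = $perLink;             // 0.5 "lost" to the re-published one
$lost         += $unique / 2;          // unique article links to a reprint: 0.25 more gone
$retained      = $unique - $unique / 2;
printf("retained: %.2f  lost: %.2f\n", $retained, $lost);  // retained: 0.25  lost: 0.75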

Another theory is to just remove any duplicate pages and then look at the structure of whatever you are left with. That could break up PR flow, orphan pages, or just create a mess.

Yet another theory is a penalty for linking to and/or from duplicated pages. Back in the day you could load deep re-published content onto your site by the thousands and keep your unique content higher up in the structure; any link weight from that re-published content would pass up to your unique pages, giving them a huge boost. So a penalty could have been applied to prevent this.

Just theories, of course. But a site-wide -950 penalty gets me wondering.

Whitey

Msg#: 3691419 posted 2:11 am on Jun 27, 2008 (gmt 0)

Yet another theory is a penalty for linking to and/or from duplicated pages. Back in the day you could load deep re-published content onto your site by the thousands and keep your unique content higher up in the structure; any link weight from that re-published content would pass up to your unique pages, giving them a huge boost. So a penalty could have been applied to prevent this.

Just theories, of course. But a site-wide -950 penalty gets me wondering.

If the greater portion of your site is non-original content, then the weakness of those pages will drag down the overall strength of the site's ranking. This can pull down apparently strong TBPR pages, and it can be site-wide.

Pointing strong links at these duplicate pages has no benefit until they are fixed - it will probably make things worse. Google likes unique content that users find useful.

So my view is:

Unique Titles
Unique Meta Description
Unique Content [ hand written - not auto generated ]
Remove pages with little content on them and merge them where possible into other pages.
Do not interlink pages with the same content. Even snippets could be a problem - as I think the phrase-matching patent seems to suggest.

The aim is for each page to score highly in its own right, and to contribute independently as part of a unique site.

Somehow, your content needs to be seen as an authority.

However, implementing this is often easier said than done.

OnlyToday

Msg#: 3691419 posted 3:20 pm on Jul 5, 2008 (gmt 0)

Having recently been released from the grip of a six-month -950 ordeal (I could have bought a new Toyota Prius with what I lost...), I feel obligated to share this. Please forgive me if I'm wrong, but speculation based on recent behavior is all we've got here.

Does your "laundry list left-hand column" use relative or absolute links? I think I got the penalty lifted by paring down the left-hand column drastically, but I'm now thinking that relative links might have worked just as well.

BTW, the penalty was lifted six weeks after the last significant changes I made to the site.

soxos

Msg#: 3691419 posted 10:01 pm on Jul 5, 2008 (gmt 0)

I might be stating the obvious, or something already stated here, but I may have another clue. On one of my sites, where some pages (20% at most) are -950'ed, I have noticed that most of these pages are not cached. What's more, whilst they have tons of unique content, the subject is very similar to the home page (e.g. a complete overview/explanation of 1,000 words).

OnlyToday

Msg#: 3691419 posted 10:44 pm on Jul 5, 2008 (gmt 0)

soxos' clue didn't apply to my site, which was -950'd uniformly: all the pages were down 80% from their previous (and current) Google traffic. Yet during the penalty all the pages remained indexed and cached, and my company name and domain were beautifully sitelinked. Though YMMV seems typical for these matters.

potentialgeek

Msg#: 3691419 posted 8:01 am on Jul 22, 2008 (gmt 0)

My main 950'd site has recovered for most KWs except several of the major KWs, including the primary KW for the industry: "widgets."

I used to be at about SERP #11 pre-950. Now I'm #130. Of course, "widgets" is the most competitive KW. It is common knowledge (or at least common suspicion) here that: 1) the 950 can apply to individual KWs as well as whole sites; and 2) there is a different algo for competitive KWs.

Has anyone else recovered from the 950 except for one or two highly targeted KWs? Or had that happen and then gotten a full recovery? I have no idea what to do next. Having recovered a lot of KWs, I'm reluctant to make major changes out of concern that the progress could be reversed. But it sure would be nice to get ranking for widgets again, and some of the others.

I don't know, frankly, whether the incomplete recovery is down to continuing 950 issues, or whether the main "competitive KW algo" (separate from the 950) now ranks sites differently from how it did before I visited 950 hell.

p/g

Whitey

Msg#: 3691419 posted 8:17 am on Jul 22, 2008 (gmt 0)

How long have your sites been recovered for?

Did you make any amendments?

If these have only just recovered, Google may still be recalculating the pages. It might be wise to be cautious rather than get too excited too soon. But it's good to see you are seeing some positive signs.

potentialgeek

Msg#: 3691419 posted 4:37 am on Jul 23, 2008 (gmt 0)

> How long have your sites been recovered for?

June 4 was the breakout day - ironically, at the same time other sites dived. I think a month is a reasonable amount of time to see changes; Google seems to do most of its stuff (updates) in a month or less.

> I might be stating the obvious, or something already stated here, but I may have another clue. On one of my sites, where some pages (20% at most) are -950'ed, I have noticed that most of these pages are not cached. What's more, whilst they have tons of unique content, the subject is very similar to the home page (e.g. a complete overview/explanation of 1,000 words).

Similarity is one of the issues webmasters suspect contributes to the 950 penalty, or the Phrase-Based Spam Penalty. One Google patent seems to indicate the idea behind it: spammers target like a shotgun rather than a rifle, aiming to saturate a wide swath of similar words and phrases (e.g. instead of just "car", also "vehicle", "auto", "automobile", etc.). Project that onto a large topic and every related search phrase... lots of very similar phrases... and that could get you 950'd/penalized.

Incidentally, Matt Cutts recently made a comment that makes me wonder if Google is more sensitive to phrase repetition (aka keyword density) than it used to be.

When weaving keywords into a main page, Cutts says, some zealous Web publishers will use the term over and over again. That's called "keyword stuffing." It's a big Google no-no that can have your site removed from the index. "After you've said it two or three times, Google has a pretty good idea 'OK, this page has something to do with this keyword,' " he says. "Just think about the two or three phrases you want to be known for and weave that in naturally."

(I recently got penalized for a page that used a phrase more than two or three times. I removed the repetition and bounced back completely.)
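
A crude check along those lines is easy to script. This is just a sketch of the idea - the URL and phrase are hypothetical placeholders, not any official density metric:

<?php
// Count how often a target phrase appears in a page's visible text, to spot
// pages that repeat it far beyond the "two or three times" mentioned above.
function phraseCount($url, $phrase) {
    $text = strip_tags(file_get_contents($url));
    return substr_count(strtolower($text), strtolower($phrase));
}
echo phraseCount('http://www.example.com/widgets.html', 'blue widgets'), "\n";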

> Remove pages with little content on them and merge them where possible to other pages.

I absolutely agree. Thin pages are high-risk pages, especially when there are lots of links but little content. Spammers quickly generate millions of worthless pages. If you are creating thin pages that aren't really justified (why a separate page if it has almost nothing on it?), you're at risk. The same goes for pointing a lot of links at a page with very little content. The Google algo has to wonder: how can that be justified? Why are they linking to nothing - unless it's to generate more internal links to a page in the hope of higher rankings?

> 6) I do have a related article section below articles. I haven't removed them as this is keeping us alive with traffic flow and visitor retention. This related article section of course uses the title of the article which is in the h1 tag of the article. Tried different variations, mixes, similar but not exact words/phrases.

This you might need to change. I think my changes to the related-articles section freed me from the 950. I had duplicate anchor text on the same page. I didn't mess around with changing the anchor text except to use an icon with a link: I stripped the link from the anchor text entirely, so the title still appears as plain text right beside the icon. For related documents you can just use a small file icon (like many news sites do). It's so common it won't confuse users.

Related articles

[icon+link] Article title [no link]
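
A sketch of how that layout might be generated (hypothetical URLs, titles and icon path - an illustration only, not the actual markup):

<?php
// Only the small file icon carries the link; the article title sits beside
// it as plain, un-anchored text.
$related = array(
    '/articles/blue-widgets.html'  => 'Choosing Blue Widgets',
    '/articles/widget-repair.html' => 'Widget Repair Basics',
);
echo "<h3>Related articles</h3>\n<ul>\n";
foreach ($related as $url => $title) {
    echo '<li><a href="' . htmlspecialchars($url) . '">'
       . '<img src="/images/doc-icon.gif" alt="article" /></a> '
       . htmlspecialchars($title) . "</li>\n";
}
echo "</ul>\n";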

p/g

atlantis76

Msg#: 3691419 posted 8:06 am on Aug 4, 2008 (gmt 0)

I'm out of the -950 penalty after 3 years! Oh my, I've been waiting so long!

I have had mysite.com under a penalty ever since I used some unacceptable practices to promote it, several years ago. I was restless back then and obviously didn't know my limitations.

Ever since, the site would come up for a search on "mysite.com", but if I searched for "my site com" I wouldn't see it. Needless to say, the site didn't come up for any other search and was stuck down at the end of the SERPs.

Thinking about it, I haven't done anything special lately (I am building a new site from scratch these days, but it's not yet finished or uploaded) - except for one thing: buying a link for it in Yahoo's Directory.

Whether or not this comeback resulted from that Yahoo link, I did want to report it.

If there are any questions I can answer to help you guys, you're more than welcome.

Regards
Assaf

[edited by: tedster at 8:41 am (utc) on Aug. 4, 2008]
[edit reason] moved from another location [/edit]

efendi

Msg#: 3691419 posted 9:51 am on Aug 4, 2008 (gmt 0)

Hello,
first of all, excuse my bad English - I come from Germany. I have learned a little SEO here, and much more English, from reading this forum. Thanks for the good work here.

"Whether or not this comeback resulted from that Yahoo link, I did want to report it."

Is this possible? I have had a link from the German section of the Yahoo directory for a long time, and nothing has happened. My site has now been gone for 16 months...

Greetings,
efendi

[edited by: tedster at 9:54 am (utc) on Aug. 4, 2008]

tedster

Msg#: 3691419 posted 10:03 am on Aug 4, 2008 (gmt 0)

Welcome to the forums, efendi. Yes, it sometimes seems that one good link with the right anchor text has bumped a site out of the -950 penalty. But not always - not by far. So it's not that simple in many cases.

This very long discussion has now been going on for 18 months, so that shows you how difficult it can be.

Here's a summary [webmasterworld.com] of the discussion, and here's the beginning [webmasterworld.com].

efendi

Msg#: 3691419 posted 10:33 am on Aug 4, 2008 (gmt 0)

Thank you, tedster.
I have read this discussion so many times, you wouldn't believe it.
I have tried everything, I have good links (my competitors even give me some good links), and now I think I have the best site on earth :-)

(The site is over ten years old, and before the penalty caught me it was one of the strongest sites in the German travel sector.)

The only thing I can do next is cross my fingers and wait.

Greetings

Crush

Msg#: 3691419 posted 10:52 am on Aug 4, 2008 (gmt 0)

A site went down over the weekend, but I wouldn't even call it -950 - it's far worse:

cache: = nothing
site: = 9 pages (previously 7,000)
backlinks: still there
PageRank = PR 7, but probably not after the next update

Looks like a manual ban to me. Opinions please?

JoeSinkwitz

Msg#: 3691419 posted 2:03 pm on Aug 4, 2008 (gmt 0)

Crush,

I've seen that happen in two ways:
1. The site comes back like nothing happened, as though a data glitch dropped everything and then re-added it for that one domain.
2. Tons of dupe pages get nuked and never return.

I hope you get the first scenario and see a recovery in a couple of days; 2-4 days is my usual waiting period, depending on how often the site was being recached.

silverbytes

Msg#: 3691419 posted 8:59 pm on Aug 4, 2008 (gmt 0)

<moved from another location>

I'd like to hear experiences from -950 penalized sites: how long did it take to recover rankings after the cause was corrected?

Once the "offending" cause is fixed, I assume the site should recover its ranking health, and I assume some time passes while the site is crawled and updated, but I also suspect there must be a "punishment" component in the timing too.

In your case, how long was it? One month? Three months?

[edited by: Robert_Charlton at 9:19 pm (utc) on Aug. 4, 2008]

OnlyToday

Msg#: 3691419 posted 3:40 am on Aug 5, 2008 (gmt 0)

My site emerged from the 950 six weeks after I had given up and had stopped trying to fix it. I think, but am not sure, that I shook the penalty by reducing the size of my internal menu.

My traffic is now about 80% of what it was before the penalty hit, but during the penalty it was ~20%.

rowtc2

Msg#: 3691419 posted 7:39 am on Aug 7, 2008 (gmt 0)

They are making the algo for the 950 penalty more refined. I am convinced they read all these posts to learn what happens after their actions and to make their filters stronger.
They also add some randomness to the "healthy" SERPs and to the 950'd SERPs so the algorithm is hard to figure out.
The rules for keyword density are not the same for leading sites and for little sites.

I have seen some pages sitting in different positions in the SERPs on different days, and I think I pushed keyword density and keywords in anchors a little too far - nothing very spammy, though.
I made some changes to my site at the beginning of August.

Within 2 days of those modifications, visitors from Google dropped 100% (from 10k/day). I have reverted the changes and now I am waiting.

Has anyone timed how long it takes to escape from filters or penalties? I have seen competitors escape from penalties, and others who are continuously in Google Hell.

JackR

Msg#: 3691419 posted 1:28 pm on Aug 16, 2008 (gmt 0)

On Monday 11th August my site was hit with what I believe to be the -950 penalty and vanished from the SERPs for keyword searches within a matter of hours. A few internal pages do appear in the 900s for searches but the homepage is nowhere to be found.

Fortunately for me the homepage and eight sitelinks are still in the index, and are returned first when doing a site: or example.com search.

Following a reply from Google's John Mueller to a forum post, I'm reasonably sure the penalty was applied due to the existence of links that were purchased two years ago and to the presence of a "Links Exchange Directory" on the site. Interestingly enough, the paid links were cancelled back in 2006 but are still present on a network of sites based in Canada and the USA. My requests to have the links removed have been ignored.

In an effort to fix whatever it is that the Big G doesn't like, I have literally neither slept nor eaten whilst working through navigation, linking and keyword hyperlinking over the past week. I've manually gone through almost 300 pages searching for errors and documenting changes that have been made for inclusion in a Reconsideration Request. Having spent so much time going through the site, I've actually come to the conclusion that this has been a very useful albeit steep learning curve.

Having been kicked from Google following 72 hours of server downtime and a brand new site design, it really hurt to get pulled from the index. Bizarrely enough, the sensation of seeing your organic traffic drop to zero can only be described as akin to bereavement. It really is a grieving process until you decide to fight back and rebuild what's been lost. Then there's the need to prove to yourself that all the hard work has been worth it, by at least getting Google to recognise and reward your site accordingly. That's why on Monday I'm going to file a two-page Reconsideration Request and report back if and when there is any improvement.

rowtc2

Msg#: 3691419 posted 7:12 pm on Aug 16, 2008 (gmt 0)

I am sure a link exchange can hurt a site in some conditions, but one-way links should simply be ignored - not penalized.
Otherwise a competitor could ruin your SERPs, and that should not be allowed by a search engine.

rowtc2

Msg#: 3691419 posted 7:33 pm on Aug 21, 2008 (gmt 0)

This morning my site reappeared for its main search queries, after 2 weeks of depression and sadness - many hours of work, and then boom, a 100% traffic drop from Google. It looks to be out of the 950 penalty now (for all my important searches I was in position 900-1000; now I am back in the top 10 or top 20).

I had tried to stabilize yo-yo results in the SERPs by over-optimizing with keywords, and earned a filter. I think the yo-yo effect is normal and has to be accepted, because we don't yet know how to make rankings stable. It's good in a way, because it prevents the bigger sites from having a monopoly.

I removed all pages with poor content, reduced keyword density drastically, changed titles and meta tags, and have no duplicate titles in WMT.
What can I say - I have gone from disaster back to good days; now I have money to pay for the server, at least.

Also, I have noticed unnatural optimization (forced keyword and internal anchor density) on big old competitor sites, but they are not affected; they are more "friendly" with the Google algo.

Now I am very happy, and I want to thank tedster, potentialgeek and the other members whose useful posts helped me get out of prison - and, of course, Google :).

bwnbwn

Msg#: 3691419 posted 8:03 pm on Aug 21, 2008 (gmt 0)

JackR, that is the spirit that will get you out of the filter. And rowtc2,
congratulations.

JackR

Msg#: 3691419 posted 8:14 pm on Aug 21, 2008 (gmt 0)

Congratulations rowtc2!

That really is fantastic news and I'm sure you are very happy now.

Can I ask: did you submit a Reconsideration Request using Webmaster Tools?

rowtc2

Msg#: 3691419 posted 5:40 pm on Aug 22, 2008 (gmt 0)

JackR, I don't think submitting a Reconsideration Request will solve the issue. If the site is still over-optimized, they have no reason to show you in the SERPs, and I don't think they will tell you what to do to avoid being filtered.

I did not submit a Reconsideration Request, but I cleaned up my site - being brutal with myself, as tedster has suggested - as if I were preparing to submit a reinclusion request to their search engine.

Of course, a lot of pages were lost and traffic is lower, but now I am working on delivering useful content to users. Growth will be slower, but more stable, I hope.

Try not to sink into sadness. Find all the elements that serve something other than the user; the problem has a key that unlocks it, and only you can find it, with the site right there in front of you.

potentialgeek

Msg#: 3691419 posted 4:22 pm on Aug 23, 2008 (gmt 0)

I think the 950 penalty is determined by an advanced algorithm whose output a human review could not easily evaluate. It's not as if there's one issue that can easily be spotted in a human review, where the reviewer can immediately say, "Oh yes, it's fixed, and it should no longer be penalized."

I didn't file reinclusion requests for any of my recovered sites.

p/g

Northstar

Msg#: 3691419 posted 8:40 pm on Sep 9, 2008 (gmt 0)

<moved from another source>

I have two similar-style directory sites. One lost 70% of its Google traffic last year, and the other just lost 70% of its traffic this year. Both sites had their top keywords drop from #1-#5 down to #500-#600, and some terms vanished completely. Both sites still have some top keywords remaining, but they are not popular KWs. I thought maybe the problem was my keyword density, but when I checked my various URLs with KW density programs, the highest word density on a couple of pages was maybe 10%. Does this sound like a -950 penalty or something different?
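
For reference, the arithmetic those density tools report is usually just this - a sketch, with a placeholder URL and keyword:

<?php
// Simple "keyword density": occurrences of a single word divided by the
// total word count, expressed as a percentage.
function keywordDensity($html, $word) {
    $words = preg_split('/\W+/', strtolower(strip_tags($html)), -1, PREG_SPLIT_NO_EMPTY);
    $hits  = count(array_keys($words, strtolower($word)));
    return $words ? 100.0 * $hits / count($words) : 0.0;
}
printf("%.1f%%\n", keywordDensity(file_get_contents('http://www.example.com/'), 'widgets'));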

[edited by: Robert_Charlton at 1:02 am (utc) on Sep. 10, 2008]

Robert Charlton

Msg#: 3691419 posted 1:19 am on Sep 10, 2008 (gmt 0)

keyword density

Northstar - It's hard to generalize on the basis of any one factor. All we know for sure about the "penalty" is that it's an over-optimization penalty of some kind. It's also been observed that inbound-linking weakness, excessive repetition of keywords in menus, etc., might be contributing factors.

The penalty or filter (or combination of these) is phrase specific, and it generally affects the most competitive phrases, so a page can do quite well for one phrase yet completely tank for another.

I believe that Google has been increasingly applying its filters to less competitive phrases, either because the growth of the web has made those phrases more competitive, or because Google is systematically revisiting all of its algo factors and refining them.

What you describe does sound like a classic -950 or end of results "penalty." Your previously high rankings are part of the pattern.

potentialgeek

Msg#: 3691419 posted 3:43 am on Sep 10, 2008 (gmt 0)

Keyword density may be a factor in the 950 penalty. I had one page that was #1 for one phrase, and I decided to add more content to the page. Then it disappeared from the SERPs. The new content included more uses of the competitive phrase, but not in a typical spammy way; I had just written more, and done so naturally.

Then I reverted the page to its pre-development state (when it was #1), and shortly afterwards got the ranking back. The only changes made to the page before the penalty were to on-page text - not links (anchor text) or anything else. That's why I suspect it was a keyword density issue, or something like it.

I don't necessarily believe it was keyword density per se, because that has often sounded like an internet myth that Matt Cutts and others never confirmed. Google probably has different and more intelligent ways to detect spam than plain keyword density.

The surprising thing was the phrase in question isn't that competitive.

p/g
