Google's 950 Penalty - Part 12
onetry




msg:3492371
 10:24 am on Oct 31, 2007 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

It makes no sense for me to worry about TBPR when I've been badly hit by a -950 penalty (look! my PR rose on almost all pages ... but as I said, who cares?).

Please, all -950ers, come here and join this thread so we can group the possible causes.

Here are mine:

1) E-mail to the AdSense team about an account creation with my domain name
2) Too many AdSense boxes
3) Mildly over-optimized pages
4) Titles that are too similar
5) Some directory links (though almost all my competitors have them too)

I'll add that no big changes were made in the last few months!

Join the -950ers :-)

[edited by: tedster at 9:08 pm (utc) on Feb. 27, 2008]

 

brinked




msg:3515418
 6:03 pm on Nov 28, 2007 (gmt 0)

Has anyone ever considered that the 950 penalty is, or is related to, being sandboxed?

I bring this up because I have done a lot of research on this, and it seems that the "950 penalty" symptoms are very similar to the "sandbox" symptoms. The only major difference is that the sandbox is often discussed when a website is relatively new, but who says established sites can't be sandboxed anyway? Excuse me if I am beating a dead horse here and this has already been discussed; I'm new here, but not new to the SEO forums world.

CainIV




msg:3515425
 6:08 pm on Nov 28, 2007 (gmt 0)

I think the penalty being lifted also corresponds to the date at which Google next caches your website.

Possibly, if there is a correlation with a reinclusion request, Google 'removes the flag' on the website and waits for the next cached snapshot of it.

My sense is that the reinclusion request is not needed - that the calculation is done algorithmically - and so the lifting (IMHO) would correspond with the date that Google caches the new changes and adjusts rankings based on that.

If it takes 6 weeks for Google's spiders to hit your site and then cache the affected page with the newest changes, then I believe that is how long it would take for the penalty to be lifted, if the penalty occurred because of on-site issues such as keyword repetition in the nav.

SEOPTI




msg:3515480
 7:08 pm on Nov 28, 2007 (gmt 0)

Karma, yes, since Oct 28th 50% of my sites have also gone down the toilet (-950), but I don't think they calculate a text-to-links ratio.

I think it was an improvement to their co-occurrence filters.

lorien1973




msg:3515590
 8:07 pm on Nov 28, 2007 (gmt 0)


I think it was an improvement to their co-occurrence filters.

I've heard of this several times on here but I'm not sure if I totally understand co-occurrence or not. Is it picking apart groups of text to see if (anchor text) is related to adjacent (or nearby) terms?

tedster




msg:3515636
 9:04 pm on Nov 28, 2007 (gmt 0)

Term co-occurrence is an information retrieval measure of what other words commonly occur in a document along with a given keyword phrase. This doesn't mean stemmed or morphologically related versions of the keywords, but completely different terms that nevertheless have a meaningful, semantic relationship to the original keyword. Spam can sometimes be detected because the document goes way too far in either direction: 1) no natural co-occurring terms (stuffing) or 2) way too many (autogenerated pages of scraper snippets can do this.)

Co-occurrence doesn't just look at anchor text, although with the Google algo being so sensitive to links, I'm sure they look there very closely. Google's approach is an evolution of traditional co-occurrence measures, which actually pre-date the web.

Check this thread for a fuller discussion:
Phrase Based Multiple Indexing and Keyword Co-Occurrence [webmasterworld.com]
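
To make this concrete, here's a minimal sketch of document-level co-occurrence counting. It is purely illustrative - the tokenizing, the whole-document "window", and the sample documents are my own assumptions, not anything Google has published.

----------

<?php
// Minimal co-occurrence sketch: for each document containing the target term,
// count which other terms appear alongside it. The tokenization and the
// whole-document window are assumptions for illustration only.
function cooccurring_terms(array $documents, string $target): array {
    $target = strtolower($target);
    $counts = [];
    foreach ($documents as $doc) {
        $words = preg_split('/\W+/', strtolower($doc), -1, PREG_SPLIT_NO_EMPTY);
        if (!in_array($target, $words, true)) {
            continue; // only documents that mention the target contribute
        }
        foreach (array_unique($words) as $word) {
            if ($word !== $target) {
                $counts[$word] = ($counts[$word] ?? 0) + 1;
            }
        }
    }
    arsort($counts);
    return $counts; // other terms, ranked by how many documents share them with the target
}

$docs = [
    'Small widgets with brass gears and a mounting bracket',
    'Widgets widgets widgets cheap widgets buy widgets now',  // stuffed: almost no co-occurring terms
    'How to assemble widgets using gears and brackets',
];
print_r(cooccurring_terms($docs, 'widgets'));
// "gears" co-occurs in 2 of the 3 documents; the stuffed page contributes almost nothing.

----------

A stuffed page scores near zero on natural co-occurring terms, and a scraped mash-up scores absurdly high - the two spam directions described above.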

nippi




msg:3515724
 10:47 pm on Nov 28, 2007 (gmt 0)

Absolutely, do a reinclusion request. If someone at Google is able to check your site, and it is marked by Google as:

(1) being subject to a 950 penalty
(2) recheck in 30 days

then being able to ask Google to press the "re-check it now" button means you will be out faster.

Don't know for sure if this is what happens, but it seems to match my experience.

Some filters like +30 seem to have a definite minimum period where the filter is left applied no matter what... whereas the 950 seems to have no such time period.

steveb




msg:3515758
 12:27 am on Nov 29, 2007 (gmt 0)

A reinclusion request for a 950 penalty will normally be inappropriate. You don't do anything "wrong" to get a 950 penalty, and reinclusion requests are really intended for deliberate rule breakers. (That doesn't mean you can't submit one, or that it won't work.)

potentialgeek




msg:3515808
 2:01 am on Nov 29, 2007 (gmt 0)

Absolutely, do a reinclusion request. If someone at Google is able to check your site...

"If" being the key word.

Are all reinclusion requests reviewed by a person instead of a computer? I'd like to believe somebody checks through every single one, but it seems like wishful thinking at a company which is understaffed in obvious places, leading to boilerplate email replies, etc.

If you imagine how many requests they get, and guess how many staff they can allocate to checking reinclusion requests, and how long it'll take to look at every single site...

How do we know the review of requests isn't automated? I don't see why a human review would even be necessary; the algo can quickly figure out whether the changes were made. Why get a human to do what a computer can do?

Or do we believe the admission of guilt is the big issue, and only discernible by a "priest" like Google? ;/ I think the whole mea culpa setup is patronizing and silly.

You know when you Add URL at Google, it has a comments box? You think someone at Google reads that before deciding to crawl your site?

Did anyone ever submit a RR and get a non-canned email response from Google to it (to indicate human reviewing)?

Did it say something like, "We see you made important changes, but don't feel like the apology is fully sincere... please try again."

(Like George Costanza on Seinfeld.)

p/g

lorien1973




msg:3516281
 4:19 pm on Nov 29, 2007 (gmt 0)


Did anyone ever submit a RR and get a non-canned email response from Google to it (to indicate human reviewing)?

Did it say something like, "We see you made important changes, but don't feel like the apology is fully sincere... please try again."


In 2005, I submitted one and got a human response. I actually had a conversation of about 3-4 emails with a Google "engineer" who took a look at my site. He wasn't helpful, and I got the feeling he was just going through the motions (or was totally uninformed), but at least it was a human.

The only emails I get from Google now are from AdSense, asking me to put more AdSense blocks on my site (I'm using 1 instead of the allowed 3 - the horror!).

Karma




msg:3516291
 4:26 pm on Nov 29, 2007 (gmt 0)

In 2005, I submitted one and got a human response.

Three years is a long time, so I'm not holding my breath for a human response, but I'll let y'all know if I get one.

2) way too many (autogenerated pages of scraper snippets can do this.)

Due to the nature of my site, this could well apply to me, and there's not a great deal I can do about it for now. Any ideas on what I can do to these pages? Can I get the Google penalty algo to ignore them?

petehall




msg:3519283
 2:12 pm on Dec 3, 2007 (gmt 0)

Site now showing signs of lifting on certain phrases - encouraging, but I'm not getting too excited as yet.

SEOPTI




msg:3519694
 10:07 pm on Dec 3, 2007 (gmt 0)

No sign of recovery here .... seems stale.

potentialgeek




msg:3519870
 3:48 am on Dec 4, 2007 (gmt 0)

Is the standard to get the 950 penalty lifted much higher than the standard to get 950'd?

Seems like you could do 25 things to get 950'd, but if you correct all 25, it's not enough. Maybe 25 and then another 25 "corrections."

So what do you have to do? Keep stripping more and more "SEO" parts until it's lifted?! Hoping all the while that the site still functions by the time it's lifted.

p/g

petehall




msg:3520038
 11:37 am on Dec 4, 2007 (gmt 0)

Back into oblivion - perhaps you needed to be fast to see it.

Karma




msg:3520277
 5:21 pm on Dec 4, 2007 (gmt 0)

Signs of recovery here, not sure if it's due to the reinclusion request or the many SEO (reduction) changes that I've made.

Will keep you posted.

potentialgeek




msg:3520752
 9:33 am on Dec 5, 2007 (gmt 0)

"De-optimizing" continues on one of my sites. It's the most boring web work I've done in years. Ugh. Not knowing if it's going to be effective, or "stick" if it is effective for a while, doesn't help, either.

I guess what I'm learning is to think, whenever building a site, "Could Google possibly interpret this site as spam now or in the future?"

If I'd done that before (which is easy to miss when you can't see the forest for the trees), I wouldn't be in this mess now.

p/g

steveb




msg:3521232
 10:15 pm on Dec 5, 2007 (gmt 0)

"Could Google possibly interpret this site as spam now or in the future?"

I'd consider that important only if you do spam.

The important question is:
"Could Google possibly MIS-interpret this site as spam now or in the future?"

This is by far a more challenging problem. Because of how poorly and randomly this penalty is applied, we now have to consider the risk of linking to a page internally as "Iraqi war" when we have a bunch of external links pointing to it with the text "Iraq war". It's an entirely absurd exercise, but now an important one.

SEOPTI




msg:3521374
 2:25 am on Dec 6, 2007 (gmt 0)

I'm still trying to find out why some sites enjoy this penalty sitewide starting from the root index and other sites just for certain phrases.

Maybe the sites that are affected only for certain phrases just present the problematic URLs deeper in the navigation, not linked directly from the index document?

ALbino




msg:3521386
 3:05 am on Dec 6, 2007 (gmt 0)

I still can't believe nobody has solved this problem. Maybe we should pool our resources and just bribe some Google employee for the answer? :) The amount of money my site has lost due to this -950 problem is more than most Google employee salaries. It's out of control, with no end in sight.

[edited by: ALbino at 3:05 am (utc) on Dec. 6, 2007]

SEOPTI




msg:3521392
 3:27 am on Dec 6, 2007 (gmt 0)

Google should check local sites that present local companies in yellow-pages style.

Those sites have been hit heavily. A business name should not be counted for co-occurrence.

This is a flaw and a weakness of their -950 penalty.

tedster




msg:3521527
 9:49 am on Dec 6, 2007 (gmt 0)

I'm still trying to find out why some sites enjoy this penalty sitewide starting from the root index and other sites just for certain phrases.

I wish I had even a whisper of an answer for that. I think it's a most important question.

It could be that there are two entirely different situations being addressed by one "penalty process", but that's just a wild guess and it doesn't say anything very helpful.

Phrase based indexing looks to be a likely candidate for the "certain phrases" variety of the -950. But why would Google treat an entire domain with end-of-results re-ranking when even some very intense link manipulations have only earned the domain a -30?

potentialgeek




msg:3521639
 2:22 pm on Dec 6, 2007 (gmt 0)

I don't think Google engineers put a lot of thought into the 950 penalty. Like other coding glitches lately, it was just another "brainwave" that was rushed to implementation.

An algo that treats internal linking or anchor text as 100X more important than the site's age, years of trust, navigation, inbound links, etc., is completely absurd.

Google's own webmaster guidelines say a site's navigation should be easy. Repeated keywords in a site's menu do not confuse visitors or cause navigation problems. Yet Google thinks they do; that's why the site gets whacked with a 950.

You can have broken links or dead links and not get 950'd. But repeat keywords, which are irrelevant to navigating a site, and, whack!, you're sent to SERP hell.

Instead of simply ignoring the supposed value of link-text duplication, the way it largely does with repeated keyword text, Google hands out an extreme penalty.

I'm sure Google received 950,000 complaints about sites with keyword text repetition, and just as many from their psychiatrists, and decided, "Let's do a 950 Penalty."

Mind-numbing brilliance!

If Google could get back to the basics and focus on end-user experience, not some geek vendetta on SEO-crazed webmasters, we wouldn't have to waste time getting its ridiculous penalties lifted.

Has anyone ever considered that the 950 penalty is, or is related to, being sandboxed?

Not yet, but I'm open to the idea.

I bring this up because I have done a lot of research on this, and it seems that the "950 penalty" symptoms are very similar to the "sandbox" symptoms. The only major difference is that the sandbox is often discussed when a website is relatively new, but who says established sites can't be sandboxed anyway?

Explain? Some sites that weren't changed in years got 950d, and others that made a few changes got the 950 lifted quickly...

p/g

aok88




msg:3521696
 3:50 pm on Dec 6, 2007 (gmt 0)

I have had a very similar thing happen to me - we have a 7-year-old retail site (1000+ pages indexed) with no AdSense that scored on page 1 for years for hundreds of terms. A couple of days ago, all those terms suddenly started showing up on pages 5 to 7 - not 950+, but positions 50 to 70, which might as well be 950.

One odd thing about this site is that the homepage and a number of first level directory pages are about another, unrelated topic/industry. So this site that just plummeted in the SERPs actually has two unrelated, different industries on the same domain. The subject that has seen all its keywords badly affected makes up the bulk of the pages. The subject that has NOT seen any rankings changes for the worse is on the homepage and about 30 first level pages, and that's it.

The strange thing is, the keyword phrases found on the homepage and related inner pages all held on to their rankings.

Another site of ours, with pretty much all the same stats, including the exact same industry, has not been affected at all. But that site only has one subject on its domain.

I read others here who have said that this 'penalty' could affect sites that have different topics on the same domain. Anyone have any thoughts on my situation?

randle




msg:3521722
 4:28 pm on Dec 6, 2007 (gmt 0)

I don't think Google engineers put a lot of thought into the 950 penalty.

I don't think they did either, but they have put a great deal of thought into penalties in general. Ever since Florida, it seems that whenever they need a new fix they add another penalty element to the algorithm (some quite subtle, others, like the 950, quite severe). Instead of adding up all the "signals of quality" (add up the good and ignore the bad) and then ranking everyone (like in the old days), now it's just as much about ranking you somewhere because you tripped some sort of threshold (or, unfortunately, a perceived threshold).

lorien1973




msg:3521772
 5:28 pm on Dec 6, 2007 (gmt 0)


A couple of days ago, all those terms suddenly started showing up on pages 5 to 7 - not 950+, but positions 50 to 70, which might as well be 950.

Welcome to my world.


I read others here who have said that this 'penalty' could affect sites that have different topics on the same domain. Anyone have any thoughts on my situation?

It could be a topical deal. I have 2 sites. One is not affected at all. The unaffected site would be considered a single-"topic" domain, as every other domain that is similar has the exact same set of keywords (due to the nature of the products). The site that got whacked is more broadly focused - hitting on hundreds of different keywords that don't, I think, fit under a single niche.


I don't think Google engineers put a lot of thought into the 950 penalty.

I don't think Google put a lot of thought into sitelinks either. Ramping up domain-name relevance to push sitelinks out there has rewarded doorway pages and domain squatters with no content besides MFA. I think sitelinks are an experiment that needs to be dumped.


I don't think they did either, but they have put a great deal of thought into penalties in general.

I think you are right. Google (it seems) used to wield its algorithm like a surgical knife, cutting away only what is necessary. Now they seem to wield it like a sledgehammer. I understand the reasoning - who is going to miss "joe bob's widget site" when "jim bob's widget site" just replaces it - but still, it really puts the sites that get hit (for little or no apparent reason) in a bind.

aok88




msg:3521787
 5:59 pm on Dec 6, 2007 (gmt 0)

lorien1973,

Did you do anything to the site that got whacked? If it has a lot of topics, is there a way for you to make it more consistent and then see if that fixes it?

Anyone else think this could be a topical thing?

lorien1973




msg:3521803
 6:09 pm on Dec 6, 2007 (gmt 0)


Did you do anything to the site that got whacked?

Before or after?

Before it got whacked, we completed a redesign of the site. I do not believe that to be the cause, though, as the content of the site stayed the same.

Afterwards, I found a lot of my content had been stolen. We've since replaced that. I've "de-optimized" pages that may appear to be over-optimized. I've also reduced the sitemap to show only ~150 links per page across 5 sitemap pages, instead of 4 sitemap pages showing ~2000 links each - and I've changed what is being linked to (no products, only major categories).


If it has a lot of topics, is there a way for you to make it more consistent and then see if that fixes it?

It's hard to explain without getting another post deleted; but really, no, there isn't - not without opening up at least 5 more websites. The niche is pretty broad and I'm trying to expand the product lines in every department, so it's only going to get bigger, not smaller. I don't feel Google should punish me for carrying a wide variety of items.

It'd be different if I were an article directory with articles on drugs and knitting. I'd get it. But it's not that disparate.

I can prove, though, that Google is placing -way- too much importance on domain names. So it may not even have anything to do with your site; the domain name relevance is just ramped up so high that you were pushed down because of it.

europeforvisitors




msg:3521861
 7:30 pm on Dec 6, 2007 (gmt 0)

Google's own webmaster guidelines say a site's navigation should be easy. Repeated keywords in a site's menu do not confuse visitors or cause navigation problems. Yet Google thinks they do; that's why the site gets whacked with a 950.

I suspect that, if keyword repetition is leading to 950 penalties, it's doing so in combination with other factors or when it's carried to a ridiculous extreme. (By "ridiculous extreme," I mean something like a column of 50 links about different types of widgets, each with the word "widgets" in the anchor text.) In any case, any rules about keyword repetition are likely to be based on good and bad examples, so webmasters who might be erring on the "bad example" side may want to look at their navigation schemes objectively and think about whether their links would pass a reasonable observer's sniff test.
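
If you want to apply that sniff test mechanically, here's a rough, purely illustrative self-check - the function, the sample nav, and any threshold you might pick are my own assumptions, not anything Google has published:

----------

<?php
// Illustrative self-check only: what fraction of a page's nav anchors repeat
// the same keyword? The 50-link "widgets" column described above would score
// 1.0. Whatever threshold (if any) Google actually uses is unknown.
function anchor_repetition_ratio(array $anchorTexts, string $keyword): float {
    if (count($anchorTexts) === 0) {
        return 0.0;
    }
    $hits = 0;
    foreach ($anchorTexts as $anchor) {
        if (stripos($anchor, $keyword) !== false) {
            $hits++;
        }
    }
    return $hits / count($anchorTexts);
}

$nav = ['Big Widgets', 'Small Widgets', 'Long Widgets', 'Contact Us', 'About'];
echo anchor_repetition_ratio($nav, 'widgets'); // 0.6

----------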

ALbino




msg:3521875
 7:44 pm on Dec 6, 2007 (gmt 0)

Let's say, for a minute, that it's keyword repetition that triggers it. In my case I have a page about widgets (first tier) and subpages for Big Widget, Small Widget, Long Widget, Tall Widget (second tier). Each of those pages has ~20 items with links to a third tier of item-specific pages. The page that is -950'd is the second-tier page, which reads like:

----------

Small Widget with blue thingamabobs from SomeCompany.

Small Widget with soft thingamajigs from SomeCompany.

Small Widget with purple whozits from SomebodyElse.

Small Widget with reflective whozits from SomebodyElse.

----------

The "phrase" that is -950'd is the second-tier phrase "Small Widget"/"Big Widget"/"Long Widget"/"Tall Widget". The individual pages for each of those items isn't penalized, only the overview "Small Widget" page.

To remove "Small Widget" from each item description would be to remove it from the item-specific third-tier page as well. Or, I suppose I could make static pages that just said "With blue thingamabobs from SomeCompany." but I think that would be contextually confusing for the user and would read poorly.

Anyway, that's the problem on my site overall. Is this mirrored by anybody else? Would removing "Small Widget" from every item description be helpful?

SEOPTI




msg:3522891
 2:41 am on Dec 8, 2007 (gmt 0)

ALbino, you could remove 'small widget' easily by using str_replace (php), or you could just list fewer items per URL, 10 or 15 instead of 20.
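
For what it's worth, a quick sketch of that str_replace idea - the phrase and the item descriptions are made-up examples borrowed from earlier in this thread:

----------

<?php
// Sketch of the str_replace suggestion: strip the repeated second-tier phrase
// out of each item description before echoing it into the page.
$phrase = 'Small Widget';
$descriptions = [
    'Small Widget with blue thingamabobs from SomeCompany.',
    'Small Widget with soft thingamajigs from SomeCompany.',
    'Small Widget with purple whozits from SomebodyElse.',
];
foreach ($descriptions as $description) {
    // trim() cleans up the leading space left behind by the removal
    echo trim(str_replace($phrase, '', $description)) . "\n";
}
// Output:
//   with blue thingamabobs from SomeCompany.
//   with soft thingamajigs from SomeCompany.
//   with purple whozits from SomebodyElse.

----------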

I have reduced the number of items on all of my sites (still waiting for results). Disadvantage: your pagination becomes larger in this case.

This -950 critter just loves URLs with a lot of links. It eats the links and punishes the URLs, so I think fewer links per page are a better solution; it also means less text per page, which might reduce co-occurrence.

ALbino




msg:3522930
 4:44 am on Dec 8, 2007 (gmt 0)

My site is -950'd for pages that only have 1 item, so I just don't think it's keyword repetition. For example one -950'd second-tier page looks like this:

Title: "My Site Name - Small Widget"

Main Body Text:

"Small Widget

Browse | S | Small Widget

About Widget:
Small Widget is made up of blah, blah, blah, blah

Items:
Small Widget with brown whozawhatzits from ThisCompany."

And that's it. I mean, there's a navbar at the bottom and some text info boxes, but nothing that in any way relates to the -950'd phrase "Small Widget" -- they're just site information. So is repeating a two-word combo 4 times on the page and 1 time in the page title really strong enough to trigger a -950 penalty? I just can't imagine keyword repetition is the cause... at least in this case.

[edited by: ALbino at 4:45 am (utc) on Dec. 8, 2007]
