Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 194 message thread spans 7 pages; this is page 6 of 7.
Google's 950 Penalty - Part 7
Marcia - msg:3310738 - 10:18 pm on Apr 13, 2007 (gmt 0)

< continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

trinorthlighting
Annej,
In another thread you wrote that you have a recip links page. That is probably what is causing your site some grief.

No, it certainly is not. annej's SITE is not having any grief whatsoever. There are simply some individual PAGES that are not ranking for the chosen keywords.

In addition, having reciprocal links (or a recip links page, or even a whole directory with links) is NOT what causes this phenomenon. There are sites with reciprocal link pages and even directories with a percentage of recips that are untouched and have top-notch rankings. And that is a verifiable fact.

Remember, the algo is completely automated with very little human input. You probably need to take a long hard look at who you're linking to and whether they are spamming.

This has nothing whatsoever to do with OBLs and nothing whatsoever to do with link spam.

Remember, Google guidelines state not to have your site link to bad neighborhoods. If one of the sites you are linking to is spamming Google, it can have a drastic effect on your site. Check to see if all the sites you link to are following Google guidelines. If they are not, you might want to drop that particular link.

Linking out to ONE? Did I read that right and/or interpret that correctly? Or am I seeing things? Where in the world did that theory come from?

If a site is SPAMMING by a pattern of linking out to bad neighborhoods, it'll cause a problem with the SITE - not individual content pages that are simply not ranking. This is not the case, not by any means.

I don't know how many times it has to be repeated: please don't accuse anyone hit by this phenomenon of somehow spamming, because there's no basis in reality for it, and it causes unnecessary, unjustified stress. Trying to help is always appreciated, but this is serious; it's no place for folks to be tilting at windmills.

[edited by: tedster at 9:16 pm (utc) on Feb. 27, 2008]

 

zeus - msg:3317340 - 12:00 am on Apr 21, 2007 (gmt 0)

As many have said here, we see sites with H1s, heavy link trades, lots of keywords and much more at the top, and this is what we've said could be wrong with the sites in the 950 filter. But how about this:

You're an AdSense member with AdSense on all your sites. You have several sites because you need a safety net, so that if Google plays around, at least one or two sites still rank and you can pay your bills. But Google uses AdSense to see which sites you own, and if you own more than one in the same category, only one of those sites ranks with all its pages - and it doesn't matter if one is about all widgets and the other is specifically about blue widgets.

Marcia - msg:3317454 - 5:44 am on Apr 21, 2007 (gmt 0)

Problem is, it's not that long ago that reciprocal/exchanged links were being lumped together with bought links by both Google and MC.

IMO there's a difference between reciprocal and exchanged links; the former can be accidental, the latter not.


Absolutely in agreement. There's a big difference between mass production and selectivity. Distinguishing and defining differences is an important thing to do - and a lot depends on individuality, editorial judgment, theme relevancy, quantity, and rate of accrual over time.

There is a BIG difference between a link exchange scheme and high quality, appropriate reciprocal links, done selectively and editorially chosen, between like-minded quality sites in the same/similar niche catering to the same demographic audience.

Would Adam endorse exchanged links?

glengara, I'm not a mind-reader and can't speak for Adam, but I can't see anyone from any search engine endorsing a mass-produced, indiscriminate link exchange program/scheme. But I believe the post in the Google group (previously quoted here) adequately addressed the issue of Google's attitude toward the traditional practice of legitimate reciprocal links, as traditionally done, in proper balance.

We can each choose for ourselves whether to listen to and trust folks who have been around since link-gaming started, and what's publicly stated by someone designated and trusted as an official spokesman - or some person(s) stating contrary opinions and handing out misinformed, irrelevant misconceptions.

LineOfSight - msg:3317502 - 8:03 am on Apr 21, 2007 (gmt 0)

before I post what I did I would like to look at a few affected sites and try the theory on those

Good to hear you've had a result, but why the secret squirrel act? Other posters here have published their theories after getting their positions back, in an attempt to find what's causing this.

glengara - msg:3317509 - 8:27 am on Apr 21, 2007 (gmt 0)

You're right M, but you've been reading the Runes for a while now :-)

A lot of people seem to assume reciprocal and exchanged are the same thing, so when they read of a G spokesman broadly endorsing reciprocals....

egomaniac - msg:3317664 - 3:04 pm on Apr 21, 2007 (gmt 0)

Cautious good news here...

My site appears to have stabilized, and has stopped going from top 15 to 965 every few days.

I made some changes 7 days ago, and it hasn't done a 950 on me since. I cleaned up the spamminess of my breadcrumb links, added a few pages, changed a few internal links and consolidated my reciprocal links "directory" into a single short "resources" page of only recip links of quality (plus a few non-recip's).

The biggest change I think that I made was the breadcrumb links.

I'll post here again if I start getting the 950 hit again, but I think I may have gotten myself off the edge of this filter.

LineOfSight - msg:3317672 - 3:39 pm on Apr 21, 2007 (gmt 0)

That's great news - can you just clarify what you mean by cleaning up the spammiest breadcrumb links? Were these just main keywords that pointed back to the previous page, or were they 'stuffed', or what? Without giving too much away, can you give an example using the widget factor? :) Thanks

annej - msg:3317827 - 7:24 pm on Apr 21, 2007 (gmt 0)

I'm not clear on how breadcrumbs could be spammy. Please explain. I'm concerned about setting up my navigation so that it is not considered at all spammy but is still helping my visitors find what they are interested in.

thinks that the same page is spam for certain keywords

I don't mind if Google thinks a page that is just slightly related belongs at 950 (though I really don't understand why they would rank a page like that at all). But it's a serious problem when a page is exactly on the topic the searcher is looking for and they still 950 it.

tedster - msg:3317829 - 7:32 pm on Apr 21, 2007 (gmt 0)

Spammy breadcrumbs -- sounds like a recipe, doesn't it? I think I saw it in the AdWords in my Gmail once.

Widgets Home > Widget Types > Big Widgets > Big Yellow Widgets

That kind of pattern might do it, no? Just as the same labels might hurt in a regular menu.
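One rough way to spot the pattern tedster describes is to count how often each word recurs across the breadcrumb labels. A minimal sketch (the two-occurrence threshold is arbitrary, not anything Google has published):

```python
from collections import Counter

def repeated_words(breadcrumb):
    """Count word occurrences across a breadcrumb trail's labels;
    anything appearing twice or more is a candidate for trimming."""
    counts = Counter(w.lower() for label in breadcrumb for w in label.split())
    return {w: n for w, n in counts.items() if n >= 2}

trail = ["Widgets Home", "Widget Types", "Big Widgets", "Big Yellow Widgets"]
flags = repeated_words(trail)  # "widgets" shows up in three labels, "big" in two
```

On the example trail above this flags "widgets" (3x) and "big" (2x), which is exactly the kind of repetition being discussed.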

Marcia - msg:3317844 - 8:10 pm on Apr 21, 2007 (gmt 0)

>spammy breadcrumbs

Here's how it's done:

[webmasterworld.com...]

I've got a site like that; I never changed it because it did well in other engines, first one and then another. But "demerits" for excessive use of internal anchor text, with repetitions on the same page, reared their head at the time of the Florida update, which is when that site got hit. Another site was corrected (with a modification or two to raise "hub" and topical relevancy) and came back, but the one that kept the spammy navigation never came back at Google.

It's nothing new, I've seen it and have been mentioning it for four years.

We can sometimes forget that with well over 100 factors in the algo, any factor - or combination of factors - can cause a rankings drop. There's little that's new under the sun, but we are seeing something now that's "new and different," and re-hashing the same ole' same ole' that's been going on for years - excesses, link stuff, etc. - only detracts from digging in and figuring out what this strange new thing is.

Marcia - msg:3317866 - 9:00 pm on Apr 21, 2007 (gmt 0)

Here's an illustration:

[google.com...]

That wasn't the first instance either, nor the first "link exchange program." How long ago was that, how many years ago?

egomaniac - msg:3317871 - 9:07 pm on Apr 21, 2007 (gmt 0)

Tedster pretty much nailed it...

Widgets Home > Widget Types > Big Widgets > Big Yellow Widgets

Mine were like this...

Blue Widgets > Blue Widget Programs > Blue Widget Courses > Blue Widget Book

A few were really as bad as that.

I changed it to...

Blue Widgets > Programs > Courses > Book

Having the repeated double keyword seems to be the key offense (as Marcia pointed out).

I have some other subdomains which have had no problems. They do have breadcrumbs, and they do link back to this main domain, but they don't have any phrase repeated in the breadcrumb like the above.

These subdomains are like this...

Blue Widgets > Blue Doodles > Blue Weird Doodles

... and that seems to be OK. I haven't touched those.
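egomaniac's edit - keeping the first crumb and dropping the site-wide theme words from every later crumb - can be sketched mechanically. A hedged illustration only (the one-line stemmer that strips a trailing "s" is a naive stand-in, not a real stemming algorithm):

```python
def simplify_breadcrumbs(labels):
    """Keep the first crumb intact; from later crumbs, drop words that
    repeat the first crumb's theme (singular/plural matched naively)."""
    stem = lambda w: w.lower().rstrip("s")  # naive stemmer, illustration only
    theme = {stem(w) for w in labels[0].split()}
    out = [labels[0]]
    for label in labels[1:]:
        kept = [w for w in label.split() if stem(w) not in theme]
        out.append(" ".join(kept) or label)  # fall back if every word matched
    return out

trail = simplify_breadcrumbs(
    ["Blue Widgets", "Blue Widget Programs", "Blue Widget Courses", "Blue Widget Book"]
)
```

Run on the "before" trail quoted above, this reproduces the "after" trail: Blue Widgets > Programs > Courses > Book.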

tedster - msg:3317902 - 10:30 pm on Apr 21, 2007 (gmt 0)

Blue Widgets > Blue Doodles > Blue Weird Doodles

... and that seems to be OK. I haven't touched those.

That makes sense. I would expect problems to come from over-using the more semantically specific words, and not from repeating the more general ones.

proboscis - msg:3318389 - 8:34 pm on Apr 22, 2007 (gmt 0)

If you're 950'd and you search for your page, do you see it at the end of the results or do you have to click "repeat the search with the omitted results included" in order to see it?

And if you click it, does your page show up back in its original position, or is it at the end of the results?

ALbino - msg:3318474 - 11:05 pm on Apr 22, 2007 (gmt 0)

I believe having to click for omitted results is a duplicate content problem, and not related to -950.

tedster - msg:3318486 - 11:29 pm on Apr 22, 2007 (gmt 0)

Agreed - the affected result shows up near the end of the results without clicking on an "Omitted Results" link. So if there are only 450 results before that link in a particular search, then the affected result might show up at 430 or so. It's near the end of results, and not always at #950.

[edited by: tedster at 12:20 am (utc) on April 23, 2007]

Marcia - msg:3318500 - 11:46 pm on Apr 22, 2007 (gmt 0)

Here's where Andrei Broder first talked about "shingles" (first as far as I know, anyway). US Patent # 6119124

Method for clustering closely resembling data objects [patft.uspto.gov]

He does make mention of duplicates, but there is a difference between documents that are duplicates, near-duplicates, and documents that closely resemble others - within the same site. The "click for more results" link takes off the "clustering" filter, which then shows the pages first omitted or "clustered out" - but the wording mentions documents being similar, which is resembling others.

I have sites with pages that are "similar," but there's nothing duplicate about them. It isn't duplicate content; it's redundancy. I've suspected for a long time (and have mentioned in threads) that what's picked up first - the beginning of the page/code of documents during the process of returning results for queries - is what's examined for similarity. Hence the importance of unique page titles and descriptions, and/or what's first in (at the top of) the code that'll be picked up for snippets. I still believe it, even more so now.

I do believe there's a close correlation between the process for snippet generation and clustering of "similar" documents. According to some papers they're both done at query time and are query dependent.
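For reference, the "shingling" idea Broder described can be sketched in a few lines: break each document into contiguous w-word runs (shingles) and compare the sets. The w=4 window and the bare Jaccard comparison are illustrative simplifications; the patent's actual pipeline works on compact sketches/fingerprints of these sets rather than the full sets:

```python
def shingles(text, w=4):
    """Set of contiguous w-word runs (shingles) in a document."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def resemblance(a, b, w=4):
    """Jaccard resemblance of two documents' shingle sets (after Broder)."""
    sa, sb = shingles(a, w), shingles(b, w)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)
```

Two pages sharing boilerplate openings (titles, descriptions, top-of-code text) will share many shingles even when they are not duplicates, which matches the "similar, not duplicate" distinction above.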

<edited for clarification>

[edited by: Marcia at 12:00 am (utc) on April 23, 2007]

LineOfSight - msg:3318776 - 11:13 am on Apr 23, 2007 (gmt 0)

Marcia

Method for clustering closely resembling data objects

Reading the patent again suggests that these pages are grouped based on the similarities of the 'fingerprint' associated with the clustering. Do you believe that by modifying and changing the pages where similarities have occurred (so they are completely different), the pages can be de-clustered?

Nick0r - msg:3319167 - 6:52 pm on Apr 23, 2007 (gmt 0)

There are two main choices to "bust out" pages from this penalty.

A) To de-optimise. You need to get your pages so that they don't hit the N phrase count needed to throw your pages to -9xx. This is NOT a good fix, as you will find your pages ranking significantly lower than before due to keyword density being thrashed.

B) To use the alternative remedying factors to save your pages from being phrase based re-ranked. There are a few ways a page can be saved from being evaluated by this system, and a lot of it is to do with the themes of your pages, as Miamacs has explained.

About the whole directories being dropped - that is usually to do with one page (on a specific theme) carrying the rest of the directory down. If that one is successfully fixed, the rest should be fine too.

ALbino - msg:3319187 - 7:19 pm on Apr 23, 2007 (gmt 0)

B) To use the alternative remedying factors to save your pages from being phrase based re-ranked. There are a few ways a page can be saved from being evaluated by this system, and a lot of it is to do with the themes of your pages, as Miamacs has explained.

Care to elaborate a bit more on this?

MrStitch - msg:3319306 - 9:13 pm on Apr 23, 2007 (gmt 0)

Whoa... about that structure thing again.

My directories on one of my problem sites aren't nearly as deep. However the structure would be like this -

www.example.com/blue-widgets/blue-widgets.htm

Would something like this easily trip the filter?

annej - msg:3319616 - 5:03 am on Apr 24, 2007 (gmt 0)

I understand what you mean by spammy bread crumbs now. That is something I've learned to watch myself on. If your site is about say, "old widgets", it's easy to write your navigation with a lot of repetition of the phrase simply because that is what the site is about.

LineOfSight - msg:3319653 - 6:21 am on Apr 24, 2007 (gmt 0)

annej

These are two different points that have been discussed though, breadcrumbs and navigation, aren't they?

1. Aren't the basics that have been discussed here that breadcrumbs need to be very concise, following the exact user route without repetitive use of keywords/phrases, and

2. Navigation links can contain keywords that point to pages that talk specifically about the subject relating to those keywords, ensuring that phrase-based and co-occurrence filters are not tripped by the destination pages?

Or have I read too much on this topic now and just confused myself :-)

annej - msg:3319993 - 2:26 pm on Apr 24, 2007 (gmt 0)

My mistake was in setting up my navigation as

widgeting
widgeting antiques
widgeting patterns
etc

I realize it really wasn't necessary to repeat the word widgeting as the visitors know that's what the site is about.

BUT in the case of the phrase-based aspect of the 950 situation, I suspect that some phrases can be damaging whether repeated or not. I think this is another issue.

The problem is that you can never be sure if the changes you make brought the page or pages back or if Google just shifted the filter. But a while ago I took a phrase out of my navigation that occurs a lot in MFAs and it seemed to bring the page back. Luckily there was another phrase that described this topic.

I think in some cases the navigation can very accurately point to a page about a topic and the topic has the same phrase in it, even in the title but you still trigger a phrase based flag. This is why this is all so frustrating.

added - Go back in this thread to read and reread Miamacs message #3311620. It shows the different levels of how complicated this puzzle can be.


LineOfSight - msg:3320158 - 4:06 pm on Apr 24, 2007 (gmt 0)

occurs a lot in MFAs

MFAs?

MrStitch - msg:3320170 - 4:11 pm on Apr 24, 2007 (gmt 0)

MFA = Made For AdSense

I'm sure you've seen those sites. It's nothing more than a simple nav that reloads adsense data. In the end, there is no information.... just adsense links to click on.

LineOfSight - msg:3320171 - 4:13 pm on Apr 24, 2007 (gmt 0)

Got it - thanks :-)

lasko - msg:3320934 - 6:34 am on Apr 25, 2007 (gmt 0)

Morning Everyone.

I've been monitoring this thread since the start and although being virtually white hat I was affected by the sudden drop in rankings for certain pages and phrases.

I've made some adjustments to the site, but nothing seemed to work well enough to make a big difference.

I would say only one page has started to appear in the top 3 pages of Google, but hey, it's a start.

Today I took a fresh look and realised that on some of the pages I had written a php script that was pulling in data from a database and outputting links to internal pages, not external web sites.

At the top of the page I was building anchor links:


Category Keyword Phrase 1 (#Category Keyword Phrase 1)
Category Keyword Phrase 2 (#Category Keyword Phrase 2)
Category Keyword Phrase 3 (#Category Keyword Phrase 3)

Then further down the page relevant content links would appear under each category like this

------------------------------

Category Keyword Phrase 1 -
page A.htm
Page B.htm

Back to top of Page link

------------------------------

Category Keyword Phrase 2 -
page C.htm
Page D.htm

Back to top of Page link

------------------------------


Now this was done 3 years ago and all was fine then, but I'm betting this is where my problem exists.

As for breadcrumbs, I think the same thing could happen, and that's why we're all experiencing different problems and the logic seems a bit cloudy.

So another solution I'm trying: remove the keyphrases from the #anchor links, as it's this specific phrase that has dropped in rankings.

I also checked on competitors; they had similar #anchor keyphrase links, and they've dropped as well.

As I said before, most of the site is white hat - no hidden text, no misleading web pages - but it's about avoiding mistakes these days.

If we find a solution it could be a positive thing for the SEO business, as webmasters will have to be more careful than ever before to build good web sites that avoid being dropped through careless mistakes.
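A minimal sketch of the fix lasko describes: generate generic fragment ids for the in-page jump links instead of repeating the keyphrase in every href fragment. The function name and markup are illustrative, not taken from the original php script:

```python
def build_anchor_nav(labels):
    """Emit in-page jump links with generic fragment ids (#cat-1, #cat-2, ...)
    instead of repeating the keyword phrase in every href fragment."""
    return "\n".join(
        '<a href="#cat-%d">%s</a>' % (i, label)
        for i, label in enumerate(labels, start=1)
    )

nav = build_anchor_nav(["Category Keyword Phrase 1", "Category Keyword Phrase 2"])
```

The visible anchor text can stay descriptive for users; only the fragment identifiers stop echoing the phrase, and the section headings further down would carry matching `id="cat-1"` style attributes.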


Added -

Given what others have experienced and reported we can be grateful or happy that it appears to be a temporary downgrade and that rankings could recover quickly.

Not like a couple of years back when you had to wait 3 months!


zeus - msg:3321094 - 11:25 am on Apr 25, 2007 (gmt 0)

Now with all this, can you remember the good old days when you did not have to make your site for Google, but had free hands to create the site you wanted? This breadcrumb stuff is just pure cr..; you can't even place your company info on every page, because Google says so. I don't use breadcrumbs, but I still think it's cr..

Jessica - msg:3322215 - 8:45 am on Apr 26, 2007 (gmt 0)

Can anyone write a very brief summary of this huge 7 part thread?

What things cause the "950 penalty"?

lasko - msg:3322303 - 11:02 am on Apr 26, 2007 (gmt 0)

What things cause the "950 penalty"?

Unfortunately that's what we're all still trying to work out.

It does look like excessive keyword phrases in links throughout a website, including links to external websites, is the cause - in breadcrumb navigation or not.

Some people mention breadcrumbs as being to blame, but I would say it makes no difference. Google just sees links with keyphrases; perhaps if there are too many, it could be tagged as a method of Google bombing.

Just keep trying to find the answers.

tedster - msg:3322777 - 6:15 pm on Apr 26, 2007 (gmt 0)

There's an interesting report from errorsamac in this thread:
[webmasterworld.com...]

errorsamac:
I'm in the process of testing this out, but I think I have found a flaw in how Google
is counting links and/or preventing Google bombs. If you have a high PR site
(PR7 for example) and you link to a normal, non-authority, PR3 site, you can take
that site out of the SERPs for the particular keyword or phrase. I just tested it on
one of my sites that I don't care about...

tedster:
...did the url you targeted go to "end of results" -- as in what people
call the "-950 penalty"?

errorsamac:
Yes, I just checked and both sites are hit with the -950 penalty....
...as far as I know, there is no known way to recover from this
(other than to get good links to offset the bad links)


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
© Webmaster World 1996-2014 all rights reserved