Google's 950 Penalty - Part 7
Marcia
msg:3310738 - 10:18 pm on Apr 13, 2007 (gmt 0)

< continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

trinorthlighting
Annej,
I read in another thread that you wrote that you have a recip links page. That is probably what is causing your site some grief.

No, it certainly is not. annej's SITE is not having any grief whatsoever. There are simply some individual PAGES that are not ranking for the chosen keywords.

In addition, having reciprocal links (or a recip links page, or even a whole directory with links) is NOT what causes this phenomenon. There are sites with reciprocal link pages and even directories with a percentage of recips that are untouched and have top-notch rankings. And that is a verifiable fact.

Remember, the algo is completely automated with very little human input. You probably need to take a long hard look at who you're linking to and whether they are spamming.

This has nothing whatsoever to do with OBLs and nothing whatsoever to do with link spam.

Remember, Google guidelines state not to have your site link to bad neighborhoods. If one of the sites you are linking to is spamming Google, it can have a drastic effect on your site. Check to see if all the sites you link to are following Google guidelines. If they are not, you might want to drop that particular link.

Linking out to ONE? Did I read that right and/or interpret that correctly? Or am I seeing things? Where in the world did that theory come from?

If a site is SPAMMING via a pattern of linking out to bad neighborhoods, it'll cause a problem for the SITE - not for individual content pages that are simply not ranking. That is not what's happening here, not by any means.

I don't know how many times it has to be repeated: please don't accuse anyone hit by this phenomenon of somehow spamming. There's no basis for it in reality, and it causes unnecessary, unjustified stress. Trying to help is always appreciated, but this is serious; it's no place for folks to be tilting at windmills.

[edited by: tedster at 9:16 pm (utc) on Feb. 27, 2008]

 

foxtunes
msg:3313049 - 12:43 am on Apr 17, 2007 (gmt 0)

Trakkerguy - Was hit on April 10th. Still ranking for a few obscure long tails but 90 percent of Google traffic wiped out.

glengara
msg:3313285 - 10:06 am on Apr 17, 2007 (gmt 0)

Latecomer to the thread... 950 is quite a hit for it not to be linkage related; OTOH, keeping it within the top 1000 would seem to indicate it's not your traditional link-scheme penalty....

Marcia
msg:3313291 - 10:21 am on Apr 17, 2007 (gmt 0)

>>quite a hit for it not to be linkage related,

There are tons and tons of eBay, Amazon, MSN Shopping, Bizrate, etc. pages down there (as well as some normal ecom sites) that have clusters of pages in the 900s, 400-500s, 700s - all from the same site, clusters of pages with the same (or very close) phrase targeting. Some of those sites have page-one rankings, in some cases #1 rankings, for the same keyphrase. It isn't links in those cases.

There's only so much storage they're going to use for a site, and for all practical purposes it looks like redundant pages are being demoted to get them out of the way, yet still kept in the index.

I have seen some cases that seem to be link related, one in particular that's been link spamming since before Florida. I've always watched and wondered when they'd get hit - they're the very last site listed in the main index for the primary keyphrase.

IMHO it's not all for the same reason, but there's an elusive "something" hitting some (many) sites that's different, and we haven't quite got our heads wrapped around it yet. There's a LOT in those patents that seems to apply in many cases and explain a great deal, but we're still skirting around the edges.

>>not your traditional links scheme penalty....

It's very easy with Google to assume most everything is due to linking, but I really do think semantic, phrase-based and co-occurrence factors are coming into play, and that's something the individual webmaster really doesn't have control over, or at best it would be very little, because some of the factors are derived from figures that involve the whole index.

[edited by: Marcia at 10:32 am (utc) on April 17, 2007]

glengara
msg:3313306 - 10:34 am on Apr 17, 2007 (gmt 0)

*..it's not all for the same reason..*

You could well be right, I've always suspected they added other factors to the mix to help mask their algo tweaks.

Miamacs
msg:3313351 - 11:47 am on Apr 17, 2007 (gmt 0)

Okay so before we get into another round, why don't we simply conclude that... there is no "-950 penalty".

Before this term becomes another "sandbox", where a common, similar symptom was assumed to be a single filter. No wonder it took a year and a half (!) for its different possible causes to become widespread knowledge.

There's clearly no SINGLE problem behind the phenomenon of seeing a site go -150, -350, -950. This is just a new kind of penalty box / warning that is applied. It just doesn't give away WHAT it is applied for.

It'd make sense ( to confuse us all ), and again everyone is right. Let's try this on for a change:

- There's a phrase-based penalty/filter/reranking that sends URLs (pages) -"950" for queries
- There may be a new, stricter linking-profile check that sends SITES (domains) -XYZ for everything

We should've renamed the thread or started two different threads so as not to confuse the two. We've been reaching the same conclusions for the third time this month, from thematic trustrank, to too many colliding themes, to anchor text being too generic for a niche site. ( Oh wait, did I mention that before? )

...

I've set myself to read-only mode because I didn't want to anger those who aren't interested in the phrase-based stuff. I'm such a sensitive person.

lasko
msg:3313370 - 12:14 pm on Apr 17, 2007 (gmt 0)


I too was hit with some kind of penalty, which has never happened to me in the 5 years the site's been live.

I think a lot of it has to do with how external links within web pages are treated and the use of rel="nofollow".

I have a block of pages that did well for long keyphrases, and they have disappeared.

At first I decided it was overuse of keyphrases within internal links, but after toning it down a bit there's still not much change.

Then I noticed I had a link to a certain similar web site ("not duplicated") on all these pages, which I've now removed.

My home page and others still rank highly in the top 10 results for some phrases, so it's not a complete penalty.

Perhaps it's just a penalty against the keyphrases on the pages that received it - that would make much more logical sense.

We'll just have to wait and see if anything improves.

LineOfSight
msg:3313399 - 12:40 pm on Apr 17, 2007 (gmt 0)

clusters of pages with the same (or very close) phrase targeting

Marcia - I know you've mentioned the phrase-based theory in numerous posts, but can you clarify just how you see this working? Thanks.

[edited by: LineOfSight at 12:41 pm (utc) on April 17, 2007]

annej
msg:3313529 - 2:52 pm on Apr 17, 2007 (gmt 0)

I'm still trying to sort out how authority and hub scores relate to this 950 thing.

I found the Google Guy post in this thread. [webmasterworld.com...]

I also found this on the topic. [cs.cornell.edu...]

I'm mostly referring to the phrase-based filter, not other possible causes. I really do think it is at the base of the penalty because it works page by page. That could be why many sites that were never touched before now have pages dropping to the end of the results. Before, our overall site strength protected us, but no more.

In my case my site is strong on both inbound and outbound links, including some from edu and gov sites. BUT many of my individual pages are weak on inbound links, because other sites generally link to one of the two or three top pages on the site.

For example I had a subsection in which all pages dropped to the 950 region. A single good inbound link to the contents page of the section brought the whole section back.

So I think we are looking at a situation where the page's strength (authority, hub or both) can give some protection from the phrase-based problem. Since it is sometimes almost impossible to sort out which words or phrases are causing the problem, strengthening the page with even one or two good links might make the difference.

I'm wondering if one or two strong outgoing links might help as well but I haven't tried that.
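
For anyone who hasn't waded through the Cornell paper linked above, the hub/authority idea is easy to sketch. Below is a toy version of Kleinberg's published HITS iteration - not anything Google has confirmed using - and the little link graph (page names and edges) is invented purely for illustration.

```python
# Toy hub/authority (HITS) iteration over a tiny, made-up link graph.
from math import sqrt

links = {                                   # page -> pages it links out to
    "home":         ["section", "page-a", "page-b"],
    "section":      ["page-a", "page-b"],
    "page-a":       ["home"],
    "page-b":       [],
    "external-edu": ["section"],            # the single "good" inbound link
}

pages = set(links) | {p for outs in links.values() for p in outs}
hub  = {p: 1.0 for p in pages}
auth = {p: 1.0 for p in pages}

for _ in range(50):                         # iterate until scores settle
    # authority: sum of the hub scores of pages linking TO you
    auth = {p: sum(hub[q] for q in pages if p in links.get(q, ())) for p in pages}
    # hub: sum of the authority scores of pages you link OUT to
    hub = {p: sum(auth[q] for q in links.get(p, ())) for p in pages}
    # normalise so the numbers stay comparable between iterations
    a_norm = sqrt(sum(v * v for v in auth.values())) or 1.0
    h_norm = sqrt(sum(v * v for v in hub.values())) or 1.0
    auth = {p: v / a_norm for p, v in auth.items()}
    hub  = {p: v / h_norm for p, v in hub.items()}

for p in sorted(pages, key=auth.get, reverse=True):
    print(f"{p:14} authority={auth[p]:.3f}  hub={hub[p]:.3f}")
```

The point of the toy: the single trusted edge into "section" lifts that page's authority score, and every page "section" links to benefits on the next pass - roughly the "one good inbound link brought the whole section back" effect described above.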

jk3210
msg:3313641 - 5:05 pm on Apr 17, 2007 (gmt 0)

With the Sandbox (which I'm proud to say I never believed in for one second), it eventually became clear what Google was trying to combat, and how they were attempting to accomplish it. With the -950, we still don't see what their goal is --what they're trying to combat.

As of now, there is no pattern of occurrences that can be connected up in a meaningful way that makes any sense. Sure, we know WHAT occurs, but the occurrences still appear to be unconnectable. Consider...

Some sites lose one directory...
Some sites lose 90% of everything...
Some sites lose 90%, but keep the big-money terms...
Some sites lose the big-money terms, but keep the long-tails...
Some sites recover by reducing phrase density...
Some sites recover by reducing internal links...
Some sites recover by reducing phrase density within internal links...
Some sites recover with new inbounds...
Some sites don't recover with new inbounds...
Some sites recover with no changes at all...
Some sites cycle in and out of recovery...
Some sites never recover at all...
Some sites never get hit at all...

Where's the pattern in all that?

annej says...
<<A single good inbound link to the contents page of the section brought the whole section back>>

Notice that word "good."

I've been *leaning toward* the idea that this -950 is being caused by a change in how trustrank values are distributed within a site via internal linkage, because that would seem to be a broad and reasonable goal that Google would want to pursue, i.e. how much trust should be placed on an internal link to a page that either has no external links (no "second opinions"), or only has untrusted external links.

And the word "good" in annej's post fits right in to prior observances that some pages recover with new externals and some don't. The difference would be the level of trust a new external link brings. It fits the pattern.

At some point, Google definitely made the transition from a "links, links, links" approach, to a "link trust, link trust, link trust" approach. But the flaw in that concept was that trusted authority sites appeared to have too much ranking power for terms they really had no business ranking for. Maybe Google is just now getting around to fixing that flaw and the effect of that fix is now working its way across the net.

Just an idea to consider.
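
Purely to illustrate the hypothesis above (Google has published nothing saying it works this way), here is a rough sketch of trust being seeded by external links and then diluted as it flows through internal links. Every page name and number is invented.

```python
# Hypothetical sketch only: trust seeded by trusted external links
# ("second opinions") and diluted as it propagates through internal links.

internal_links = {                  # page -> internal pages it links to
    "home":      ["widgets", "blog", "deep-page"],
    "widgets":   ["deep-page"],
    "blog":      [],
    "deep-page": [],
}

# Trust earned directly from trusted external links; a page with no
# trusted externals starts at zero and must live off inherited trust.
external_trust = {"home": 1.0, "widgets": 0.4, "blog": 0.0, "deep-page": 0.0}

DAMPING = 0.5                       # arbitrary: internal links pass on half

trust = dict(external_trust)
for _ in range(20):                 # propagate until roughly stable
    passed = {p: 0.0 for p in internal_links}
    for page, outs in internal_links.items():
        if outs:
            share = trust[page] * DAMPING / len(outs)
            for target in outs:
                passed[target] += share
    trust = {p: external_trust[p] + passed[p] for p in internal_links}

for page, score in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{page:10} trust={score:.3f}")
```

Under these made-up numbers, "blog" and "deep-page" survive only on diluted internal trust, so any extra scepticism about pages with no trusted external "second opinions" would hit them first - which is roughly the pattern being suggested.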

trinorthlighting
msg:3313645 - 5:17 pm on Apr 17, 2007 (gmt 0)

Thanks jk. Everyone is assuming this is about a very old phrase-based patent (which is good reading, and which Google might have tweaked) that was implemented years ago.

This is about inbound/outbound links, the downgrading of exchanges, and also the nofollow that was recently placed on Wikipedia pages.

jk3210
msg:3313695 - 5:52 pm on Apr 17, 2007 (gmt 0)

But remember, a new way of determining how to assign a trustrank value to an internal link to a page with no other indicators available (like trusted external links) might only be their *goal* -- how they accomplish that goal is something completely different. It could, and probably would, incorporate other inputs such as phrase-based ranking analysis.

And all that is based on the vague assumption that distributing trustrank values via internal links is even their goal in the first place - which I'm far from sure of.

Biggus_D
msg:3313713 - 6:07 pm on Apr 17, 2007 (gmt 0)

What I do not get is this "sequence":

1. I get a "penalty" and lose a certain %
2. I keep adding content
3. The old natural growth due to the new content just stops

Shouldn't I be seeing some signs of recovery due to the new content?

It's not like I have a website that I update only from time to time; this one is updated daily.

[edited by: Biggus_D at 6:08 pm (utc) on April 17, 2007]

LineOfSight
msg:3313718 - 6:12 pm on Apr 17, 2007 (gmt 0)

A single good inbound link

In trying to ascertain the value of that inbound link, and Google's interpretation of purchased links: how did you get this link? Was it purchased, a 3-way, etc.?

trakkerguy
msg:3313725 - 6:19 pm on Apr 17, 2007 (gmt 0)

Seems we all keep repeating ourselves and never making any progress...

I think the difference between site-wide and page-specific is how the external links come in. I've seen a few sites, but the smallest, easiest example had almost all the incoming links pointing to the main page, for the main keyword. If that keyword/main page is hit, all the subpages dependent on it are hit too, unless very specific terms are searched. So in the beginning it acted "site-wide", but wasn't really.

The one page with good external backlinks came out of penalty long before the main page and all its dependents.

Too many and too varied keywords/phrases on a page, not properly supported by external backlinks, is what I see. And once hit, that page doesn't pass on trust to other pages.

[edited by: trakkerguy at 6:33 pm (utc) on April 17, 2007]

trakkerguy
msg:3313728 - 6:22 pm on Apr 17, 2007 (gmt 0)

foxtunes - Yes, same date the site I'm working on took a big step downwards. I was wondering if a recent paragraph that added a keyphrase back in was the problem, but probably not...

trakkerguy
msg:3313745 - 6:36 pm on Apr 17, 2007 (gmt 0)

Biggus_D

I really doubt that adding new content would help. In fact, it makes things worse if you "expand" the scope.

Decreasing the scope or the claims the page makes, by using fewer of the competitive keyphrases, and/or getting more "good links", is what works.

LineOfSight
msg:3313767 - 6:55 pm on Apr 17, 2007 (gmt 0)

trakkerguy

Decreasing the scope

Sorry if this is a basic question, but when you say decreasing the scope, are you saying that a page needs to cover just one subject/keyword/phrase? And if you have a page that already does that, should you reduce the optimisation level on that page, thereby taking a more natural approach to the language as viewed by Google?

trakkerguy
msg:3313840 - 7:53 pm on Apr 17, 2007 (gmt 0)

Just my opinion of course, but fewer keyphrases is safer. You might be OK for "blue widgets", "green widgets", and "red widgets".

But add "hairy widgets", "free widgets", "best free widgets", ""online widgets", and you may have problem if your incoming links don't support that you are an authority that should rank for that expanded scope.

The more competitive the term, the better your links need to be to validate that your site should rank. And if you "claim" to be knowledgeable for all the highly searched "#*$!x widget" phrases, you need better links to support that broader claim of authority about widgets in general.

I'm talking about having the terms in the anchor text. I'm really not sure, though, whether just having the terms on the page causes the problem.
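
A crude way to picture that "claimed scope vs. link support" idea: compare the phrases a page targets against the phrases its inbound anchor text actually supports, and flag the unsupported ones. This is only an illustration of the argument - the phrases, the anchors, and the threshold are all invented, and there's no confirmation Google scores pages this way.

```python
# Illustration only: phrases a page "claims" vs. phrases its inbound
# anchor text supports. All data and the threshold are invented.

from collections import Counter

targeted_phrases = [                 # phrases the page targets on-page
    "blue widgets", "green widgets", "red widgets",
    "free widgets", "best free widgets", "online widgets",
]

inbound_anchors = [                  # anchor text of inbound links (hypothetical)
    "green widgets", "green widgets", "blue widgets",
    "widget reviews", "green widgets",
]

support = Counter(inbound_anchors)
MIN_SUPPORT = 1                      # arbitrary: at least one supporting anchor

unsupported = [p for p in targeted_phrases if support[p] < MIN_SUPPORT]

print("Phrases claimed but not backed by inbound anchor text:")
for phrase in unsupported:
    print("  -", phrase)
```

The more of those unsupported "claims" a page piles up, the bigger the gap between its scope and its demonstrated authority - which is the risk being described above.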

Marcia
msg:3313860 - 8:07 pm on Apr 17, 2007 (gmt 0)

A third word added to a two-word phrase that's OK on its own (with just the two words) and having no problems can cause a problem (with the three-word phrase) if that third word isn't one that's usually used in connection with the two-word phrase.
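
Put in back-of-the-envelope numbers: if the third word rarely co-occurs with the two-word phrase across the corpus, the three-word phrase looks "off-theme". The document counts below are invented, and this is only a guess at the kind of statistic a phrase-based system might use.

```python
# Back-of-the-envelope co-occurrence check for a two-word phrase plus a
# candidate third word. All document counts are invented for illustration.

TOTAL_DOCS = 1_000_000

doc_counts = {
    "green widgets": 12_000,               # docs containing the two-word phrase
    "big": 90_000,                         # docs containing each third word
    "bowling": 40_000,
    ("green widgets", "big"): 2_500,       # docs containing both
    ("green widgets", "bowling"): 30,
}

def association(phrase, word):
    """Observed co-occurrence vs. what pure chance (independence) predicts."""
    expected = doc_counts[phrase] * doc_counts[word] / TOTAL_DOCS
    return doc_counts[(phrase, word)] / expected

for word in ("big", "bowling"):
    score = association("green widgets", word)
    verdict = "usually goes together" if score > 1.0 else "rarely goes together"
    print(f'"green widgets" + "{word}": {score:.2f} -> {verdict}')
```

With these made-up counts, "big green widgets" scores well above chance while "green widgets bowling" scores far below it - the kind of gap that could push a page toward a theme it isn't qualified for.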

Marcia
msg:3313870 - 8:21 pm on Apr 17, 2007 (gmt 0)

trinorthlighting
This is about inbound/outbound links, the downgrading of exchanges, and also the nofollow that was recently placed on Wikipedia pages.

No, again - this 950 thing is not. That's contrary to statements already made, but hey - if you know better than Adam Lasnik about recips, more power to you.

While linking is, and has long been, a factor in other respects, I don't believe it has anything to do with this particular 950 phenomenon we're trying to home in on. There's been no evidence of the 950 drop being related, despite your observation, which has been effectively countered with logic by more than one person.

But if you insist, then explain for us how linking factors, recips and Wikipedia links/nofollow can affect only specific interior pages and, more importantly, certain specific phrases that relate to and appear on specific interior pages - not the whole site at all.

If you insist, tell us how that works. We can tell you how phrase-based indexing can affect individual phrases on individual pages; now you explain for us how links and recips can do the same.

Go for it, we're all ears!

[edited by: Marcia at 8:30 pm (utc) on April 17, 2007]

LineOfSight
msg:3313882 - 8:25 pm on Apr 17, 2007 (gmt 0)

Very interesting theory. I too was hit on the 10th, and pretty much 100% of the site dropped to mid -750 to -950 from page 1 positions. Going to try de-optimisation with some quality link building...

[edited by: LineOfSight at 8:25 pm (utc) on April 17, 2007]

trakkerguy
msg:3313895 - 8:35 pm on Apr 17, 2007 (gmt 0)

Marcia - are you saying that if you're ok for "green widgets", then

"big green widgets" might be ok as it is less scope,

but that "green widget bowling" or "download green widgets" might be trouble if they change it to a different theme that your page is not qualified for?

That's kind of how I see it...

[edited by: trakkerguy at 8:37 pm (utc) on April 17, 2007]

trinorthlighting
msg:3313896 - 8:35 pm on Apr 17, 2007 (gmt 0)

Marcia,

Why do you think quite a few people have come out of this by getting some "good" external links?

You keep saying phrase-based. What advice are you giving? Telling people to sit and wait?

I say go find a "good" and trustworthy link to your -950 page and bam, you will be out of the penalty.

Trust me on this: a member here had a page and emailed me, and I linked to his site from a very trusted site of ours that has always ranked top 10. It was a very relevant topic for our site and I am sure it will add value for our customers. Guess what - his page is no longer -950. As a matter of fact, his page is number 12 now. Our page has not budged an inch. It took two days for the page to come out of the -950.

[edited by: trinorthlighting at 8:59 pm (utc) on April 17, 2007]

trakkerguy
msg:3313898 - 8:41 pm on Apr 17, 2007 (gmt 0)

tnl - I totally agree that strong, on-topic links can qualify a page to be an "authority" on a theme/phrase, pull it out of the -950, and SOLVE the problem.

But you've made statements implying that recip links, or bad links of some kind, can CAUSE the problem, which is something very different.

foxtunes
msg:3313902 - 8:42 pm on Apr 17, 2007 (gmt 0)

I know of a few others hit on the 10th, one blazed back today without touching anything in the week it was out.

I see one spammy looking site in the top five with the keyword phrase twice in the title.

Some informational, tightly focused authority sites are the collateral damage of this new algo tweak.

trinorthlighting
msg:3313907 - 8:55 pm on Apr 17, 2007 (gmt 0)

Outbound links to other sites can have a negative effect on your own site. Everyone here is focused on inbounds and "phrase-based", but it's the outbound half of a recip link that can kill you.

Let's say you (Honest Abe) have a mom-and-pop site with link exchanges to "Slim Shady Site." Well, "Slim Shady" has just recently started a bunch of doorway pages, is sending out massive amounts of email spam, and is doing sneaky redirects. Google flags the site as spam and looks at who links to "Slim Shady Site." Now Googlebot has no idea that your site is run by "Honest Abe." Sure, Googlebot can look at the whois and see "Honest Abe" and "Slim Shady" and even the different IP addresses, but spammers figured out long ago that they can register many sites on many IPs and even under different names, so Googlebot really cannot tell the difference. So now, since "Slim Shady" is spamming, Googlebot flags "Honest Abe's" website as a possible link scheme. Bam, the penalty hits - fully automated and very stealthy.

Look at your outbound links. Look for hidden scripts linking your site to bad neighborhoods as well. Look for hackers who might have hit your site and dropped some hidden text links.

Right from Google Guidelines here:

In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.

It's all right there in front of you, written in very plain English. Those mom-and-pop link exchanges can kill a site; put nofollows on them so Google knows you don't endorse the site and aren't participating in link schemes. Be very careful who you link out to.
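
Whatever you make of the theory, auditing your own outbound links is cheap. Here's a rough sketch of the kind of check being suggested: list every external link on a page and flag any that point at domains on your own "bad neighborhood" list or that lack rel="nofollow". The page HTML, the bad-domain list, and the site's own domain are placeholders, only the Python standard library is used, and whether nofollow actually protects anything is exactly what's being debated in this thread.

```python
# Rough outbound-link audit in the spirit of the advice above.
from html.parser import HTMLParser
from urllib.parse import urlparse

MY_DOMAIN = "example.com"
BAD_NEIGHBORHOODS = {"slim-shady-site.example", "doorway-farm.example"}

class OutboundLinkAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc.lower()
        if not host or host.endswith(MY_DOMAIN):
            return                                   # internal link, ignore
        flags = []
        if host in BAD_NEIGHBORHOODS:
            flags.append("points at a listed bad neighborhood")
        if "nofollow" not in (attrs.get("rel") or ""):
            flags.append('missing rel="nofollow"')
        if flags:
            self.findings.append((href, flags))

page_html = """
<p><a href="http://slim-shady-site.example/widgets">link partner</a>
   <a rel="nofollow" href="http://some-directory.example/">directory</a>
   <a href="/about">about us</a></p>
"""

auditor = OutboundLinkAuditor()
auditor.feed(page_html)
for href, flags in auditor.findings:
    print(href, "->", "; ".join(flags))
```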

LineOfSight
msg:3313914 - 9:03 pm on Apr 17, 2007 (gmt 0)

trinorthlighting

is sending out massive amounts of email spam

Can this have an effect? Two of my domains that have suffered have both been used in email spamming recently - I've received loads of returned email but just ignored it.

trinorthlighting
msg:3313916 - 9:04 pm on Apr 17, 2007 (gmt 0)

Yes, think about it. Gmail! If your site is sending out massive amounts of email spam to Gmail accounts and people are flagging the messages as spam, I am more than sure Google looks at that. It's another method of determining spam.

tflight
msg:3313918 - 9:05 pm on Apr 17, 2007 (gmt 0)

One of the symptoms some people have described is large sections of their site going in and out of this penalty several times over the past several months.

I haven't heard many theories about what causes this penalty that make sense when you see large sections of a site go to the last 50 results, then back to the top 10, then back to the last 50, then back to the top 10... over and over again.

I too once thought that a few good links to an internal page that was pushed to the bottom of the results could get that page to come back. And I even saw it "work" once. But then a week or so later it was back at the bottom. In the end I believe it was pure coincidence: what I thought I had done to bring the site back turned out to be nothing, since within a few days it went back to where it was.

Some people have only been "cycled" down to the bottom once or twice.... others have been up and down with this phenomenon over a dozen times.

LineOfSight
msg:3313930 - 9:14 pm on Apr 17, 2007 (gmt 0)

But I'd reckon that pretty much 100% of the time, domains are spoofed in email spam - surely even Google can't use that as a factor for flagging a domain as "bad". It's something we have absolutely no control over... Interesting point, though, and I don't want to go off topic here :)

tflight
msg:3313931 - 9:14 pm on Apr 17, 2007 (gmt 0)

Two of my domains that have suffered have both been used in email spamming recently - I've received loads of returned email but just ignored it

If your server has been compromised and they are sending the messages through your server, then maybe (and that is a big maybe) that could be used as some part of a score.

But with it being so ridiculously easy to forge email headers, it isn't reasonable to think that a search engine would apply any negative rank to a site based on the "from" address of spam email. It would just be way too easy to abuse in order to take out competitors.
