

Google SEO News and Discussion Forum

This 161 message thread spans 6 pages; this is page 5.
Google's 950 Penalty - Part 11
Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 4:22 am on Jul 23, 2007 (gmt 0)

< continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

Just saw one 950+ and it does my heart good to see it.

User-agent: *
Disallow: /theirlinkpage.htm

No, I'm not saying that's necessarily why, but it would serve them right if it was, for playing dirty like that on a page that's supposed to have reciprocal links with people exchanging fair and square in good faith.
======================================================
Added:

And another 950+, the last site in the pack. Flash only (not even nice) with some stuff in H1 and H2 elements with one outbound link. class="visible"

<style>
.visible{
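/* despite the class name "visible", this rule hides the element */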
visibility:hidden;
}
</style>
=========================================================
Another, way down at the bottom, is an interior site page that the homepage 302s to, and it isn't at all relevant for the search term - it must have IBLs with the anchor text (not worth the time to check).

Yet another must also have anchor text IBLs (also not worth the time checking) and simply isn't anywhere near properly optimized for the phrase.

So that's four:

1. Sneaky
2. Spam
3. Sloppy webmastering
4. Substandard SEO

No mysteries in those 4, nothing cryptic or complicated like some of the other 950+ phenomena, but it's interesting to see that there are "ordinary" reasons for sites/pages to be 950+ that simple "good practices" and easy fixes could take care of.

The question does arise, though, whether the first two are hand penalties or whether something's been picked up algorithmically on them - in one case unnatural linking, and in the other, CSS spamming.

[edited by: Marcia at 4:46 am (utc) on July 23, 2007]

[edited by: tedster at 9:13 pm (utc) on Feb. 27, 2008]

 

steveb

WebmasterWorld Senior Member steveb us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 6:06 pm on Sep 22, 2007 (gmt 0)

"But you have worded this post to sound like the *exact opposite* of what could be considered good advice for *all*."

As I've said, I couldn't care less about spammers being caught for the reasons google is trying to catch them. I'm not concerned with "all". How they get out of the penalty is ENTIRELY different, which is what you keep focusing on, Miamacs.

Likewise how you rewrote what I wrote is 100% wrong for non-spamming pages hit with this penalty.

I can't say this clearly enough: suggesting spammers remove text they are not relevant for may be fine, but I could not care less. With non-spam pages the 950 penalty hits pages they ARE relevant for, that they ARE relevant for in the eyes of any human, that they DO have natural anchor text for internally and from other good sites.

Where Miamacs talks about "get more inbounds, raise relevance..." that's all 100% irrelevant to the non-spam pages. We already have that covered. It's a problem in no way at all. If you have low quality spam pages, you likely need to worry about these things, but for the non-spam pages affected, that's just bizarrely strange talk.

Non-spam pages hit are generally objectively among the top 10 pages in quality in terms of their multiple relevance areas. An authority non-spam page about Abe Lincoln and George Washington is at risk if you have quality internal and external linking about both. A way to make it more likely to get out of the penalty is either to split the page in two, or to get all your anchor text talking about Lincoln and give up on Washington.

So if you are a spammer, you might improve your situation by following what Miamacs says, but if you have a well-respected site where pages are recognized as authorities by Google and any human for good reason, then you have completely different issues. Miamacs addresses pages that DESERVE to be penalized. What I am talking about, and the reason these threads began, are pages that literally nobody would consider something that should be penalized... solid pages with solid linking, often having a broader coverage of a topic (thus more synonyms).

There really should be two threads, one for the pages penalized correctly, and one for those penalized by mistake.

(One example of the issue... spam pages often have tons of links from non-relevant sources. The top quality pages also have tons of links from non-relevant sources -- because they have been scraped from search pages for MULTIPLE search terms. Obviously sometimes Google will confuse these two phenomena as the same thing, and sometimes rank the crap page well, while sometimes penalizing the authority page.)

Biggus_D

5+ Year Member



 
Msg#: 3434448 posted 4:30 pm on Sep 26, 2007 (gmt 0)

I get Google Alerts about our sites with articles 1 or 2 years old.

Is this an improvement? or is it not related?

Because those articles are now at the top (#1) of the SERPs.

[edited by: Biggus_D at 4:32 pm (utc) on Sep. 26, 2007]

gehrlekrona

5+ Year Member



 
Msg#: 3434448 posted 11:45 pm on Sep 26, 2007 (gmt 0)

Just curious,
Does anyone with a -950 penalty see a lot of these infamous Chinese(?) spam sites in the results? Because I do.
The keywords/phrases I used to rank for have been taken over by these spam sites, and the ones I still rank for have tons of spam.

Just curious if there is a connection, even if GOOG denies it.

Miamacs

5+ Year Member



 
Msg#: 3434448 posted 12:30 am on Sep 27, 2007 (gmt 0)

...

steveb...

Your remarks speak for themselves on just how many (types) of sites you have seen, analyzed and recovered on your own. But you know what...

thanks.

Fun to see how all my posts bashing spammers/cr@p sites lead to such conclusions; goes to show how much people actually pay attention to the things I've said. Memory seems to last 3 posts back and that's it, huh? I'll have no regrets about no longer reporting whatever I uncover. ( after all I don't want to aid spammers *evil grin* ... they're my competition you know. )

But there's a valid point in there, although it's something that I've already said like 14 posts before, which is: what remains *now* in the -950 area even after all's been done... is mostly cr@p.

You go on telling everyone their pages are OK, Google is wrong.
In my eyes a page that's been hit with co-occurrence filters is much closer to spam than a legit niche site whose theme isn't recognized / that overbranded itself / that has so few good links it's missing key variations in its profile - but hey, that's just me. Me, who doesn't like seeing 51+ related money phrases stuffed onto the same page, within the same nav... oh wait... your sites are like that? No wonder you *now* have to 'deoptimize' and spread stuff to more pages...

I'm the spammer, I should know - ask me.

... well, whatever.

( Geez, it'd be SO easy at this point. But I worry too much about karma and stuff. )

...

...

Haha... thanks again, this really helped to shut my philanthropic self in the back of my mind. We'll be playing video games if you need us.

But watch me post another character in this thread.

tigertom

10+ Year Member



 
Msg#: 3434448 posted 1:18 am on Sep 27, 2007 (gmt 0)

Miamacs: Your posts have been very helpful, and an interesting read.

SteveB: You can wait until Google fine tunes its -950 filter to let (your) (good quality) pages back in, or you can do something about it now.

steveb

WebmasterWorld Senior Member steveb us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 1:27 am on Sep 27, 2007 (gmt 0)

"or you can do something about it now."

Um, huh?

If you just figured out you should work on pages that have problems, welcome to Webmasterworld.

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 1:34 am on Sep 27, 2007 (gmt 0)

At this point, it might be worth thinking about the use of the terms "synonym" "co-occurrence" and "related phrases." Synonyms and related phrases don't necessarily mean the same thing.

tigertom

10+ Year Member



 
Msg#: 3434448 posted 2:51 am on Sep 27, 2007 (gmt 0)

SteveB: My response was to the general tone of your last post, which is that Google is penalizing pages which aren't spammy, and that this is not right.

I submit that as long as the first few pages of its SERPs are fairly spam-free, it doesn't matter, in practical terms, to Google, that good pages are penalised.

Readers of this thread are looking for ways to get out of this penalty. Miamacs' posts are helpful in this regard. Raging against the machine is a waste of energy.

"With non-spam pages the 950 penalty hits pages they ARE relevant for, that they ARE relevant for in the eyes of any human, that they DO have natural anchor text for internally and from other good sites."

I think you're asking too much from the Google algorithm here.

steveb

WebmasterWorld Senior Member steveb us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 8:15 am on Sep 27, 2007 (gmt 0)

"Raging against the machine is a waste of energy."

So stop.

You'd be better off focusing on how to get out of this penalty when it is inappropriately applied, if that is your situation. Read the threads and you'll see various advice on how to do so. Raging against the machine is silly, and a waste of space here.

Again, if your situation is that google is penalizing you for all the right reasons, then that is a good thing, imo.

netmeg

WebmasterWorld Senior Member netmeg us a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



 
Msg#: 3434448 posted 3:24 pm on Sep 27, 2007 (gmt 0)

Hmm. I have a very small site, not one of my main ones by a long shot, that had a page ranking nicely at #8 or #9 for its main search term. At the end of last week, I reversed the order of two words in the title. No other change whatsoever. It promptly went -950. (Good thing it's not a particularly important page!) So this morning I changed it back. It will be interesting to see if it goes back to where it was.

Robert Charlton

WebmasterWorld Administrator robert_charlton us a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



 
Msg#: 3434448 posted 4:18 pm on Sep 27, 2007 (gmt 0)

...I reversed the order of two words in the title.

netmeg - Yes, it will be interesting to see if you can undo the effect of your change. Please report back.

gehrlekrona

5+ Year Member



 
Msg#: 3434448 posted 5:46 pm on Sep 30, 2007 (gmt 0)

netmeg,
Seen any changes yet?

I also would like to know if the latest changes have anything to do with the -950 penalty.
For the ones that suddenly found themselves in this situation: what type of IBLs do you have? Is it mostly from directories? Other similar sites? Same link/anchor text? Links from blogs and/or review sites?
I am starting to lean toward it having something to do with link structure - that some links have been demoted while other links have gained more weight. In the Google SERP changes thread we are discussing links, so I am not sure this is the right place to write this, but I am curious to see if there is any connection between -950 and linkage.
I'd also like to find out if -950 is a problem with GOOG now trying to implement local search results. Does your site have a location anywhere in tags/content? From searches I have done, it seems that GOOG puts more weight on a location than anything else.

HoHum

5+ Year Member



 
Msg#: 3434448 posted 10:59 am on Oct 1, 2007 (gmt 0)

For one search term we are number 1 and -950'd, whereas previously we would be 1 and 2.

The 950'd page is older and has a similar keyword density (the term is mostly mentioned in a table of values). It has 22 IBLs from forums and trade websites about the topic (au naturel). Internally the page is linked from a couple of PR 4s and a few cached searches/mini 'sitemaps'.

The number 1 page has the phrase as anchor text (in a table of values) pointing to internal pages that the user would find relevant to specifics of the search. It has 9 IBLs, mostly from genuine forum discussions on topic. This page is linked from our PR 5 home page, a PR 4, and a load (45 total) of pages that have been 404'd for a few weeks or so (we toyed with a 'related articles' section at the bottom of the page, but it looked spammy - or could have - and those now-dead pages used to link to this page).

Maybe G views one of the pages (the older) as surplus to requirements? To the human eye both pages are relevant, with the older one more specific to the search term (in that it actually answers the query, whereas the other points at other pages that do). Maybe it's all nonsense and I should do some proper work?!

Anyone else see this, and if so, are there any obvious differences between the good, the bad or the ugly...

netmeg

WebmasterWorld Senior Member netmeg us a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



 
Msg#: 3434448 posted 3:02 pm on Oct 1, 2007 (gmt 0)

netmeg,
Seen any changes yet?

Yep; back to normal today. #9 for one term, and #21 for the other.

bwnbwn

WebmasterWorld Senior Member bwnbwn us a WebmasterWorld Top Contributor of All Time 5+ Year Member



 
Msg#: 3434448 posted 8:07 pm on Oct 4, 2007 (gmt 0)

This thread has been sort of silent since the SERP changes really caused problems for many sites.

Wanted to update you on my progress with this 950 monster. The changes I have made look like they have lifted the 950 off my site, and I am slowly moving up through the pages - 93, 92, 91, 90, 87, 86, etc. - till I am on page 77 now, and every time my home page is indexed it moves up a page or two.

I have been watching other 950'd sites, and it looks like Google is running them all through the process like a line: if your site gets put there, it goes through a checking process and is released either to shoot back up, as some have done, or to come back slowly, as mine is being allowed to do.

I will post my changes after I see the progress continue to move forward - no sense in adding to an already confusing mess till I am sure the changes I made really did lift this filter.

trakkerguy

5+ Year Member



 
Msg#: 3434448 posted 8:25 pm on Oct 4, 2007 (gmt 0)

Some sites have been recovering. I've been fortunate on a couple that have made 100% recovery and are stronger than ever.

But, I think another reason it has been quiet on this thread is the symptoms have evolved.

A lot of sites recently hit are seeing pages move from page one to rank 200 or greater, but not always the end of serps, or 950. And some previously afflicted sites are seeing partial recovery with pages climbing up from -950, but still stuck several hundred positions down.

There may be just as many sites hit that don't realize it is the same as the -950 problem, since the name is even less descriptive than it was before.

bwnbwn

WebmasterWorld Senior Member bwnbwn us a WebmasterWorld Top Contributor of All Time 5+ Year Member



 
Msg#: 3434448 posted 9:39 pm on Oct 4, 2007 (gmt 0)

trakkerguy, congratulations. What would you attribute your recovery to? Did you make changes in stages, one mass change, or some simple change?

I would attribute the moving or flux to different datacenters with tighter filters. I haven't seen anything but positive movement, and as you can see, it isn't rapid by any means.

I would suspect the ones that recovered really fast are the ones moving back and forth, as there is still something they haven't found - but of course, what do I know? I am still sitting at 731...

steveb

WebmasterWorld Senior Member steveb us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 11:31 pm on Oct 4, 2007 (gmt 0)

The name is descriptive because dropping 200 spots is an entirely different phenomenon.

Unfortunately some comments in these threads confuse the issue because people are talking about that -hundreds kneecapping kind of penalty that will often cause a page to rank decently (say in the teens) for some terms but drop it hundreds of spots for others. This isn't a 950 penalty. The 950 penalty is one specific thing. It is not the be all and end all of every penalty.

trakkerguy

5+ Year Member



 
Msg#: 3434448 posted 11:47 pm on Oct 4, 2007 (gmt 0)

dropping 200 spots is an entirely different phenomenon

Maybe. Maybe not. How do you know? The site I've worked on most that recovered kept fluctuating between -950 and 200-400 before it popped out at #11 for a single keyword. At that point, it still had several two-word searches that were at 200-400, and some at -950.

Now, 2 months after recovery, site is #2 for single word search, and #1-3 for most others. But there are a few searches that are still 950, and a few that are 200-400.

The 950 and the "several hundred" drop may not be the same thing, but sure seem similar, and sometimes affect the same exact pages.

bwnbwn - I attribute the recovery to the addition of good strong backlinks, deeplinks, and much new, varied content. Not keyword dense.

And dumping or rewriting pages that were heavily scraped may have helped. But hard to know for sure. I do know that those changes made the site rank much higher than before.

[edited by: trakkerguy at 12:05 am (utc) on Oct. 5, 2007]

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 12:01 am on Oct 5, 2007 (gmt 0)

...fluctuating between -950 and 2-400 before it popped out at #11 for single keyword.

The phrase-based indexing patents do describe one approach that could account for the total picture you're describing with one mechanism. Doesn't mean that this is what's happening - but in theory, the math is there for it to be possible.

steveb

WebmasterWorld Senior Member steveb us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 1:30 am on Oct 5, 2007 (gmt 0)

"How do you know?"

How can I tell a watermelon from a yak?

The 950 penalty is a specific thing, where pages are placed at an exact place in the search results. Anything else is something else.

Besides that, the kneecapping penalty has been around a long time, and the basic solution to it (unlike the 950 penalty) is simple: make changes to the page every couple of days, and wait for it to be crawled a half dozen times. This commonly fixes the kneecapping penalty, but seldom does much of anything for a 950 penalty.

If you merely drop scores of spots in the results, you don't have an "end of the results" penalty, period. The dumb word "sandbox" eventually got co-opted by people who wanted it to mean any type of "ranking problem". It would be nice to keep the end-of-the-results penalty thread focused on the end-of-the-results penalty.

==
"The 950 and the "several hundred" drop may not be the same thing, but sure seem similar, and sometimes affect the same exact pages."

I've never seen them affect the same pages (although it surely is common for a page to have two different penalties applied simultaneously), but of course the penalties have similarities, as any kind of penalized pages should tend to.

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 2:04 am on Oct 5, 2007 (gmt 0)

>>dropping 200 spots is an entirely different phenomenon

>>Maybe. Maybe not. How do you know?

It's pure logic that they're different. If they were identical, there would be an identical end result, not different results. It's simple programming logic:

if condition=A AND condition=B/then/do something else
else
if condition=A/then/do this
else
if condition=B/then/do that

The site I've worked on most that recovered kept fluctuating between -950 and 200-400 before it popped out at #11 for a single keyword. At that point, it still had several two-word searches that were at 200-400, and some at -950.

One word search terms aren't the same thing as multi-word phrases. And aside from there being a difference between one-word and multi-word searches, if phrase-based posting lists are used, couldn't the difference and "bouncing" be attributed to changes and updates in the data sets used for query-time filtering?

defanjos

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3434448 posted 2:42 am on Oct 5, 2007 (gmt 0)

To be a 950 penalty you have to be in the last page or last few pages of results, correct? So, if there are 200 sites that have the 950 penalty for a certain term, how does G fit them all in the last pages? I would think a site 10-20 pages from the end could still have a 950 penalty, right?

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 3:35 am on Oct 5, 2007 (gmt 0)

The penalty mechanism I'm talking about is a re-ranking of the preliminary SERP. Here's how I read the patents.

A preliminary list of results is generated, and each url in the list is associated with a relevance factor. This relevance factor is not a 1 to 1,000 ranking number, but a relevance factor calculated out to a decent number of decimal places. Also, this original list can be truncated at some pre-set relevance level - giving us some searches with fewer than 1,000 available results, even though the total number of pages reported might be in the millions.

At this preliminary point, some "potential manipulation" factor is then tested for - could be on-page, off-page or off-site. The url is measured against Google's statistical norm for that specific factor. If the url deviates significantly from that pre-measured norm, then its original relevance factor can be recalculated - treated in one of two ways:

1) subtract a given amount from the original relevance factor, or
2) multiply the original relevance factor by a number less than 1.

Every url in the preliminary list can be treated in this way, and the final list is then sorted by the newly calculated relevance factors, some adjusted and others not. No new urls are introduced to or removed from the original list of urls. Instead those original urls are re-shuffled.

Now take this process one step further. Instead of using a set, fixed number for the multiplier or the subtractor, you can graduate that number according to how FAR the url falls outside a standard deviation for the trigger condition. A mechanism like this could then account for a given url ranking at the end of results, or in the bottom-middle, or on page 6, or almost anywhere.

Even further, if the relevance factors for the original, preliminary list are very tightly packed (as is more likely in a highly competitive search), then even a small multiplier or subtractor could have a dramatic effect and send a penalized url far down the list. But if the original relevance numbers for the list are not so tightly packed, then the visible reshuffling might be less.

This "dial" can then be tweaked in several different ways:

1. push the allowable deviation to be tighter or looser
2. tweak the multiplier/divider up or down
3. test more than one condition in this fashion at the same time and combine the multipliers/dividers

In this way, just one process can account for a lot of variation. This is math that I gleaned from the patents, not math I created from my own purely theoretical musings. The patents also talk about pre-calculating a lot of this and having those numbers cached and at-the-ready.

Here's another reason I think this kind of mechanism may actually be in play. In a given market, the pre-measured norms will be different. Because this is so, it will seem like one website can "get away with" practices in a given market that would hurt another website very badly in another market.

If such deviations are measured for semantic factors, such as term co-occurrence, then we also see that a url can be penalized for one search but not penalized on another, related search. Again, this kind of math could generate exactly those signs.

But the essential key is still this - what condition or conditions are being tested against Google's statistically measured norms?

If you don't have that critical information, then knowing the mathematical mechanism itself doesn't help much - even if it's an elegant structure that allows all kinds of "penalties" to be calculated and applied simultaneously at run-time, with very low computational overhead.
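To make the mechanism concrete, here is a minimal sketch in Python. Everything in it - the scores, the deviation values, the 0.9 base multiplier, the url names - is a made-up assumption for illustration, not anything taken from the patents:

# A minimal sketch of the re-ranking idea described above. All numbers,
# thresholds and the "deviation" input are hypothetical.

from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float   # preliminary relevance score
    deviation: float   # how far some tested factor falls outside the measured
                       # norm, in standard deviations (0 = within the norm)

def rerank(prelim, base_multiplier=0.9, threshold=1.0):
    """Recalculate relevance for urls that deviate from the norm, then re-sort.
    No urls are added or removed; the original list is only re-shuffled."""
    adjusted = []
    for r in prelim:
        if r.deviation > threshold:
            # graduate the penalty by how far outside the norm the url falls
            multiplier = base_multiplier ** r.deviation
            r = Result(r.url, r.relevance * multiplier, r.deviation)
        adjusted.append(r)
    return sorted(adjusted, key=lambda r: r.relevance, reverse=True)

# Tightly packed preliminary scores: even a modest multiplier drops one url
# from #5 to several hundred spots down the list.
prelim = [Result(f"example.com/page{i}", 1.0 - i * 0.0005, 0.0) for i in range(1000)]
prelim[4] = Result(prelim[4].url, prelim[4].relevance, 3.0)   # one url deviates
final = rerank(prelim)
print([r.url for r in final].index("example.com/page4") + 1)  # roughly #545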

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 4:09 am on Oct 5, 2007 (gmt 0)

I appreciate that my last post may not be everyone's cuppa tea - I've done some graduate studies in math, so I appreciate that my mind can get very weird for some people.

If you've got a way to make sense of this current situation and your understanding helps you get things fixed, then that's what you need. The reason I posted what I did was to show that there are ways for one mathematical system to generate a wide variety of effects. Given Google's tendencies toward mathematical elegance, I think that my ideas at least approach what they do.

But I don't have any secret pipeline to Mountain View - I'm just working, like we all are, to find out what works.

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 4:21 am on Oct 5, 2007 (gmt 0)

To be a 950 penalty you have to be in the last page or last few pages of results, correct?

My guess would be "no." Theoretically, rather than being specifically relegated to the last page, you'd have to be at the very end of the partition, based on the scoring values used and stored for that particular partition in the posting list used.

So, if there are 200 sites that have the 950 penalty for a certain term, how does G fit them all in the last pages? I would think a site 10-20 pages from the end could still have a 950 penalty, right

Again (theoretically), I'd guess that would depend on the size of the particular partition.

Robert Charlton

WebmasterWorld Administrator robert_charlton us a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



 
Msg#: 3434448 posted 5:07 am on Oct 5, 2007 (gmt 0)

tedster - Superb post. Here's a thought, combining several of the points that you make...

...original relevance factor can be recalculated - treated in one of two ways:
...
2) multiply the original relevance factor by a number less than 1.

But if the original relevance numbers for the list are not so tightly packed, then the visible reshuffling might be less.

This "dial" can then be tweaked in several different ways:
...
3. test more than one condition in this fashion at the same time and combine the mulitpliers/dividers

Taking the above two points together, if you multiply several numbers that are less than one, you can get a dramatic effect on a not-so-tightly packed list, even if no single factor is dramatically low.

Eg...

.8 x .8 x .8 = .5120

.9 x .7 x .8 = .5040

.7 x .7 x .7 = .3430

etc.

On a tightly packed list, as you suggest, you may not even need factors this low. This is why I've felt that a page that is weak in several areas, like linking and semantic factors for a specific phrase, might well plummet, while doing well for other, less competitive phrases.
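As a quick, purely hypothetical illustration of how those combined multipliers interact with score packing (made-up score spacings, not real data):

# The same combined multiplier (.8 x .8 x .8 = .512) applied to the url sitting
# at #5 of two hypothetical 1000-result lists with different score spacing.
combined = 0.8 * 0.8 * 0.8   # 0.512

for spacing, label in [(0.0005, "tightly packed"), (0.01, "loosely packed")]:
    scores = [1.0 - i * spacing for i in range(1000)]     # preliminary scores
    penalized = scores[4] * combined                      # url originally at #5
    new_pos = sum(1 for i, s in enumerate(scores)
                  if i != 4 and s > penalized) + 1
    print(f"{label}: #5 drops to roughly #{new_pos}")

# prints roughly #979 for the tightly packed list, roughly #51 for the loose one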

trakkerguy

5+ Year Member



 
Msg#: 3434448 posted 5:01 pm on Oct 5, 2007 (gmt 0)

Yes, very nice description, Tedster. I know you've posted something like this theory before, in less detail. I've kept that in mind the last few months, and it seems to hold up with the example I've been working with.

Once the index page popped back up for single word search, it has not fluctuated, and has gradually climbed up to #2.

But different two word searches have gone from top 5, to several hundred back, to 950. Back and forth. Usually with no on page changes made.

Could be different penalties, but could be a different "multiplier" factor applied that causes a page to drop to different places.

Wouldn't it make sense for google to try and modify the penalty (when possible) so pages aren't pushed to 950? It is harder to identify the possible cause when the symptom is less clearly defined.

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3434448 posted 6:07 pm on Oct 5, 2007 (gmt 0)

The longer I work with this model, the more I notice about Google Search results that it could account for - and this extends to many areas outside the -950 penalty.

Whatever Google is doing with the -950 penalty, it is certainly being fine-tuned all the time. They don't want "collateral damage" any more than the webmasters do.

randle

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3434448 posted 7:09 pm on Oct 5, 2007 (gmt 0)

Great post Tedster.

Whatever Google is doing with the -950 penalty, it is certainly being fine-tuned all the time.

That's for sure. Our experience has been (and for others maybe it's different) that the vast majority of sites hit with this thing did nothing to bring it on, and the vast majority of sites that got out of it did nothing on that account either.

They don't want "collateral damage" any more than the webmasters do.

Makes you wonder what this thing is accomplishing. If there's collateral damage, then theoretically there's some good being accomplished. Personally I have not seen any improvement in the SERPs since it appeared, but there's got to be some reason for it.

Makes me long for the days when Google wasn't so obsessed with filters and penalties, but I guess it's all part of the price of progress.

europeforvisitors



 
Msg#: 3434448 posted 7:25 pm on Oct 5, 2007 (gmt 0)

Interesting reading, Tedster, even for those of us who haven't studied math (or thought much about it) since high school.

One question: You said, "In a given market, the pre-measured norms will be different. Because this is so, it will seem like one website can 'get away with' practices in a given market that would hurt another website very badly in another market." In the context of that statement, how broadly or narrowly would you define (or do you think Google might define) a "market"? Are you thinking of very broad categories like "widgets" and "bookings" and "real estate" or narrowly-defined niches?
