Just saw one 950+ and it does my heart good to see it.
User-agent: *
Disallow: /theirlinkpage.htm
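That kind of blocking is easy to verify from the outside, since robots.txt is public. A quick sketch using Python's standard urllib.robotparser (the domain here is just a placeholder):

from urllib import robotparser

# Fetch and parse the site's public robots.txt (placeholder domain)
rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# False means the links page is blocked for every crawler
print(rp.can_fetch("*", "http://www.example.com/theirlinkpage.htm"))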
And another 950+, the last site in the pack: Flash only (not even nice Flash), with some text in H1 and H2 elements and one outbound link, all marked class="visible" - which the stylesheet then hides:
<style>
/* despite the name, hidden from visitors but still in the HTML for spiders */
.visible {
    visibility: hidden;
}
</style>
=========================================================
Another, way down at the bottom, is an interior site page that the homepage 302's to, and it isn't at all relevant for the search term - it must have IBLs with the anchor text (not worth the time to check).
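If you'd rather not guess what your own homepage is doing, a raw status-code check will show a 302 straight away. A minimal sketch with Python's standard library - the hostname is a placeholder:

import http.client

# HEAD request, so we see the raw status code without following the redirect
conn = http.client.HTTPConnection("www.example.com")  # placeholder host
conn.request("HEAD", "/")
resp = conn.getresponse()

# 301 = permanent (fine); 302 = temporary (the sloppy case described above)
print(resp.status, resp.reason)
print("Location:", resp.getheader("Location"))
conn.close()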
Yet another must also have anchor-text IBLs (also not worth the time to check) and simply isn't anywhere near properly optimized for the phrase.
So that's four:
1. Sneaky
2. Spam
3. Sloppy webmastering
4. Substandard SEO
No mysteries in those 4, nothing cryptic or complicated like some of the other 950+ phenomena, but it's interesting to see that there are "ordinary" reasons for sites/pages to be 950+ that simple "good practices" and easy fixes could take care of.
The question does arise, though, whether the first two are hand penalties or if something's been picked up algorithmically on them - in one case unnatural linking, and in the other, CSS spamming.
One question: You said, "In a given market, the pre-measured norms will be different. Because this is so, it will seem like one website can 'get away with' practices in a given market that would hurt another website very badly in another market." In the context of that statement, how broadly or narrowly would you define (or do you think Google might define) a "market"? Are you thinking of very broad categories like "widgets" and "bookings" and "real estate" or narrowly-defined niches?
My main point is that we might appear to see many different algorithms (as many people have already assumed) when there's really, essentially, just one algo that naturally adapts to varying practices. It would also give results that naturally balk at being reverse engineered, because the tolerable variances would differ widely according to the search term.
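To make that concrete, here's a toy sketch of the idea in Python - pure speculation on my part, not anything Google has confirmed. One scoring rule, with the tolerance derived from each market's own norms, so the identical page metric passes in one market and trips the filter in another:

from statistics import mean, stdev

def within_market_norms(page_value, market_values, tolerance=2.0):
    # One rule for every market: flag a page only when its metric
    # (say, keyword density) sits far outside that market's own norm.
    mu = mean(market_values)
    sigma = stdev(market_values)
    return abs(page_value - mu) <= tolerance * sigma

# Made-up numbers: the same 8% density is normal in a heavily
# optimized niche and way out of line in a lightly optimized one.
real_estate = [6.5, 7.0, 8.5, 9.0, 7.5]
hobby_sites = [1.0, 1.5, 2.0, 1.2, 1.8]
print(within_market_norms(8.0, real_estate))  # True - within tolerance
print(within_market_norms(8.0, hobby_sites))  # False - flagged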
Tip-toeing 'round Google's algo wastes valuable development time; it's getting to be annoying.
Amusing: I read one chap on another forum saying "there's no such thing as a -950 penalty!". To which another poster replied: "That's true, until you get hit by it".
i.e. your site is old and has so many great backlinks you can shrug any penalty off _or_ you're not targeting popular keywords with optimised sites.
[ Just an opinion -- no, I am not a spammer -- etc. etc. ]
p/g
[ continued here: [webmasterworld.com...] ]