I have two URLs that were 950'd on Friday. Unfortunately they are my two most requested URLs (other than the home page) - they'd been #1 in the SERPs for at least three years, and now only show up on the last page for the most common search phrase. They both showed PR previously, but TBPR has been greyed out since the last update. However, in certain permutations of the search phrase, they still rank #1. The search string usually comprises the city name and the event, and often includes the year.
Example:
city event - 950'd
city event 2007 - 950'd
event city - #1
city state event - #1
city state event 2007 - #1
As far as I can tell, it is ONLY two URLs, out of around 500, that fell into this (so far, anyway).
What it all means, I have no idea.
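If you want to keep track of which permutations are affected as things change, it's easy enough to script the bookkeeping. The sketch below is just that - a sketch: get_rank() is a hypothetical placeholder for however you actually check positions (by hand, or with a rank-checking tool you're allowed to use), and the 900 cut-off is only a guess at where the "end of the results" starts.

```
# Sketch: track which query permutations look 950'd for a given domain.
# get_rank() is a hypothetical placeholder -- plug in however you actually
# check rankings. The threshold is a guess, not anything official.

DOMAIN = "example.com"      # assumption: your own site
PENALTY_THRESHOLD = 900     # positions at/after this look like the -950 zone

QUERIES = [
    "city event",
    "city event 2007",
    "event city",
    "city state event",
    "city state event 2007",
]

def get_rank(query, domain):
    """Hypothetical helper: return the 1-based position of `domain` for
    `query`, or None if unknown. Replace with your own rank lookup."""
    return None

def classify(queries, domain):
    report = {}
    for q in queries:
        pos = get_rank(q, domain)
        if pos is None:
            report[q] = "not found / not checked"
        elif pos >= PENALTY_THRESHOLD:
            report[q] = "#%d (looks 950'd)" % pos
        else:
            report[q] = "#%d" % pos
    return report

for query, status in classify(QUERIES, DOMAIN).items():
    print(query, "->", status)
```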
...
Sites on which I redesigned the nav anchors to be more reasonable and relevant, and/or got links with a variety of relevant phrases, and/or some additional variety in word order, and/or solved any other issues of dupe content, and/or 404s, and/or out of balance pagerank, and/or removed pages targeting market sensitive stuff that the site never had a single IBL for... are all out of the -950 area.
Problems can be different, solutions can be different, result is the same. Get the site in shape. Get some links.
All that said, I feel for those whose problems aren't that easy to spot.
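For the 404 and dupe-content items in that list, even a crude sweep helps. Here's a rough sketch, standard library only; the URL list is an assumption - feed it your own pages (from your sitemap, say). It only flags 404s and repeated <title> tags, which is far from a full dupe-content check.

```
# Sketch: sweep a list of your own URLs for 404s and duplicated <title> tags.
# Standard library only; the URL list is an assumption -- use your own pages.

import re
import urllib.error
import urllib.request
from collections import defaultdict

URLS = [
    "http://www.example.com/",
    "http://www.example.com/events/city-event.html",
]

def fetch(url):
    """Return (status_code, body) for a URL, without raising on 404."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status, resp.read().decode("utf-8", errors="replace")
    except urllib.error.HTTPError as e:
        return e.code, ""
    except urllib.error.URLError:
        return None, ""

def title_of(html):
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    return m.group(1).strip() if m else ""

titles = defaultdict(list)
for url in URLS:
    status, body = fetch(url)
    if status == 404:
        print("404:", url)
        continue
    titles[title_of(body)].append(url)

for title, pages in titles.items():
    if title and len(pages) > 1:
        print("duplicate title %r on:" % title, pages)
```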
So, I've cleaned up some of the on-page SEO, removed double linking on a specific page (it wasn't SEO related though, which makes me mad), added nofollow to links going to utility pages, changed my index.php back into an index.html (long story behind that one), and then issued a reinclusion request a couple of days ago.
I'll just have to wait and see.....
Sites on which I redesigned the nav anchors to be more reasonable and relevant, and/or got links with a variety of relevant phrases, and/or some additional variety in word order, and/or solved any other issues of dupe content, and/or 404s, and/or out of balance pagerank, and/or removed pages targeting market sensitive stuff that the site never had a single IBL for...
Question: What is Google trying to accomplish with the 950 penalty?
It's a hack to address their own internal problem, quality vs. quantity. Their previous calculation of authority links allowed too much leeway. This is how they address it.
Sites that have the trust, but have certain other problems are now not "pardoned" by default. Sites with only a couple of very good links that target major keywords and phrases are sent to the back until they gain some variety and a larger mass of votes from anchor text. Sites with faulty HTML, sites with spammy SEO practices... list goes on.
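To put a rough number on the "variety of votes from anchor text" idea, something like the sketch below works; the anchor list is assumed to come from whatever backlink export you have, and the 60% cut-off is purely illustrative - nobody knows Google's actual threshold.

```
# Sketch: measure how varied a backlink anchor-text profile is.
# The anchor list is an assumption -- export it from your backlink tool.
# The 60% cut-off is illustrative only, not Google's.

import math
from collections import Counter

anchors = [
    "city event", "city event", "city event", "city event",
    "city event 2007", "example.com", "Example Site",
    "click here", "city event tickets",
]

counts = Counter(a.lower() for a in anchors)
total = sum(counts.values())

top_phrase, top_count = counts.most_common(1)[0]
top_share = top_count / total

# Shannon entropy: higher means more varied anchor text.
entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())

print("most common anchor: %r (%.0f%% of all links)" % (top_phrase, 100 * top_share))
print("distinct anchors: %d out of %d links" % (len(counts), total))
print("anchor-text entropy: %.2f bits" % entropy)

if top_share > 0.60:
    print("profile is heavily skewed toward one phrase")
```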
Basically, trusted sites with problems.
The hard part is to spot those problems.
I've only listed the ones I had encountered in the past.
That list is not complete; no list could be, as there's an endless number of things that can trip the quality, accessibility and spam filters.
Technically speaking, this is a filter.
A script combs through the results, and sends sites to the back, but the ranking stays the same in the main database. Turn off the filters ( &filter=0 ) and see for yourself, sometimes the page is not -950.
In practice, however, this is a penalty, though one that can be lifted by addressing the problems. With every recalculation of these triggers, pages pop in and out, either because the page, the overall dataset they compare it to, or the thresholds themselves have changed.
As far as it being manual, I haven't seen a single case in which I found even a hint of it not being completely automated.
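For anyone who wants to do the &filter=0 comparison methodically rather than by eyeball, the sketch below only does the bookkeeping: it builds the two query URLs and compares where your domain sits in two result lists you've recorded yourself (automated scraping of Google is against their guidelines, so how you capture the lists is up to you).

```
# Sketch: compare where a domain sits with and without &filter=0.
# The two result lists are assumed to be recorded by hand (or by any means
# you're allowed to use) -- this only does the bookkeeping, not the fetching.

from urllib.parse import urlencode, urlparse

def search_urls(query):
    """Return the normal and the &filter=0 search URLs for a query."""
    base = "http://www.google.com/search?"
    return base + urlencode({"q": query}), base + urlencode({"q": query, "filter": "0"})

def position_of(domain, ranked_urls):
    """1-based position of the first result whose host is on `domain`, or None."""
    for i, url in enumerate(ranked_urls, start=1):
        host = urlparse(url).hostname or ""
        if host == domain or host.endswith("." + domain):
            return i
    return None

# Paste in the result URLs you recorded for each version of the query.
filtered_results = ["http://competitor-a.example/", "http://competitor-b.example/"]
unfiltered_results = ["http://www.example.com/events/", "http://competitor-a.example/"]

normal_url, unfiltered_url = search_urls("city event")
print("normal:     ", normal_url)
print("&filter=0:  ", unfiltered_url)
print("position with filter:  ", position_of("example.com", filtered_results))
print("position with filter=0:", position_of("example.com", unfiltered_results))
```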
As far as it being manual, I haven't seen a single case in which I found even a hint of it not being completely automated.
Absolutely agree Miamacs. I've seen some that seem "stuck" at 950 for no reason, but eventually recover somewhat. Seems there is a big delay built in for some things, but it is not a manual penalty.
A script combs through the results, and sends sites to the back, but the ranking stays the same in the main database
Right. You can see in Webmaster Tools where Google says you rank high for a term, but in the search results you are at the end of the SERPs (or now sometimes the middle).
It's a hack to address their own internal problem...
What I'd like to know is why only certain pages get hit - namely a Home page with no apparent problems.
I just don't get it!
There are sites out there getting away with murder while others get penalized for nothing.
It seems to me Google needs to turn some knobs and get it right.
- Deoptimise drastically, then _slowly_ wind it back.
- Get more backlinks to compensate
- Develop more domains with different templates.
Putting all your hopes on one heavily SEO'd domain is, and probably always was, a mistake. Your site looks 'unnatural'.
I have a meta search engine script on one of my sites. It queries major and minor SE's. The amount of crap some throw up is informative.
The -950 filter is an anti-SEO, anti-spam one. Business as usual, in other words.
Similarly the -30 thing is a clear penalty. Nothing is being removed; pages are simply being moved to roughly a specific place.
Turn off the filters ( &filter=0 ) and see for yourself, sometimes the page is not -950.
We have been suffering for a while now. We tried the &filter=0 and stay pretty much the same - last page, almost the middle of it, every time. So in our case it seems more like a penalty applied before the filters. This happens with optimized terms and even with obscure phrases taken from the body of a page.
Tedster may be right, but I'd like to get a hold of this. While there are a lot of reasons to be sent to the back of the SERPs, I just don't think that they have more than one script for this... and at different levels of the ranking calculation. It could be, but my gut feeling tells me that the real answer is somewhere else.
I looked up all the examples ( those which remained -950 since my last post ) and turning off the filter brings back the pages at their regular position.
The strange thing is that querying the database directly reports these ( non-penalized ) results for only half of them. Not all.
But in response to the comments on the &filter=0 parameter, it works for the sites (and the problems) I work on, so I can't put this into perspective... I see it on all of them, for all phrases. Of course, as I've mentioned, there are only about five to ten reasons and their combos which I've encountered so far, so it may well be that they don't apply to everyone.
Of course you do remember that the &filter=0 parameter also turns off clustering.
If there's a site with 600 relevant URLs above your original ( non penalized ) rank, you'll be pushed down 600 positions by the indented "additional" results - the ones you'd see when clicking the "dupe content" link... you know, "repeat the search with the omitted results included".
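A toy illustration of that clustering point, with made-up hosts and counts: with filtering on, results from the same host are collapsed to a couple of entries; with &filter=0 they all come back above you.

```
# Toy illustration of the clustering point above. Hosts and counts are made up.
# With filtering on, each host is collapsed to a couple of entries; with
# &filter=0 all 600 URLs from the big site come back above your page.

results_unfiltered = (
    [("bigsite.example", "/page%d" % i) for i in range(600)]  # 600 URLs, one host
    + [("example.com", "/events/city-event.html")]            # your page
)

def cluster(results, per_host=2):
    """Keep at most `per_host` results per hostname, preserving order."""
    kept, seen = [], {}
    for host, path in results:
        if seen.get(host, 0) < per_host:
            kept.append((host, path))
            seen[host] = seen.get(host, 0) + 1
    return kept

def position(results, host):
    return next(i for i, (h, _) in enumerate(results, start=1) if h == host)

print("position with clustering on:", position(cluster(results_unfiltered), "example.com"))  # 3
print("position with &filter=0:    ", position(results_unfiltered, "example.com"))           # 601
```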
- Too many keywords in internal links
- Too many identical anchor text phrases in external backlinks (appearing too quickly)
I regard my penalty as a 'wake up call' from relying too much on 'old school' SEO for traffic.
I'm guessing it's too much of that + poor quality backlinks are the cause. Poor quality being: all low PR + identical anchor text.
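On the "appearing too quickly" part of that second point, a dated backlink export makes it easy to spot the spikes. A rough sketch, with made-up data and a made-up 5x threshold:

```
# Sketch: flag months where new backlinks (here, all with the same anchor)
# arrive much faster than before. The data and the 5x threshold are made up.

from collections import Counter

backlinks = (
    [("2007-03", "city event")] * 2
    + [("2007-04", "city event")] * 2
    + [("2007-05", "city event")] * 2
    + [("2007-06", "city event")]
    + [("2007-07", "city event")] * 40   # sudden burst of identical-anchor links
)

per_month = Counter(month for month, _anchor in backlinks)
months = sorted(per_month)

for prev, cur in zip(months, months[1:]):
    if per_month[cur] >= 5 * max(per_month[prev], 1):
        print("spike: %d new links in %s -> %d in %s"
              % (per_month[prev], prev, per_month[cur], cur))
```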
What does a cr*p site look like?
- Popular template
- Larded up with keyword links
- Competitive niche
- 3000 identical backlinks from 'directories'
- Few links from high quality, related, sites.
I guess Google, having access to loads of data, plus human reviewers, can spot a standard cr*ppy site easily. It tweaks its algorithm, some sites fall out, other cr*p sites fall back in, but if the net effect is cleaner SERPs, job done.
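If you want to turn that checklist into something you can run against your own site, a trivial scorer like the one below does it; the profile fields and the cut-offs are invented for the example, not anything Google has published.

```
# Sketch: score a site profile against the red flags listed above.
# The profile fields and cut-offs are invented for illustration only.

profile = {
    "uses_stock_template": True,
    "keyword_links_per_page": 50,
    "competitive_niche": True,
    "identical_directory_backlinks": 3000,
    "quality_related_backlinks": 2,
}

red_flags = [
    ("popular template", profile["uses_stock_template"]),
    ("larded up with keyword links", profile["keyword_links_per_page"] > 20),
    ("competitive niche", profile["competitive_niche"]),
    ("mass identical directory backlinks", profile["identical_directory_backlinks"] > 500),
    ("few quality related links", profile["quality_related_backlinks"] < 5),
]

hits = [name for name, flagged in red_flags if flagged]
print("%d/%d red flags: %s" % (len(hits), len(red_flags), ", ".join(hits)))
```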
If this is an automatic filter, I can see that once the offending material has been removed/backlinks changed you should then lift yourself out of the filter and bounce back. Unless there is some kind of time penalty. But at least then eventually you can come back.
If it is a manual filter then it will be more difficult. It could take months for someone to get around to reviewing your site. Maybe never.
With so many people affected, it's hard to believe it is manual, unless they have employed 10,000 people, as another thread here suggests, to manually filter sites. But why on earth would a company whose success comes from an algorithm switch to humans instead of just tweaking the algorithm?
Still think the scary thing is how easy it would be to knock a rival to the end of the SERPs.
The main cause is over-optimisation:
- Making a page too 'sweet' for a keyword, and
- Using straight keywords in internal links, and
- Having lots of keyworded internal links to other internal pages, on each page.
Scale that back and get more (quality) backlinks (to internal pages). Getting quality backlinks is a good use of your time anyway.
Algorithm: IF a page is larded up with keywords [including 50 keyword-heavy links to other internal pages] AND it's got no good external backlinks AND it's a competitive phrase THEN -950
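Spelled out as code, that guess would look something like the sketch below. Every input and threshold is an assumption for illustration - nobody outside Google knows the real signals or cut-offs.

```
# The guessed rule above, spelled out. Every input and threshold here is an
# assumption for illustration -- nobody outside Google knows the real ones.

def looks_950d(keyword_density, keyworded_internal_links,
               good_external_backlinks, competitive_phrase):
    """True if the page matches the hypothesised -950 pattern."""
    larded_up = keyword_density > 0.05 or keyworded_internal_links >= 50
    no_good_links = good_external_backlinks == 0
    return larded_up and no_good_links and competitive_phrase

# Keyword-stuffed page, 50 keyworded internal links, no decent backlinks,
# targeting a competitive phrase -> matches the pattern.
print(looks_950d(keyword_density=0.08, keyworded_internal_links=50,
                 good_external_backlinks=0, competitive_phrase=True))  # True
```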
If Google didn't do this its results would be like MSN's.
In my niche it's a good mix of major leaguers plus cheeky upstarts like me. Not too much rubbish.
With so many people affected, it's hard to believe it is manual, unless they have employed 10,000 people, as another thread here suggests, to manually filter sites. But why on earth would a company whose success comes from an algorithm switch to humans instead of just tweaking the algorithm?
It hasn't, as you'll see if you read that thread in more detail.