I did all those types of navigation link changes but it didn't improve anything for my site. I've waited a few weeks for Google to crawl it completely, and the cache today shows about 99% of pages cached with no keyword repetition in successive navigation links.
I'd like to know if a few "bad" apples (pages) keep an entire site 950d.
I've removed all footers, and no progress, either. I'm thinking the only thing left to remove are headers.
The old navigation structure of my site has headers which are one-word keywords. There are about nine in all.
Keyword1 ¶ Keyword2 ¶ Keyword3 ¶ . . . ¶ Keyword9
But for each of the directories, i.e:
there is still repetition of the horizontal header nav link in the vertical menu:
Keyword1 ¶ Keyword2 ¶ Keyword3 ¶ . . . ¶ Keyword9
I had thought or at least hoped having the same link with the same anchor text on the same page wouldn't get penalized. But the 950 is so a-retentive, it could say, "It's SPAM!"
Obviously it's going to look silly if I remove the header link that is used in the vertical menu, so do I have to remove the vertical menu link instead?!
That's just bad site structuring.
I HATE THIS 950 POS!
I know that many people have got the 950 lifted by doing what you said, removing the spammy links, but in early discussion about the 950, there was talk about phrases.
"Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking?"
"I'd say the new Spam detection patent is a good candidate. It certainly does re-sort the results and it also has the potential to generate drastic changes for just some search phrases and not others."--Tedster
"You can get your page included in the spam table by having too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy and anchor text that might be worth an experiment. The threshold for "too many" can also be fine-tuned in this algorithm, which would create the "popping in and out of the penalty" quality that some people report."--Tedster
So it's possible multiple things could assassinate your site. And #*$! are you supposed to do if the way you write naturally triggers the 950? Re-write every frickin' page? Get somebody else to?! Look at every competitive phrase on every page and remove/change it? My site has over 1,000 pages. I could fiddle around with the text on all 1,000 pages and still be 950d. At least with anchor text you can search and replace an entire site fairly quickly.
Re: too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy...
Just one other comment. Google, to apply its phrase-based spam detection filter, needs a list of semantically related phrases. It has one, obviously, or the filter would be useless. But that list of phrases is a secret, right?
Well, not exactly. It's only partially hidden. Google reveals it at the bottom of SERPs, those sets of 'suggested' related searches. I confess when they first came out, I targeted them.
That could have resulted in Google interpreting my site to have, as Tedster put it, "too many occurrences of semantically related phrases."
I didn't go crazy by targeting 1,000 different popular search phrases, but if the threshold was quite low (under 25), it could have poisoned my site.
Has anyone here got the 950 lifted not by anchor text changes, but only by phrase changes?
"The good thing about being at the bottom of SERPs is you don't worry about being overtaken."--p/g
[edited by: tedster at 9:06 pm (utc) on Feb. 27, 2008]
I have keywords/descriptions but they're not what I would consider spammy. The description is identical to the product description located on the pages; I would think changing it to something different would be more spammy, but maybe I'm wrong.
I disagree. If you're the spam-detection code, you're looking for signs of auto-generated pages. The more elements that are the same, the more it looks auto-generated.
I'm not saying a description tag that matches a product description is always going to be a problem, but in the context of other spam-like elements, it could add up and trip the penalty.
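To make the "identical elements" idea concrete, here's a minimal sketch (my own illustration, not anything Google has published) that flags a page when its meta description is a verbatim, whitespace-normalized substring of the body copy:

```python
import re

def description_duplicates_body(description: str, body: str) -> bool:
    """Flag a page whose meta description appears verbatim in the
    body text (case- and whitespace-insensitive) -- one more
    'identical element' that could make a page look auto-generated."""
    norm = lambda s: re.sub(r"\s+", " ", s.strip().lower())
    return norm(description) in norm(body)
```

A real detector would of course weigh this alongside many other signals; this just shows how cheap the "same element" check itself is.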
On a different 950 issue...
Has anyone here successfully moved pages on a 950d site onto other sites? I've got a section of pages on one 950d site that fit well on another site that isn't 950d. Currently, because the one site is 950d, those pages are not found, so they're not helping anyone.
What do you think? If nothing else, it could be an interesting and practical experiment. Will the moved pages survive the 950 penalty, or will they 950 the second site?
The phrase-based/950 algo seems to consider the context (site), as well as the page. It also appears to consider Inbound Links. So if my second site has better IBLs, it might be alright.
Originally I'd thought of moving all the pages onto a new site, but I'm thinking of just moving a section instead. Less dramatic, maybe.
IBLs are something I'm starting to focus on. I did a test where I added an extremely low-level IBL, on an obscure page, with the two-word combo in the anchor text pointing to a page, and it jumped from being -950'd to somewhere in the 70s... not a "fix" but an improvement, if you will. I'm working on a way to start adding more legitimate and better-ranking IBLs to see if I can solve this problem once and for all.
Yahoo! I just realized one of my sites that was 950d last October has recovered for the first time and, as I speak, sits happily at SERP #3 for the "competitive phrase" (the only one that ever previously ranked--it was earlier #1). Google Analytics indicates the breakthrough came a few days ago.
Again, as with another site that had its 950 penalty lifted, any or all of the following changes may have done it.
1. Put a "Home" link to the home page at the top of the navigation menu. Previously the anchor text was the competitive/target phrase ("Red Widgets"), which was a footer on every page of the site (following "Copyright").
2. Changed the site title (home page) to "Red Widgets." It had been about five words, which included "red" and "widgets."
3. Made the H1 of each page unique (it had been the site title "----- Red ----- Widgets").
4. Removed all meta tags (Description and Keywords). They had been the same as on-page text.
5. Removed most text of each image alt-tag. (It used to match page title.)
6. Removed alt-tags from navigation thumbnails.
7. Added four paragraphs of introductory text to the home page. (It used to be almost nothing, just navigation links plus H1 title and H2 subtitle.)
8. Removed footer links to it from other sites which were 950d. (Did that months ago, but saw no immediate deliverance from 950 hell.)
The domain name of the site is red widgets dot com. (I always thought it was weird of Google to 950 a site for the search phrase Red Widgets when the domain name was red widgets dot com!)
Another site (that I don't like) fairly recently added links to just about every page on my site. I don't know if those counted as valuable Inbound Links, which some have suggested can get a site freed from the 950. (I wouldn't consider this one an authority site, so I doubt it helped much.)
The changes were made based on comments in this forum. I remember somebody suggesting adding more content can help; so can removing footer links. Another idea somebody offered was "Home" instead of the site title as the main link to the home page. I deduced, correctly or incorrectly, matching meta tags could look like auto-generated spam.
I don't know if this site will remain free from the 950, but it is the first time it's been lifted, so I suspect some of the above changes may have got it this far.
Best thing to keep in mind is this... these penalties are not manually handed out... if your site receives a penalty, it's because you triggered something for Google to give you a penalty.
I think about three of my sites all got 950d around the same time (last October), including that one. The attempt to fix it was probably about a month ago.
> Many people overlook the meta keywords and descriptions. I do believe google factors those in when deciding to give out a penalty.
Yeah, probably. The potential harm of tags far outweighs the potential benefit, IMO. I don't use Keywords anymore. I only use Descriptions if they were carefully written. If the page length is short, I prefer to omit them. I believe when your page/s are very short or low on content, Google is looking for any excuse to zap your page/site.
Around the 10th of March I really went in and adjusted (over-optimized) the keywords on my page and changed all the titles to be keyword heavy.
I also went and did a lot of inline linking within my site (which is only 6-7 pages) with various keywords which were in my H1 and title.
I also added a footer with some small text and a few links.
It seemed as though within a day I noticed pages starting to disappear from the SERPs, and within 2 days I was 950'ed for a majority of my keywords.
Like most here, if I type in widgets I'm -950, but if I type in red widgets I'm #1. If I type in blue widgets I'm -950 -- type in red blue widgets, #1.
**I also want to note that about 7-10 days ago my site was scraped, with a pretty big portion of my index page... that site now ranks exactly where my site ranked for a majority of my keywords.
I went in and changed my page pretty much back to the way I had it before the OOP and changed some of the content on the index page.
I'll update if there's any improvement.
What was up is now down and down is up. LOL No more Keyword & Description Meta Tags... No more out-bound links... no more this... no more that...
I just completely deleted hundreds of pages of content that enjoyed very high industry-related rankings since 1997... but completely tanked this past week or so when most all were removed from Google's search engine. How is it possible to go from "most revered" to "most reviled" in a week after years of stardom?
Anyway... is it worth starting completely anew with one of those "new-fangled" CMS scripts... or just chuck it completely? This behavior convinces me that one can't depend strictly on search engine traffic, as I have been fortunate to have done up until now....
But as an old-timer, I refuse to pay for web traffic. At best, I might consider buying advertisement in the local newspaper... they could use the ad revenue and supply my site with well targeted visitor traffic....
The dream is over...
In regards to the rest of it, yes, things have certainly changed since the 90s. But not everyone here is recommending abandoning all meta tags. I agree that meta descriptions are very important and help get a url out of supplemental results. Not so sure there's any relation between the description meta and the -950 penalty however.
Google reps have been pretty clear that the meta description is not directly involved in the ranking algorithm these days. It is however used for snippet creation and the kind of "quick indexing" that can consign a url to supplemental even with decent PR.
[edited by: tedster at 3:38 am (utc) on Mar. 16, 2008]
That Optimization Report is the most worthless initiative Google has, in my opinion... If it only DID provide usable information... it could be a life-saver...
I went in and changed my page pretty much back to the way I had it before OOP and changed some of the content on the index page. I'll update if there's any improvement.
Good luck. When I first read your post, I thought maybe you were doing a 950 experiment, testing a website you didn't mind losing.
I hope you get everything back. I just don't know yet after reading many pages here if it's as easy as Revert to Previous Version. I suspect the bar is higher to recover from a 950 than it is to get the penalty in the first place. But the 950 is supposedly based on an algo, so you may have instant success (after the next cache update).
But not everyone here is recommending abandoning all meta tags. I agree that meta descriptions are very important and help get a url out of supplemental results. Not so sure there's any relation between the description meta and the -950 penalty however.
If you have good page rank, great IBLs, and no sign of 950 penalties, I wouldn't suggest removing anything, including Descriptions. The Description tags, in fact, can be an important part in the front end of ecommerce, for those who use them for Advertising.
But for everyone who is dealing with 950 penalties, and keyword-stuffed their meta tags, or used a bunch of phrases, I recommend consideration of how much value you're getting for the tags. Google has said the 950 is an overoptimization penalty, and didn't say exactly which elements of optimization contribute to the 950. Therefore you have to consider everything, including tags.
Why do you have meta tags? What are you getting out of them?
I have some tags which were copied from the titles. Just copied and pasted. That was lazy web design. I put them in pages hoping to get some kind of SEO value. But I don't know if they helped. So I removed them.
I have other pages with tags that are unique and carefully written. They don't look particularly spammy. I'm keeping these for now--especially the most visited pages and landing pages (whose Descriptions are most likely to be read). Many pages that aren't landing pages really never have their Descriptions read, do they?
Some webmasters, though, obviously have to figure out which is more important, recovering from 950 Hell or avoiding Supplemental Hell.
If each element is considered, then the issue becomes which elements are considered more significant, and by how much. The algo is most likely cumulative and fractionally weighted. It doesn't, for example, consider using bold text as horrendous as keyword-stuffing.
Matt Cutts says the 950 is for "Over" Optimization, not Optimization. That means there is a limit to how much Optimization you can do on your website, a line that must not be crossed.
Let's get out of the abstract for a minute and consider a mathematical approach to understanding the 950 penalty. Because many people here consider the 950 to be based on an algorithm, not a human penalty, it's all about math. So what does that 950 algo look like?
If X1 + X2 + X3 + X4 + X5 - Y > Z Then Impose 950 Penalty
Algo checks each method webmasters historically use for Search Engine Optimization, including, but not limited to, the following:
X1 Repetition of Keywords in Anchor Text
X2 Repetition of Competitive Keywords in Anchor Text
X3 Footer Links
X4 Number of Exact Matches of Meta Tags in Tags and On Page
X5 Number of Pages with Less Than 10 Words (Content Distinct from Links and Tagged Text)
Y Number of High Quality Back Links
Some of the factors appear to be more significant than others. The weighting of the 950 algo could look like this:
If 100X1 + 200X2 + 50X3 + 10X4 + 500X5 - Y > Z Then Impose 950 Penalty.
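As a toy illustration only (the factor names, weights, and threshold Z below are all invented for this thread's hypothetical, not Google's actual algorithm), the weighted-threshold idea might be sketched like this:

```python
# Hypothetical weights for the X1..X5 factors listed above.
# Every number here is made up for illustration.
WEIGHTS = {
    "anchor_keyword_repeats": 100,       # X1
    "competitive_anchor_repeats": 200,   # X2
    "footer_links": 50,                  # X3
    "meta_tag_exact_matches": 10,        # X4
    "thin_pages": 500,                   # X5
}
Z = 10_000  # invented penalty threshold

def is_950_penalized(factors: dict, quality_backlinks: int) -> bool:
    """True if the weighted spam score, minus the backlink credit Y,
    exceeds the threshold Z -- the 'If ... > Z' rule above."""
    score = sum(WEIGHTS[name] * count for name, count in factors.items())
    return score - quality_backlinks > Z
```

Note how the backlink term Y works as a buffer: the same on-page score can fall either side of Z depending on how many high-quality IBLs offset it.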
What other elements would you add to the algo, and how would you weight them?
You wouldn't. That isn't how it works. The history of this penalty is one where two pages can have the exact same linking and page construction style, but the one about "Poison Phrase" gets a 950 while the one about "Sweet No Problem Phrase" isn't hurt in any way. There is definitely not a generic equation that can be applied equally to all pages.
It is however used for snippet creation and the kind of "quick indexing" that can consign a url to supplemental even with decent PR.
There is definitely not a generic equation that can be applied equally to all pages.
Absolutely right. If you can manage to get through the patents, you may notice that the process of phrase calculations is mathematically quite savvy. It automatically adjusts the penalty thresholds according to common practices in the "sector" of the web being analyzed. You may have noticed, in practice, that you can get away with extremes in one type of market that would quickly get you nailed in another.
Also, don't forget the presence of related (or naturally co-occurring) phrases. The problem may not be the actual keywords that seem to be poison, but the fact that either too few or too many related phrases also are involved with the page. This is the core of phrase-based analysis. Over-the-top attempts at SEO often make one of these two errors:
1. focus too much on using just the exact target phrase
2. throw in every related set of words and concepts that can be discovered
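The two failure modes can be sketched as a toy check (the related-phrase list and both thresholds are invented for illustration; the patent's actual phrase math is far more elaborate):

```python
# Invented list of phrases "related" to a target phrase.
RELATED = {
    "red widgets": ["widget store", "buy widgets", "widget reviews"],
}

def phrase_check(text: str, target: str) -> str:
    """Classify text against the two over-optimization errors:
    too few related phrases alongside a heavily repeated target,
    or too many related phrases stuffed in."""
    text = text.lower()
    target_hits = text.count(target)
    related_hits = sum(text.count(p) for p in RELATED[target])
    if target_hits > 5 and related_hits == 0:
        return "error 1: exact phrase only"
    if related_hits > 10:
        return "error 2: related phrases stuffed"
    return "looks natural"
```

The point of the sketch is the shape of the test: it's the *ratio* of target to related phrases that gets flagged at either extreme, not any one phrase count on its own.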
The position remained -950 for both sites. Now it's clear to me a reconsideration request won't help with the -950.
Took about two days after making the changes I said I made in my last post.
Also important to note -- one of the scrapers that replaced me in all the positions in the SERPs after I went to -950 changed the content that he stole from my page today, and that seemed to bounce him out and put me back to where I was.
"According to an [sic] Google engineer I spoke with ... Everything on the page can have a negative effect on your ranking, even metatags you create yourself. I was thinking of testing this, but I never got around to doing it."
This is a paraphrase, for all intents and purposes, of the Google Spam patent Tedster posted previously. The interesting thing is we can tend to think of meta tags as innocuous. If Google doesn't take them into account to boost the page's rank, surely it can't do the opposite, i.e., use the tags for lower ranking or even penalties? But that's not a logical argument.
Why? Well, for one, webmasters choose tags for optimization, not just for the hell of it, and Google's engineer (Cutts) has already confirmed, as we noted before, the 950 Penalty is for overoptimization.
It's not being suggested that tags are more important or weighted more heavily than other known 950 issues (anchor text stuffing, etc.). I'm just trying to burst the bubble of anyone who has a false sense of security about their tags.
Incidentally, I think there's a myth that's been perpetuated about Google not weighting meta tag Descriptions, and one of Google's own employees helped perpetuate it at Webmaster World. It's the idea that Google doesn't use Descriptions for ranking.
I'm quite sure that I had a page rank based on text that was nowhere in the page content and only in the Description. But you can do your own tests and get the results fairly quickly. I'm not saying they count for a lot, but more than nothing.
One webmaster put it well when he suggested the following distinction. Google may use meta tag Descriptions for indexing, but not ranking.
added in edit: googlebot has been downloading the entire site every day, the addition of the tag-free versions to cache seems a bit slow this time. I have also been churning the content a bit to stimulate the bot.
[edited by: OnlyToday at 2:03 pm (utc) on Mar. 22, 2008]
I hate to say it, but it could actually make sense. Google ranks pages/sites based (slightly, at least) on geography. For example, I have one site that has great Inbound Links from Australia, and it's virtually at the top of the SERPs via Google.com.au.
The 950 Penalty takes into account IBLs, from what I've read here. Some have suggested new IBLs from authority sites can get the 950 lifted.
If Google has decided to fuse its local search algo into the 950 algo, you'd expect some sites to be 950d for some country-specific Google searches, but not necessarily all others.
It's also possible Google rolls out new rankings with the 950 lifted in foreign countries and then eventually all of them, incl. google.com, so the 950 is lifted everywhere.
P.S. Just finished removing all dupe content meta tags from one 950d site to see if it will lift the 950. Hoping for a universal resurrection!
What happened is I was working on a site last week to break the 950 penalty and forgot about it. I've broken the penalty before with sites. I was going at a very slow throttle with the changes. Apparently the site returned very much as you mentioned, except in the US, UK, and Australia. A few keywords did recover in the US, UK, and Australia, but not enough to get excited about. The changes were actually astounding in many countries outside of that. It showed me that Google is using a number of filters for the US. The point, though, was not to change much of the optimization. Other sites also seemed to have the penalty lifted in these countries.
I want what the rest of the world has.