
De-optimisation hints to restore sites from filters and penalties



4:02 am on Dec 13, 2008 (gmt 0)

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

Having read a number of threads, I saw many achieving success following "de-optimisation" of sites, involving content and links.

It might be helpful to have in one place ideas, experiences and testimonials concerning the revision of non-performing sites [based on links and content alone].

To kick things off:

[ things that might help - but not yet tested ]:

- overuse of the meta title to cover too many aspects
- repetitive use of the same word in internal navigation and linking
- removal of non-site external links from within a site (or adding nofollow)
- removal of site-family inbound links [even if different content]
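The second item above - repetitive anchor text in internal navigation - is something you can check mechanically. A minimal sketch using only the standard library; the HTML snippet and anchor texts are invented for illustration:

```python
# Count repeated anchor text in a page's links - one rough signal of
# the "repetitive use of the same word in internal linking" issue.
from collections import Counter
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collects the visible text of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self._in_a = False
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_a = True
            self.anchors.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_a = False

    def handle_data(self, data):
        if self._in_a:
            self.anchors[-1] += data.strip()

# Made-up navigation block, not from any real site.
html = """
<a href="/widgets">blue widgets</a>
<a href="/widgets/cheap">blue widgets</a>
<a href="/widgets/uk">blue widgets</a>
<a href="/about">about us</a>
"""
parser = AnchorCollector()
parser.feed(html)
counts = Counter(parser.anchors)
print(counts.most_common(1))  # the most-repeated anchor text
```

If one phrase dominates the internal anchor text, that is the kind of pattern worth varying or trimming.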

Can anyone here be more specific about aspects of de-optimisation that they are either considering, or that have served them (or sites they have observed) well?


6:12 pm on Dec 13, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

The first items in your list are the most important, I'd say.


6:57 pm on Dec 13, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I would also reverse any sitewide internal navigation changes (particularly keyword-rich internal links, especially to the index page) that were implemented before the filter/penalty was applied - this sorted the problem for us on a large site.

Also look at high and unnatural keyword densities.
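A rough sketch of how one might measure keyword density on a block of page copy. Note that any "too high" threshold is guesswork - search engines have never published one - and the sample copy below is invented:

```python
# Quick-and-dirty keyword density: each word's share of total words, as a %.
from collections import Counter
import re

def keyword_density(text: str) -> dict:
    """Return {word: percentage of total words}, case-insensitive."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    return {w: round(100 * c / total, 1) for w, c in Counter(words).items()}

# Deliberately stuffed example copy (13 words, 'widgets' four times).
copy = ("Cheap widgets, quality widgets, widgets delivered fast. "
        "Buy widgets from the widget experts.")
density = keyword_density(copy)
print(density["widgets"])  # ~30.8 - far beyond anything natural prose produces
```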


7:37 pm on Dec 13, 2008 (gmt 0)

WebmasterWorld Senior Member quadrille is a WebmasterWorld Top Contributor of All Time 10+ Year Member

I'd avoid the term 'deoptimisation', which is a little vague and may be misleading.

You've highlighted more than one issue:

1. Removal of over-optimisation - keyword stuffing, etc.

2a. Checking all your outgoing links (at least every couple of months), and removing decayed or broken links, and nofollowing where you are unsure of link safety.

2b. Removing overlinking from neighbours. I'd add: remove or review non-related reciprocals, and review all reciprocals.

3. Check internal navigation.

If you have a sick site - one that fails to rank as well as expected, despite best efforts - then attention to those points (which are YOUR points!), will usually make a difference.

One key point I'd add to that is losing identical titles and meta descriptions; individualize them for each page.

And check that you don't have multiple URLs for each page (a major CMS issue).
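The multiple-URLs-per-page problem can be spotted by normalising URLs before comparing them - trailing slashes, letter case, and session/tracking parameters are the usual culprits. A sketch, where the tracking parameter names and example URLs are assumptions for illustration:

```python
# Normalise URLs so that variants of the same page compare equal.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical session/tracking parameters to ignore when comparing.
TRACKING = {"sessionid", "sid", "utm_source", "utm_medium"}

def normalise(url: str) -> str:
    p = urlparse(url.lower())
    # Drop tracking params, sort the rest for a stable ordering.
    query = urlencode(sorted((k, v) for k, v in parse_qsl(p.query)
                             if k not in TRACKING))
    # Strip the trailing slash (but keep "/" for the root).
    return urlunparse((p.scheme, p.netloc, p.path.rstrip("/") or "/",
                       "", query, ""))

urls = [
    "http://example.com/widgets/",
    "http://example.com/widgets?sessionid=abc123",
    "http://example.com/Widgets",
]
print(len({normalise(u) for u in urls}))  # all three collapse to one page
```

Run something like this over a crawl of your own site; any bucket with more than one distinct raw URL is a duplicate-URL candidate.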


7:47 pm on Dec 13, 2008 (gmt 0)

10+ Year Member

If a person makes one of these changes, how long should they wait before either changing it back or making additional changes?

Also, how safe is it to experiment with rewriting/deoptimizing content, and making title and link changes and so on? In other words, if a person deoptimizes a page to an even lower position, will changing everything back cause the page to rise again?


5:54 am on Dec 14, 2008 (gmt 0)

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

Check internal navigation.

What would you be checking for?

[edited by: tedster at 9:36 am (utc) on Dec. 14, 2008]
[edit reason] member request [/edit]


12:45 pm on Dec 14, 2008 (gmt 0)

WebmasterWorld Senior Member quadrille is a WebmasterWorld Top Contributor of All Time 10+ Year Member

That it works, mainly. :)

That all parts of the site can be reached quickly and easily, and there are no loops to trap SEs (and visitors).

That everything you intended to remove has been removed.

That you've reviewed your redirects, and they still work, if appropriate.

That you remembered to upload your new images (that's where I go wrong, every time!) and your server-side includes. And updated CSS files.

If you use a CMS, then there's another level of checking to get the best out of the site. And JavaScript. Probably.
(this is where I get totally out of my depth .... )


5:23 pm on Dec 14, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I'd avoid the term 'deoptimisation', which is a little vague and may be misleading.

Thank you for saying that Quadrille -- I'd hate to think we've all spent 6 years tuning up our sites, and now should reverse the process. Reminds me too much of "dig a hole, then fill it up"...



6:06 pm on Dec 14, 2008 (gmt 0)

WebmasterWorld Senior Member quadrille is a WebmasterWorld Top Contributor of All Time 10+ Year Member

SEO annoys me too - I've never optimized a search engine in my life ... I much prefer 'site optimization', and I'm even moving toward 'page optimization', lest there be confusion!


9:20 pm on Dec 14, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

You know what annoys me? You get filtered, get a time penalty, then do nothing to your site and finally, 60 days later, get reincluded if you are lucky. What the #*$! were you filtered for in the first place? :)


11:04 pm on Dec 14, 2008 (gmt 0)

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

proboscis - interesting questions; I wonder if anyone can lend some insight from their experiences - shame to let this good question slip by.

crush - what do you advocate [in your annoyance :)]? Do nothing and be "lucky", where situations seem to have a chance of self-remedy? Methinks if the situation has gone on for a long time it's best to try something.


8:37 am on Dec 15, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Yes Whitey, if you are not back in 90 days, try something, then describe what you have done and add it to your reconsideration request.


3:20 pm on Dec 15, 2008 (gmt 0)

5+ Year Member

In 2007 I solved an issue with one of my sites which could be classed as deoptimisation or overcoming the 950 filter. But I may have another issue now!

In June 2007 my ecommerce site started losing google traffic after the major update at that time. The site had been steadily rising in position and traffic for four years and had become one of the top six sites in its niche (generating nearly 2m in sales), purely from lots of hard work optimising the site and a few good-quality links. We'd never made big changes to the site in one hit, just gradual minor refinements/additions. I assumed we'd just tripped a filter somewhere.

Finding the cause was quite tricky though. I could see some product pages were no longer being found for terms that exactly matched phrases on the page and in the title (the product name), but we started getting traffic to those pages for other terms. So we lost traffic but gained some as well. The overall result, however, was a slight but noticeable loss. It looked to me like an over-optimisation issue, as google clearly didn't want the site to have traffic for terms it thought we were targeting, no matter how obscure.

In the months that followed I experimented with various de-optimisation approaches with no conclusive outcome. Meanwhile our traffic seemed to be constant but about 25% below our norm for the time of year. Then in mid-October 2007 things took a major turn for the worse and google traffic started disappearing at a rate that increased every day. Our main category pages were vanishing from searches by the day and eventually after a week our homepage went for our main term (although the site could still be found for the website name).

It was now clear I had a major problem to fix and not much to lose. I asked around, and a few people mentioned some internal links I had in the footer to gift guide pages (21st birthday ideas and so on). I removed these links from the footer at the end of October 2007, and the next day, when google had a new cache of my homepage, it was reinstated to its previous position. As each day went by the pages returned, and within one week our traffic had returned to full strength. But it didn't stop there; our traffic continued growing and eventually peaked at 20% more than we'd ever seen for a December (our peak business time - phew!).

The interesting thing is I was warned to remove the footer links two years before by an associate whose site had also been hit and recovered once the footer links were removed. So if the filter was in place two years before why hadn't we been affected before? It was like our site was okay until it hit a certain level of success, then came under closer scrutiny.

Anyway, at the start of 2008 I hired an SEO to see what other issues might be lurking. I found someone with a great reputation among people in the industry who I trust and who had produced excellent results in much more competitive niches than mine. He picked up on the following potential issues:-
1) Links from two other sites that I own to our ecommerce site
2) Duplicate content on the two other sites
3) Server location (was Canada, now US) - it's a UK site
4) Need for more quality links

He felt point 2 was not an issue for the ecommerce site (but may be for the other two sites), and point 3 may have some bearing on ranking but wasn't a major issue. Point 4 is always something to work on so that left point 1.

We run two affiliate sites (one focused on the same niche as the ecommerce site and one right across the board), but both offer price comparison between our products and those of our competitors (as an affiliate so we get commission if they buy from the competition). They deep link to product pages on our ecommerce site or that of our featured competitor.

I know linking between your own sites is theoretically a big no-no but the ecommerce site doesn't link back to the affiliate sites and the only difference between the competitor sites that they link to and our own ecommerce site is the Whois information. The sites are located on separate servers in different IP ranges. And these sites do generate visitors and sales for our ecommerce site so the links have a real and genuine purpose.

The SEO couldn't say for certain whether this interlinking would ever harm the ecommerce site or was harming it already. He didn't feel sure enough to recommend that I remove the links, and suspected they may well be helping while the affiliate sites had enough trust. He felt uneasy making a recommendation one way or the other.

Consequently we decided to leave the issues we weren't sure about and focus on improving the quality of the links to the site to build trust, move us up the rankings and hopefully make the site more resilient.

We chose some keywords and pages that had been around page 5 in the results, aiming to get them onto page 1, but within a couple of months we found the targeted pages were falling for the exact keywords we were targeting (while every other page remained unchanged). Eventually the targeted pages fell to page 10 or lower, so we decided to call a halt to the link work. Google seemed to be resisting our efforts!

Over the course of 2008 this site has dropped slightly in google, and the Alexa rank is also on a gradual slide. At its peak in June 2007 it was ranked around 200,000; it is now over 400,000.

At the moment there is nothing to clearly suggest google has a problem with the site as a whole, and the only suspicious sign is one category page that has bounced in and out of the results all year, like it's on the edge of a filter. A lot of the products on that page have similar names, so there is a lot of keyword repetition, but no more than on some other pages that seem to be 'liked' by google.

Since November our traffic has been consistently 25% below what it was in 2007 but the economy and increased competition over the course of the year may be to blame for that, not necessarily a problem with google.

Sorry for the long post but I thought the history of how we solved our problem last year may help others, and maybe someone can help me figure whether we have a problem now and, if so, what we should do about it!


4:20 pm on Dec 16, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Would be good if Google could give you some hint. I mean, the best hint they give is when you get a grey bar and no cache; I would never pray for a site like that to come back.

A guy we work with got his site -50'd because he was associated with us on DNS and IPs. We do the hosting for him. He spent 40,000 euros on content and many years making the site.

I told him to give it 90 days; if it is not back then, it is like pining for an ex-girlfriend. Move the content to a new domain and start again. But it is the damn limbo state that kills your business. He has had to sack a load of people and still does not know if all his hard work on content, site development and link building will ever come back.

Google smashed him with a hammer, and it would be nice to know if another blow to the head is coming. Are they going to leave you there bleeding to death on the floor, or give you a hand back up and make friends again?


6:10 pm on Apr 8, 2009 (gmt 0)

10+ Year Member

claaarky - can you expand on what exactly was the issue with the footer links? You mentioned that removing the footer links helped to solve that problem... but what was the issue with the footer links that may have caused the penalty/problem?


8:16 am on Apr 9, 2009 (gmt 0)

5+ Year Member

teenwolf - the problem was what Google calls boilerplate repetition.

We had about 30-40 links to internal pages in our footer, which effectively added the same keywords to every single page of the site. I later found out I probably could have left them there if I had changed/rearranged them across the site - it seems it's okay to have links in the footer as long as you don't repeat exactly the same thing on every page.

I considered reintroducing them, varying them across the site, but I think there is still an element of risk which I'm not prepared to take with this site.
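To make the boilerplate-repetition idea concrete: a crude check on your own site is to fingerprint the footer text of a sample of pages and see whether it is byte-identical everywhere. The page paths and footer text below are invented for illustration:

```python
# Flag boilerplate repetition: identical large text blocks (e.g. a
# keyword-heavy footer) fingerprinted across sampled pages.
import hashlib

def footer_fingerprint(footer_text: str) -> str:
    """Hash the footer text so identical blocks compare cheaply."""
    return hashlib.sha1(footer_text.encode()).hexdigest()

# Imagined footer blocks extracted from two product pages.
pages = {
    "/widgets/red":  "21st birthday ideas | retirement gifts | wedding gifts",
    "/widgets/blue": "21st birthday ideas | retirement gifts | wedding gifts",
}
fingerprints = {footer_fingerprint(text) for text in pages.values()}
if len(fingerprints) == 1:
    print("identical footer on every sampled page - boilerplate risk")
```

One distinct fingerprint across the whole sample means every page carries the same block, which is exactly the pattern claaarky describes removing.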


1:18 pm on Apr 9, 2009 (gmt 0)

10+ Year Member

claaarky: Could you expand a bit on footer links? What if you have footer links with the same anchor text, but they go to different pages? For instance, on a Widget Model No. 11 page, you could have footer links for Colors, Sizes, Blueprint, etc., using "Colors," "Sizes," and "Blueprint" for anchor text.

On another page, Widget Model No. 198, the same anchor text, but the links take you to pages regarding Model No. 198, instead of No. 11. And I'm not talking about 30 or 40 links, more like 5 or 6.

Any thoughts on this? I have some pages gray-barred and not listed as having internal links in Webmaster Tools, while other similar pages seem to have no problem. I've been fooling with this for months trying to fix the gray-barred pages, with no success. Thanks in advance for any thoughts.


1:45 pm on Apr 9, 2009 (gmt 0)

10+ Year Member

claaarky - thanks for the explanation. It's crazy that google could be penalizing what could simply be good, useful navigation.

I understand that footer links could be potentially abused and thus that would bring a penalty, but it is still crazy. what next?


2:10 pm on Apr 9, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

What if the footer links are just basic site info?

home page ¦ privacy policy ¦ about site ¦ contact site


2:20 pm on Apr 9, 2009 (gmt 0)

10+ Year Member

Wouldn't rel=nofollow have worked instead of removing the links completely?

About 30 days ago I added it to all of my header and footer links that went to aboutus.html type pages. I have seen no effect yet... just wondering.
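For reference, "nofollowing" a link just means adding rel="nofollow" to the anchor tag. A naive string-based sketch of that transformation (a real site would change its templates rather than rewrite HTML like this, and this regex deliberately skips tags that already carry a rel attribute):

```python
# Illustrative only: add rel="nofollow" to anchor tags that lack a rel attribute.
import re

def add_nofollow(html: str) -> str:
    # Negative lookahead: skip <a> tags that already have a rel= attribute.
    return re.sub(r'<a (?![^>]*\brel=)', '<a rel="nofollow" ', html)

link = '<a href="/aboutus.html">About us</a>'
print(add_nofollow(link))
# -> <a rel="nofollow" href="/aboutus.html">About us</a>
```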


2:30 pm on Apr 9, 2009 (gmt 0)

5+ Year Member

AndyA - I think the problem is not the number of links or what the anchor text is; it's having the same footer links sitewide that puts you in peril - it's an easy way to stuff keywords into every page of a site, so obviously it's something Google is not going to reward. If I were Google I would ignore Home ¦ Privacy ¦ etc. and look for signs you were trying to stuff keywords related to the topic of your site.

We didn't get grey-barred though; ours looked like a -whatever filter/penalty, and once the footer links were gone our pages began returning to page 1 as soon as they had been re-spidered.

Your problem may be down to content. If you have a number of pages with very similar content, differentiated only by size or colour, then Google may regard those as duplicates. If your content is unique across the web you're okay. If not, you'll see the grey bar.

Duplicate content isn't the only reason for a grey bar, but in my experience it's the main one.

wrkalot - the problem was that the footer contained a large block of text which was identical sitewide. They just happened to be links.


2:50 pm on Apr 9, 2009 (gmt 0)

WebmasterWorld Senior Member ken_b is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

Did I miss this?

Only do one (1) thing at a time.

And then give it time to work (probably weeks or months, not days).

Otherwise you'll never know what worked, or didn't work.

