It might be helpful to gather in one place ideas, experiences and testimonials concerning the revision of non-performing sites [based on links and content alone].
To kick things off:
[things that might help, but not yet tested]:
- overuse of the meta title to cover too many aspects
- repetitive use of the same word in internal navigation and linking
- removal of non-site external links from within a site (or adding nofollow)
- removal of site-family inbound links [even if the content differs]
Can anyone here be more specific about aspects of de-optimisation that are worth considering, or that have served them (or sites they've observed) well?
Also look at high and unnatural keyword densities.
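For anyone who wants to eyeball densities programmatically, here's a rough Python sketch. The `keyword_density` helper and its use of simple word counts are my own invention for illustration - not any metric Google publishes:

```python
import re
from collections import Counter

def keyword_density(text, top_n=5):
    """Return the top_n words by share of total words, as (word, percent).

    A crude proxy for 'density': share of all words on the page.
    """
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    counts = Counter(words)
    return [(w, round(100.0 * c / total, 1)) for w, c in counts.most_common(top_n)]
```

Run it over your page copy (with navigation stripped out) and anything with an eye-wateringly high percentage is worth a second look. What counts as "too high" is guesswork, so treat the numbers as a prompt to read the page, not a verdict.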
You've highlighted more than one issue:
1. Removal of over-optimisation: keyword stuffing, etc.
2a. Checking all your outgoing links (at least every couple of months), and removing decayed or broken links, and nofollowing where you are unsure of link safety.
2b. Removing overlinking from neighbours. I'd add remove or review non-related reciprocals, and review all reciprocals.
3. Check internal navigation.
If you have a sick site - one that fails to rank as well as expected, despite best efforts - then attention to those points (which are YOUR points!), will usually make a difference.
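For point 2a, the outgoing-link check can be partly automated. Here's a standard-library sketch; the status-to-action mapping in `classify` is my own rough heuristic for the "remove decayed links, review where unsure" advice above, not an official rule:

```python
import urllib.request
from urllib.error import HTTPError, URLError

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Surface 3xx status codes instead of silently following them."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

OPENER = urllib.request.build_opener(NoRedirect)

def link_status(url, timeout=10):
    """HEAD-request a URL; return its HTTP status, or None on network failure."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "link-check/0.1"})
    try:
        return OPENER.open(req, timeout=timeout).status
    except HTTPError as e:
        return e.code          # includes 3xx (redirects) and 4xx/5xx
    except URLError:
        return None            # DNS failure, refused connection, timeout

def classify(status):
    """My own rough mapping from status code to the actions discussed above."""
    if status is None or status >= 400:
        return "remove"        # decayed or broken: drop the link
    if 300 <= status < 400:
        return "review"        # target moved: check where it points now
    return "keep"
```

Feed `link_status` your list of outgoing URLs every couple of months and act on the `classify` result. A link that still resolves can of course point somewhere nasty, so nofollowing anything you're unsure of still needs a human decision.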
One key point I'd add to that: lose identical titles and meta descriptions, and individualize them for each page.
And check that you don't have multiple URLs for each page (a major cms issue).
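Spotting shared titles across a site can also be scripted. A minimal sketch with the standard-library parser, assuming you've already fetched the pages; `find_duplicate_titles` is a name I've made up for this example (meta descriptions could be collected the same way inside `handle_starttag`):

```python
from html.parser import HTMLParser
from collections import defaultdict

class TitleParser(HTMLParser):
    """Collect the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def find_duplicate_titles(pages):
    """pages: dict of url -> html. Return titles shared by more than one page."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        p = TitleParser()
        p.feed(html)
        by_title[p.title.strip()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}
```

Any title that comes back attached to two or more URLs is a candidate for individualizing - and if the same page content shows up under several URLs, that's the multiple-URL CMS issue mentioned above.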
Also, how safe is it to experiment with rewriting/deoptimizing content, and making title and link changes and so on? In other words, if a person deoptimizes a page to an even lower position, will changing everything back cause the page to rise again?
That all parts of the site can be reached quickly and easily, and there are no loops to trap SEs (and visitors).
That everything you intended to remove has been removed.
That you've reviewed your redirects, and they still work, if appropriate.
That you remembered to upload your new images (that's where I go wrong, every time!), and server side includes. And updated css files.
(this is where I get totally out of my depth .... )
I'd avoid the term 'deoptimisation', which is a little vague and may be misleading.
crush - what do you advocate [in your annoyance :)]? Do nothing and be "lucky" where situations seem to have a chance of self-remedy? Methinks if the situation has gone on for a long time it's best to try something.
In June 2007 my ecommerce site started losing google traffic after the major update at that time. The site had been steadily rising in position and traffic for four years and had become one of the top six sites in its niche (generating nearly £2m in sales), purely from lots of hard work optimising the site and a few good quality links. We'd never made big changes to the site in one hit, just gradual minor refinements/additions. I assumed we'd just tripped a filter somewhere.
Finding the cause was quite tricky though. I could see some product pages were no longer being found for terms that exactly matched phrases on the page and in the title (the product name), but we started getting traffic to those pages for other terms. So we lost traffic but gained some as well. The overall result, however, was a slight but noticeable loss. It looked to me like an over-optimisation issue, as google clearly didn't want the site to have traffic for terms it thought we were targeting, no matter how obscure.
In the months that followed I experimented with various de-optimisation approaches with no conclusive outcome. Meanwhile our traffic seemed to be constant but about 25% below our norm for the time of year. Then in mid-October 2007 things took a major turn for the worse and google traffic started disappearing at a rate that increased every day. Our main category pages were vanishing from searches by the day and eventually after a week our homepage went for our main term (although the site could still be found for the website name).
It was now clear I had a major problem to fix and not much to lose. I asked around and a few people mentioned some internal links I had in the footer to gift guide pages (21st birthday ideas and so on). I removed these links from the footer at the end of October 2007 and the next day, when google had a new cache of my homepage, it was reinstated to its previous position. As each day went by the pages returned and within one week our traffic had returned to full strength. But it didn't stop there: our traffic continued growing and eventually peaked at 20% more than we'd ever seen for a December (our peak business time - phew!).
The interesting thing is I was warned to remove the footer links two years before by an associate whose site had also been hit and recovered once the footer links were removed. So if the filter was in place two years before why hadn't we been affected before? It was like our site was okay until it hit a certain level of success, then came under closer scrutiny.
Anyway, at the start of 2008 I hired an SEO to see what other issues might be lurking. I found someone with a great reputation among people in the industry who I trust and who had produced excellent results in much more competitive niches than mine. He picked up on the following potential issues:-
1) Links from two other sites that I own to our ecommerce site
2) Duplicate content on the two other sites
3) Server location (was Canada, now US) - it's a UK site
4) Need for more quality links
He felt point 2 was not an issue for the ecommerce site (but may be for the other two sites), and point 3 may have some bearing on ranking but wasn't a major issue. Point 4 is always something to work on so that left point 1.
We run two affiliate sites (one focused on the same niche as the ecommerce site and one right across the board), but both offer price comparison between our products and those of our competitors (as an affiliate so we get commission if they buy from the competition). They deep link to product pages on our ecommerce site or that of our featured competitor.
I know linking between your own sites is theoretically a big no-no but the ecommerce site doesn't link back to the affiliate sites and the only difference between the competitor sites that they link to and our own ecommerce site is the Whois information. The sites are located on separate servers in different IP ranges. And these sites do generate visitors and sales for our ecommerce site so the links have a real and genuine purpose.
The SEO couldn't say for certain whether this interlinking would ever harm the ecommerce site or was harming it already. He didn't feel sure enough to recommend that I remove the links and suspected that it may well be helping while the affiliate sites had enough trust. He felt uneasy making a recommendation one way or the other.
Consequently we decided to leave the issues we weren't sure about and focus on improving the quality of the links to the site to build trust, move us up the rankings and hopefully make the site more resilient.
We chose some keywords and pages that had been around page 5 in the results, aiming to get them onto page 1, but within a couple of months we found the targeted pages were falling for the exact keywords we were targeting (while every other page remained unchanged). Eventually the targeted pages fell to page 10 or lower, so we decided to call a halt to the link work. Google seemed to be resisting our efforts!
Over the course of 2008 this site has dropped slightly in google and the Alexa rank is also on a gradual slide. At its peak in June 2007 it was ranked around 200,000; it's now over 400,000.
At the moment there is nothing to clearly suggest google has a problem with the site as a whole, and the only suspicious sign is one category page that has bounced in and out of the results all year like it's on the edge of a filter. A lot of the products on that page have similar names, so there is a lot of keyword repetition, but no more than some other pages that seem to be 'liked' by google.
Since November our traffic has been consistently 25% below what it was in 2007 but the economy and increased competition over the course of the year may be to blame for that, not necessarily a problem with google.
Sorry for the long post but I thought the history of how we solved our problem last year may help others, and maybe someone can help me figure whether we have a problem now and, if so, what we should do about it!
A guy we work with got his site -50'd because he was associated with us via DNS and IPs. We do the hosting for him. He spent 40,000 euros on content and many years making the site.
I told him to give it 90 days; if it's not back by then, it's like pining for an ex-girlfriend. Move the content to a new domain and start again. But it's the damn limbo state that kills your business. He has had to sack a load of people and still doesn't know if all his hard work on content, site development and link building will ever come back.
Google smashed him with a hammer, and it would be nice to know if another blow to the head is coming - are they going to leave you bleeding to death on the floor, or give you a hand back up and make friends again?
We had about 30-40 links to internal pages in our footer which effectively added the same keywords to every single page of the site. I later found out I probably could have left them there if I changed/rearranged them across the site - it seems it's okay to have them in the footer as long as you don't repeat exactly the same thing on every page.
I considered reintroducing them, varying them across the site, but I think there is still an element of risk which I'm not prepared to take with this site.
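If you want to check whether a footer block really is repeated verbatim sitewide (the situation described above), one quick approach is to intersect each page's footer links. This sketch assumes you can already extract (anchor text, href) pairs per page, and the threshold of 10 is an arbitrary guess on my part, not a known Google limit:

```python
def sitewide_footer_block(pages_links, threshold=10):
    """pages_links: one set of (anchor_text, href) pairs per page.

    Return the links common to every page, plus a flag for whether the
    shared block is big enough to look like sitewide boilerplate.
    """
    common = set.intersection(*pages_links)
    return common, len(common) >= threshold
```

If the flag comes back True, varying or trimming the repeated block across page types is the kind of change discussed in this thread - though as said above, whether it's worth the risk is a judgement call.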
On another page, Widget Model No. 198, the same anchor text, but the links take you to pages regarding Model No. 198, instead of No. 11. And I'm not talking about 30 or 40 links, more like 5 or 6.
Any thoughts on this? I have some pages gray barred and not listed as having internal links in Webmaster Tools, while other similar pages seem to have no problem. I've been fooling with this for months trying to fix the gray barred pages, with no success. Thanks in advance for any thoughts.
I understand that footer links could potentially be abused and thus bring a penalty, but it is still crazy. What next?
We didn't get grey barred though, ours looked like a -whatever filter/penalty and as soon as the footer links were gone our pages began returning to page 1 as soon as they had been re-spidered.
Your problem may be down to content. If you have a number of pages with very similar content which is only differentiated by size or colour, then Google may regard those as duplicates. If your content is unique across the web, you're okay. If not, you'll see the grey bar.
Duplicate content isn't the only reason for a grey bar, but in my experience it's the main one.
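One rough way to gauge whether two of your own pages are near-duplicates is shingle overlap - Jaccard similarity of k-word windows. This is a generic technique for comparing texts, not how Google actually measures duplication, so treat the score only as a hint:

```python
def jaccard_shingles(a, b, k=5):
    """Similarity of two texts as Jaccard overlap of k-word shingles.

    1.0 means identical shingle sets; 0.0 means nothing in common.
    """
    def shingles(text):
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)
```

Pages that score very high against each other (say, size/colour variants sharing all their copy) are the ones worth rewriting or consolidating.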
wrkalot - the problem was that the footer contained a large block of text which was identical sitewide. They just happened to be links.