| 6:58 pm on Apr 25, 2011 (gmt 0)|
Build new links with zero optimization.
If before you were using:
build links to:
Use at least 10 different anchor text variations.
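The diversity advice above can be sanity-checked with a quick script. This is only an illustrative sketch: the `anchor_distribution` and `overused` helpers and the 30% threshold are my own assumptions for demonstration, not anything Google has published.

```python
import collections

def anchor_distribution(anchors):
    """Return each anchor text's share of the total link profile."""
    counts = collections.Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.items()}

def overused(anchors, threshold=0.3):
    """Anchors whose share exceeds a hypothetical safety threshold."""
    return [a for a, share in anchor_distribution(anchors).items()
            if share > threshold]
```

For example, a profile of seven "blue widget" links plus three varied ones would flag "blue widget" as dominating the profile.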
| 7:11 pm on Apr 25, 2011 (gmt 0)|
thank you for your reply.
Have you had this issue yourself and if so how long did it take to remedy?
| 7:32 pm on Apr 25, 2011 (gmt 0)|
How do you go about building links? Given that "freely given editorial citations" are what Google values the most, this is a key issue.
| 12:51 am on Apr 26, 2011 (gmt 0)|
|Have you had this issue yourself and if so how long did it take to remedy? |
Yes, I always try to make sure that people don't just link to the title of my page. I just send a quick email, and people are usually willing to change it if I offer them some value in return, like pointing out a broken link on their site or a grammar error.
I've found it takes a few months to come back from over-optimization.
| 1:15 am on Apr 26, 2011 (gmt 0)|
I had that problem and eventually fixed it. But I'm not sure whether it was about over-optimization or paid links. Rankings dropped for one major keyword - from #8 to #28. I tried to remove or change the anchor text of backlinks from other websites, but to no avail. Only after I removed two paid links that had no editorial context did my site get back its original #8 spot.
| 9:15 am on Apr 26, 2011 (gmt 0)|
I have experimented a lot with OOP and find it is quite simple to get out of in comparison to the other penalties.
My first step would be to remove 95% of your keyword from the target page. If you're targeting "blue widget", remove most repetitions and take it out of the H1, title, etc. You can still use "widget" and "blue", just not the main term.
I would also go through all backlinks that contain this anchor text and create variations - "widget tools here", for example.
I would also build links to that page that have no relevance at all to the term.
| 10:03 am on Apr 26, 2011 (gmt 0)|
Thanks for the reply. My only issue with de-optimising the on-page factors is that you potentially harm keyword variations that aren't affected by the OOP. It also makes your listing in the SERPs less relevant to the user's query, which could decrease the CTR for related phrases we still rank for.
I'm not saying I doubt your findings, I'm just not sure I want to mess with the on-page stuff.
How long have you typically found it to take to recover the filtered terms?
| 11:22 pm on Apr 26, 2011 (gmt 0)|
I had one site I got out of that type of penalty by looking at 'where' the backlinks were from. Many were well beyond the niche's neighborhood and quite a few were not in the adjacent related niches. The person assigned the linking task only used 2 anchor text keyword variations and never the URL.
Only after removing ALL of the links that were well beyond the niche's neighborhood (totally unrelated), plus most of the spammy-looking 'adjacent neighborhood' links, did the site bounce back.
The person doing this linking never triggered an unnatural link acquisition notice in WMT either, so I didn't get wind of it until the customer commented on a 'grey bar' PR Toolbar rating. It took quite a few weeks for the site to return to PR3 from a 'grey bar' PR0.
| 3:24 pm on Apr 27, 2011 (gmt 0)|
|OK, so an over optimization filter was triggered by excessive use of keyword rich anchor text in our backlinks. |
Out of curiosity, how do you know for sure it's an over-optimization filter? Was this confirmed in any way, or have you just been told it?
I suffered, and still suffer slightly six months on, from a very similar issue where just a few keywords don't rank well. Mine was self-inflicted by a coding error. I was told a number of possible causes, including the one you're describing, but after months of trying to fix something that wasn't there, I found the true issue.
| 6:03 am on Apr 28, 2011 (gmt 0)|
In your Google Webmaster Tools, can you still see data under Links To Your Site? Has it decreased or been completely wiped out?
| 7:26 am on Apr 28, 2011 (gmt 0)|
I was subject to an OOP once on an affiliate site due to a mix of low quality repetitive anchor text links and thin content that was over-optimised. The penalty kicked in when the site designer replaced a bunch of straplines (non-keyword headers) with images for the text. I guess the percentage of certain words together with the links was too high.
De-optimising the page worked. Then, as others have said, we got a bunch more links with other non-related terms in them.
I haven't worked with those sort of sites or links for some years now, but I've heard fairly recently from an ex-colleague who still does that Google still works the same way (N.B. this is pre-Panda).
If changes to the page weren't the straw that broke the camel's back, then de-optimising that might not help, but it's the first thing I'd try. If it's links that are the problem and they are low-quality, you might find that Google takes time to notice them - so you might have over-stepped the mark several months before you see the penalty, and you'll have to wait a similar length of time for Google to assimilate a new link graph and restore your rankings.
| 11:35 am on Apr 28, 2011 (gmt 0)|
If you add too much salt to your soup, the only fix is to add more soup.
More links with increased diversity.
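The soup metaphor can be put into rough numbers. Here is a hypothetical sketch: the `target_share` value is invented for illustration (nobody outside Google knows the real tipping point), but given some assumed ceiling for the exact-match anchor's share, you can estimate how many differently-anchored links it would take to dilute the profile below it.

```python
def links_needed(exact_match, total, target_share=0.2):
    """Count new, differently-anchored links required until the
    exact-match anchor's share no longer exceeds target_share.
    target_share is a made-up illustrative threshold."""
    new = 0
    while exact_match / (total + new) > target_share:
        new += 1
    return new
```

So a profile with 70 exact-match anchors out of 100 links would need 250 fresh, varied links to bring the exact-match share down to 20% - which is why several posters above report recovery taking months.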
| 3:18 pm on Apr 28, 2011 (gmt 0)|
Hi - Google Webmaster Tools is fine, and the site isn't penalised - it's just filtered for a set of specific keywords. Variations of the filtered keywords rank fine.
I'm pretty sure it is too much repetitive anchor text, as the post above said, we added too much salt - so we are now adding more soup!