I don't know if bhartzer is watching this thread, but his comments vis-a-vis his tests with anchor text would be valuable. I tend to think that laser-targeted anchor text is the signal being ignored, as it also fits with the recent announcement about over-optimisation.
Yes, my ecommerce site has been hit by some sort of over-optimisation penalty since 19th Feb. All the search phrases that have been targeted with anchor text fell from page 1 to page 2 and below.
I've just run a report on the number of external domains (not total link counts) that point to my site using each keyword as anchor text. The distribution is:-
KW11 2.39% **Domain Name**
The thing is that all of these keywords (except for the domain name) have been demoted dramatically, even KW17, where only 1.33% of external domains used that particular anchor text. Long-tail searches are still appearing on page 1.
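For anyone wanting to replicate this kind of report, here's a rough sketch of the calculation (the tuple layout and sample data are my own assumption, not taken from any particular backlink tool): count unique referring domains per anchor phrase, then work out each phrase's share of all referring domains.

```python
from collections import defaultdict

def anchor_distribution(backlinks):
    """backlinks: iterable of (referring_domain, anchor_text) tuples.

    Returns {anchor: percentage of unique referring domains using it}.
    Counts domains, not raw link totals, so 50 links from one blog
    still count as one domain.
    """
    domains_per_anchor = defaultdict(set)
    all_domains = set()
    for domain, anchor in backlinks:
        anchor = anchor.strip().lower()
        domains_per_anchor[anchor].add(domain)
        all_domains.add(domain)
    total = len(all_domains) or 1
    return {a: round(100.0 * len(d) / total, 2)
            for a, d in domains_per_anchor.items()}

# Hypothetical sample data, just to show the shape of the output.
links = [
    ("example-blog.com", "blue widgets"),
    ("another-site.net", "blue widgets"),
    ("forum.example.org", "click here"),
    ("news.example.co.uk", "mysite.com"),
]
for anchor, pct in sorted(anchor_distribution(links).items(),
                          key=lambda kv: -kv[1]):
    print(f"{anchor}: {pct}% of referring domains")
```

Run that over a full backlink export and the skew towards one or two money keywords jumps out straight away.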
I can understand why KW1 would raise an OOP flag, but it seems that once a site has overstepped the mark with one keyword, the whole site gets marked as over-optimised.
Can anyone recommend which direction I should take now:-
1. Leave it and wait to see if it recovers.
2. Try to dilute the high-percentage anchor text links with lots of natural 'click here' / 'domain URL' type anchors.
3. Start work on a new site, creating lots of great content, and hope to gain links naturally.
4. Forget about developing my own ecommerce site and sell my products on eBay and Amazon.
FWIW, I am seeing the following elements as potential issues right now:
1. Blog networks / WHOIS / ownership data, and evaluation of template-based websites with contextual outbound links to non-authority sites.
2. OOP potential for sites with a lower number of inbound brand / company links and www links. This seems to be based on a number of factors, including the specific vertical.
3. The total number of questionable higher-PageRank links (even if relevant) that some websites have pointing at them.
4. The total recent rate of link building.
Number four seems to be a flag that allows Google to instantly spot who is building links aggressively, and then algorithmically evaluate those links, though it's difficult to say what the exact system / approach is.
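As a thought experiment (this is my illustration, not Google's actual method), recent link velocity is trivial to measure from any dated backlink export: bucket new referring domains by the month they were first seen, and a sudden spike stands out immediately. That's exactly the kind of signal that's cheap to compute at scale.

```python
from collections import Counter
from datetime import date

def monthly_link_velocity(first_seen_dates):
    """first_seen_dates: iterable of date objects, one per newly
    discovered referring domain.

    Returns {(year, month): count of new domains}, so an aggressive
    burst of link building is easy to eyeball.
    """
    return Counter((d.year, d.month) for d in first_seen_dates)

# Hypothetical example: a steady trickle, then an aggressive burst.
dates = ([date(2012, 1, 5)] * 3 +
         [date(2012, 2, 10)] * 4 +
         [date(2012, 3, 2)] * 40)   # 40 new domains in one month
velocity = monthly_link_velocity(dates)
print(velocity)
```

A natural link profile would show something closer to the January/February numbers month after month; the March figure is the kind of anomaly a rate-based flag would catch.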
|I can understand why KW1 would raise an OOP flag, but it seems that once a site has overstepped the mark with one keyword, the whole site gets marked as over-optimised.|
We're seeing this in a very competitive sector.
SERPs for our KW1 dropped end of Feb.
Clear OOP on anchor text - based on lots of bad links (via previous SEO of course!)
But then KW2 & KW3 (with more natural anchor text) took a hit in the rankings.
This would seem to back up your theory, wokka.
|2. OOP potential for sites with a lower number of inbound brand / company links and www links. |
This would seem to be the case and ties in with my example above.
|4. The total recent rate of link building. |
Definitely not seeing this.
I'm seeing the same thing. It's not just "anchor text being devalued", because websites with natural, high-quality, but overly repetitive anchor text links (guest posts, widgets, high-quality press releases, etc.) are still ranking high.
Websites that manipulated rankings through low-quality, high-PR, keyword-rich anchor text are all getting hit. It seems to me that the site's recent link acquisition rate is really what tripped something.
This has always happened in the past, but it seems like Google is emphasising this even more than it used to. I can't help but think this is part of their efforts to prevent overly-SEO'd sites from dominating the rankings.
Query "search engine" without the quotes and this should tell you all you need to know about how they are dealing with anchor text.
As a side note, it also shows a generic example of how they are failing to return the most important/fresh/relevant pages these days largely because of recent changes to the algo (specifically regarding links).