Penguin 1.0 obviously took a very strong stance against anchor text over-optimization, requiring us to spend more time on anchor text diversity and to weight more heavily toward brand-based terms (URL and company name variants) as anchor text.
With the onset of Penguin 2.0, how has this changed?
My first thought is that Penguin 2.0 may focus more heavily on leveraging disavow data and similar signals to go after webmasters for hosting unnatural links ON their websites. That in turn would result in a mass devaluation of links from "untrusted" domains across the web. This seems to be more of a play to put fear into webmasters about selling or placing links, damaging the link market itself.
I also think Penguin 2.0 may apply more stringent thresholds for anchor text diversity and do more to identify links from advertorials, but this second round of Penguin seems focused more on tackling the link market at its source than on identifying link buyers.
I'm not stating any of this as fact, just some ideas. I'd like to hear other thoughts on the topic and how this is impacting your link development strategies.
[edited by: phranque at 6:09 pm (utc) on May 24, 2013] [edit reason] edited title & typo (Panda -> Penguin) [/edit]
disclosure..I don't build links..I launch a new site with one or two links ( sometimes a few more, but never more than 10, and never all at once ) from somewhere(s)* else that I own ..and then wait for the other links to happen..
Works just fine..( all search engines ) since 1998..:)
* apologies to Lucy ..tacky grammar.. I know..I can't remember how to say that in English :(
My link development strategy has not changed at all. When the update was announced, I looked at what Matt Cutts said. Basically, I was scared that guest posting was going to be hit hard, but after thinking about it, I realized that if you're building value and unique content within your link development, you should be fine.
to go after webmasters for having unnatural links ON their website.
Unnatural links ON a site (instead of TO a site) is a focus worth investigating in light of Sprint getting slapped [seroundtable.com] for hosting spam links on its own forums. That action was closely preceded by a similar penalty against Mozilla [seroundtable.com], and back in March there was a penalty against a BBC News page (called a granular penalty). Could those have been test runs for web-wide penalties against sites hosting spammy links ON their pages?