turbocharged - 3:12 pm on Jun 16, 2013 (gmt 0)
It really would be nice if Google did give some general feedback in GWT when your site experiences a ranking drop due to technical issues, bad links, or a content quality downgrade.
This would never fly. It would give black hats some ability to reverse engineer the algorithm to fine tune their ranking efforts.
On the other hand, it would force the spammers to produce quality content in order to rank.
First we must request that Google respect an original author's work and rights. At present it does not matter who authored the "quality content"; preference is given to where it appears and to the quality signals it has. If stolen content sits on a blogspot, wordpress, tumblr, etc. domain, it may very well rank above the original author. It's not that Google lacks the information to detect this, but it allocates very little (if any) computing power to it.

An example would be an entire page copied word for word two years after it was first posted on the original author's own website. The length of time indexed should be enough to tell Google which version to rank - but it's not. One of the DMCA notices we sent out this week was for this exact problem: a two-year-old page was copied, posted on blogspot, and ranking above the original author's website. The copied version was heavily linked to from a Google Plus account, which also appeared to be auto-generated.

The client has a completely natural backlink profile, and most natural backlink profiles for small specialized businesses tend to be weak. This makes them easy targets.
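The first-indexed heuristic described above can be sketched in a few lines. This is a minimal illustration with hypothetical field names and a made-up tie-break rule, not Google's actual duplicate-detection logic:

```python
from datetime import date

def pick_original(copies):
    """Given several indexed copies of the same content as
    (url, first_indexed_date) pairs, return the URL that was
    indexed earliest - the presumed original author's version."""
    return min(copies, key=lambda c: c[1])[0]

# Hypothetical example mirroring the DMCA case described above:
# a scraped copy indexed two years after the original page.
copies = [
    ("https://scraper.blogspot.example/post", date(2013, 5, 1)),
    ("https://original-author.example/page", date(2011, 4, 20)),
]
print(pick_original(copies))  # -> https://original-author.example/page
```

Of course, first-seen date alone is not a safe signal on its own (pre-publication scraping, re-crawled redesigns, and syndication all muddy it), which may be part of why it is not weighted more heavily.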