falsepositive - 7:00 pm on Mar 3, 2011 (gmt 0)
The problem I have is whether I'm barking up the wrong tree. I continue to see scrapers publishing 100% of my content and appearing ahead of my work in the search results as I publish new content. I've already shut down my RSS feed, and it's still happening.
The problem is that there are potentially many factors in question, but it's hard to focus on which ones to address, given that the biggest one, the duplication and syndication of my content, can't seem to be stopped.
Without these red herrings, it would be great to step back and assess the true issues of a site. At this point, I don't even know whether small scraper sites appearing ahead of all our content is an intended feature of this algo or a BUG.
So is the existence of scrapers a side effect that they expected and just didn't care about, or is this a mistake? I'd like to know.