falsepositive - 7:49 pm on Mar 15, 2011 (gmt 0)
Content_ed, from what you've described and what I'm seeing in some cases, it may be that scrapers outranking a site are a symptom of its weakness, not the cause of it. I go back and forth on this, but I have seen some evidence that strong sites with scrapers are not affected. So I'm inclined to think that if we focus on making our sites stronger, that "weight" will lift enough to dislodge the scrapers ranking ahead of us.
That said, I found out my site had accidental hidden text. Wow. My tech guy, who is also my spouse, left some hidden text in my headers by accident. It wasn't spammy and is rather innocuous, but who knows -- every little thing seems to count these days. I am now paranoid about what else is lurking under the hood, and I guess the best thing is to do a full-on site audit for these things!
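For anyone else who wants to audit for this, here's a minimal sketch of the kind of scan I mean, assuming Python with requests and BeautifulSoup. It only catches inline styles (display:none, visibility:hidden, off-screen text-indent), not rules buried in external stylesheets, and the URL is just a placeholder for your own page.

```python
# Minimal sketch: flag text hidden via inline CSS on a page.
# Assumptions: only inline style attributes are checked; hidden
# text set by external stylesheets or JavaScript needs a separate
# pass. The URL below is a placeholder, not a real site.
import re
import requests
from bs4 import BeautifulSoup

HIDDEN_PATTERNS = re.compile(
    r"display\s*:\s*none|visibility\s*:\s*hidden|text-indent\s*:\s*-\d+",
    re.IGNORECASE,
)

def find_hidden_text(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    hits = []
    # find_all accepts a compiled regex as an attribute filter,
    # so this matches any tag whose style attribute hides content
    for tag in soup.find_all(style=HIDDEN_PATTERNS):
        text = tag.get_text(strip=True)
        if text:  # only flag elements that actually contain text
            hits.append((tag.name, text[:80]))
    return hits

if __name__ == "__main__":
    for name, snippet in find_hidden_text("http://www.example.com/"):
        print(f"<{name}> hides: {snippet!r}")
```

Running it over a sitemap's worth of URLs would have caught my header text in minutes instead of a year.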
I'm surprised, though, that if this were so bad (i.e. against guidelines), I wasn't delisted a while back. So I'm not sure how much of a factor it is, since I've had this stuff for about a year now. My site is also authoritative, with a strong backlink profile, old, etc., so could that be enough to avoid a stronger penalty? Or is it that certain hidden text isn't considered too bad (if it looks accidental, doesn't use strong keyword phrasing, etc.)?