backdraft7 - 3:35 am on Apr 29, 2013 (gmt 0)
There's no reason I've ever read from Google why a site that's been ranking well for years (or at least months) should suddenly get bumped by a newcomer site running a scraped version of it.
And you never will.
I'd really love to hear some of those "powerful reasons" for hiding the domain registrant.
The only thing I will agree with you on, ted, is that Google (or any SE, for that matter) is not allocating ANY resources to identifying real content authors. It's one of the biggest reasons the SERPs are such a mess today. Entire site histories exist on the WBM; I'm sure G collects similar data. The data is there, and it's generated every single day. Let's not underestimate the power of the system and make excuses for their lack of concern.
At the very least, private domain name registrations should be looked at very carefully and included as a metric when determining quality. I could write an algorithm in 5 minutes that would severely curb scraper proliferation and save a LOT of honest businesses. Failure of SEs to do so is no different from them throwing up a libelous billboard on a highway that destroys an honest business. They must be made aware of the harm their inaction causes.
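To make the idea concrete, here's a minimal sketch of what that "5-minute algorithm" could look like: when two pages carry near-duplicate content, lean toward the page the index saw first, and treat a private WHOIS registration on the much newer domain as an extra scraper signal. Every field name, weight, and threshold below is my own illustrative assumption, not any search engine's actual logic.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Page:
    domain: str
    first_crawled: date         # when the engine first saw this content
    private_registration: bool  # is the WHOIS registrant hidden?

def demotion_score(page: Page, duplicate_of: Page) -> float:
    """Score in [0, 1]: how strongly to demote `page` as a likely
    scraper of `duplicate_of`. Weights are purely illustrative."""
    score = 0.0
    age_gap = (page.first_crawled - duplicate_of.first_crawled).days
    if age_gap > 30:                   # content existed elsewhere long before this page
        score += 0.6
    if page.private_registration:      # hidden registrant, as suggested above
        score += 0.4
    return min(score, 1.0)

# The classic pattern from this thread: an established site vs. a
# privately registered newcomer that appeared with the same content.
original = Page("honest-business.example", date(2012, 1, 15), False)
scraper  = Page("newcomer.example",        date(2013, 4, 1),  True)
print(demotion_score(scraper, original))  # → 1.0
```

Note the asymmetry: scoring the established site against the newcomer yields 0.0, because its content predates the duplicate and its registrant is public. That's the whole point of using crawl history plus registration transparency together rather than either signal alone.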
Not to wax melodramatic, but: if I see an accident where someone is dying on the side of the road and I don't stop to try to give aid, that would be wrong, criminally wrong in fact. The sentiment I'm getting here is that they shouldn't stop to help (or improve their detection of legit authors) because it's simply none of their concern. That's a very inhuman and sad point of view. But certainly not a surprising one.