fathom - 2:20 pm on Aug 14, 2013 (gmt 0)
I completely agree here. As a small webmaster, it is painful to take time out of my schedule and go through the painstaking process of identifying scrapers, contacting them individually to take the content down, following up with them, and finally, when they don't respond, reporting them to Google with the disavow tool.
The disavow tool is about disavowing links to YOUR PAGES of content, not disavowing pages of content established on other domains.
You'll be happy to know that "scraper sites" don't have a lot of trust signals with which to devalue your pages, and if they happen to have more than you, you have bigger problems... e.g. your content isn't all that original and likely wasn't the first version Google had in its archive.
How difficult would it be for Google to develop a standard, then push it to popular publishing platforms, that allows content creators to ping Google with the content they have created, so that if anyone scrapes them, Google knows it's a duplicate?
I understand fat pings have been around for some time, but Google has neither approved nor denied them.
With the amount of tools available, it doesn't take a smart person more than a couple of hours to rip apart and clone an 8- to 10-year-old website.
Just because Google doesn't publicly acknowledge a term you are familiar with [like fat pings] doesn't mean that PANDA, through its massive amount of developmental changes over 2 years via updates #1 through #26, doesn't incorporate the ideals of the same thing.
They just won't go into too much detail, so that anti-PANDA countermeasures can't exploit the newest vulnerabilities.
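For anyone unfamiliar with the term: a fat ping is a publish-time notification that carries the full content payload rather than a bare URL, so the receiving hub gets a timestamped copy of the original. Below is a minimal sketch in Python of what a publishing platform might send; the hub endpoint and payload fields are assumptions for illustration only, since Google never published an official fat-ping spec.

    import json
    import urllib.request
    from datetime import datetime, timezone

    # Hypothetical hub endpoint -- Google never published an official one,
    # so this URL is an assumption for illustration only.
    HUB_ENDPOINT = "https://hub.example.com/fat-ping"

    def send_fat_ping(page_url, title, body_html):
        """POST the full content (a 'fat ping') at publish time so the
        hub holds a timestamped record of the original version."""
        payload = json.dumps({
            "url": page_url,
            "title": title,
            "content": body_html,
            "published": datetime.now(timezone.utc).isoformat(),
        }).encode("utf-8")
        req = urllib.request.Request(
            HUB_ENDPOINT,
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status  # 2xx means the hub accepted the ping

    # A publishing platform would call this once per newly published post.
    status = send_fat_ping(
        "https://example.com/my-article",
        "My Original Article",
        "<p>Full article HTML...</p>",
    )

The value is in the timestamp: whichever copy the hub received first is presumed to be the original, which is exactly the duplicate-detection signal being asked for above.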