Future - 11:32 am on Aug 9, 2010 (gmt 0)
I thought I was the only one thinking about this.
Google seems to be getting original source attribution wrong more often than before, ranking the scraped or mashed-up URL and filtering out the original. Here's my theory:
Our sites' content is being copied wholesale onto blogspot and wordpress subdomains. After we report them, those copies are removed, but we cannot control this automatically.
The only workaround we found was to disable the RSS feed, but scrapers still copy/paste entire pages, which is hurting our site very badly.
tedster, is there any way we can keep watch on the guests continuously harvesting our website?
Is there some tool that can track this?
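While waiting for suggestions, one low-tech approach is to watch the server access logs for IPs making unusually many requests. This is only a minimal sketch, assuming Apache-style combined log lines; the sample log text, the `flag_heavy_clients` function, and the threshold are all hypothetical, and a real setup would read the live log file and use a rate window rather than a raw count.

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache combined log format;
# in practice you would read your actual access log file.
SAMPLE_LOG = """\
203.0.113.5 - - [09/Aug/2010:11:00:01 +0000] "GET /page1 HTTP/1.1" 200 5120 "-" "ScraperBot/1.0"
203.0.113.5 - - [09/Aug/2010:11:00:02 +0000] "GET /page2 HTTP/1.1" 200 4200 "-" "ScraperBot/1.0"
203.0.113.5 - - [09/Aug/2010:11:00:03 +0000] "GET /page3 HTTP/1.1" 200 3900 "-" "ScraperBot/1.0"
198.51.100.7 - - [09/Aug/2010:11:05:00 +0000] "GET /page1 HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
"""

# Capture the client IP at the start of each combined-format line.
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+)[^"]*" (\d{3})')

def flag_heavy_clients(log_text, threshold=3):
    """Count requests per IP and return the IPs at or above the threshold."""
    hits = Counter()
    for line in log_text.splitlines():
        m = LINE_RE.match(line)
        if m:
            hits[m.group(1)] += 1
    return {ip: count for ip, count in hits.items() if count >= threshold}

print(flag_heavy_clients(SAMPLE_LOG))  # {'203.0.113.5': 3}
```

Once a suspicious IP shows up repeatedly, you could verify it manually and then block it at the server level.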