Accessing my site from China shows a 404 error. (That was meant to prevent copying, which clearly failed in this case, but at least they are not copying the newer content uploaded recently.)
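For anyone curious how that kind of country block works: the server looks up the visitor's IP in a GeoIP database and serves a 404 instead of the page. A minimal sketch of the idea, assuming a stand-in `country_for_ip` lookup (a real site would use something like MaxMind's GeoIP database or a server-level module rather than a hard-coded table):

```python
# Sketch of country-based blocking. `country_for_ip` is a hypothetical
# stand-in; in practice you'd query a GeoIP database (e.g. MaxMind).

BLOCKED_COUNTRIES = {"CN"}  # ISO 3166 country codes to block

def country_for_ip(ip):
    """Hypothetical lookup table for illustration only."""
    demo = {"192.0.2.1": "CN", "203.0.113.7": "US"}  # RFC 5737 test IPs
    return demo.get(ip, "??")

def status_for_request(ip):
    """HTTP status to serve for a request from this IP."""
    if country_for_ip(ip) in BLOCKED_COUNTRIES:
        # Serving 404 rather than 403 hides the fact that a block exists,
        # which matches the behavior described above.
        return 404
    return 200
```

As the poster found, this only stops casual visitors; a scraper can trivially route around it with a proxy outside the blocked country.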
Do you guys think this is the reason why Google favors the scraper site and treats mine as duplicate content?
I've speculated elsewhere (including our Updates thread [webmasterworld.com]) that since the Mayday update, Google has been getting original source attribution wrong more often than before, ranking the scraped or mashed-up URL and filtering the original. My theory? It comes from Mayday giving good rankings to "sites" they feel are more popular, and therefore better overall destinations for the search user. The emphasis used to be more on the "page" than on the "site".
Does anyone know how long Google takes to respond to these kinds of things?
What program or sites are people using to discover if their content has been scraped / copied?
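Services such as Copyscape exist for exactly this, but a do-it-yourself check is also possible: pull a few distinctive long sentences from your page and search for them as exact-match (quoted) queries. A small sketch of that approach, with the phrase-length thresholds being arbitrary choices of mine:

```python
import re

def distinctive_phrases(text, min_words=8, max_phrases=3):
    """Pick the longest sentences from a page to use as fingerprint phrases.

    Longer, unusual sentences are less likely to appear on unrelated
    pages, so an exact match is strong evidence of copying.
    """
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    long_enough = [s for s in sentences if len(s.split()) >= min_words]
    return sorted(long_enough, key=len, reverse=True)[:max_phrases]

def quoted_query(phrase):
    """Exact-match query string to paste into a search engine."""
    return '"%s"' % phrase
```

Running each quoted query and looking for results on domains other than your own will surface most verbatim scrapes, though it won't catch rewritten or "spun" copies.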
It's a tough problem, I'll give them that, but it used to be handled better than it is now. Actually, it's a pitifully simple problem, but Google is just too dumb or too stubborn to fix it properly.
However, what you have to understand is that Google don't care.