Forum Moderators: Robert Charlton & goodroi
If scraped copies outrank you, it could be a sign that your site has been penalized by either Panda or Penguin.
It takes some coordinated effort for a proxy to outrank you.
Probably the reason that Google products are being used in this case isn't that they are favored by Google... it's that they are free.
I don't want a future user-agent name change to allow the client's site to be copied again.
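The user-agent concern above can be made concrete. A minimal .htaccess sketch, assuming Apache with mod_rewrite enabled, and assuming the proxy identifies itself with a distinctive User-Agent token — "ExampleProxy" below is a placeholder, not a real Google agent name; substitute whatever string actually appears in your access logs:

```apache
# Refuse any request whose User-Agent contains the proxy's token.
# "ExampleProxy" is a hypothetical placeholder agent string.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ExampleProxy [NC]
RewriteRule .* - [F,L]
```

Note that this protection holds only while the agent string stays stable, which is exactly why a silent user-agent rename would reopen the hole.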
At a time when Google is driving home the importance of creating quality, compelling content, it's astonishing that they have allowed so many webmasters to be victimized by Google's own service.
But this problem should never have existed in the first place.
It seems clear at this point that Google basically views the web first as a collection of data or content, and only secondarily as a collection of intellectual property.
But it's grown beyond that, and lowered the quality of their SERPs.
If scraped copies outrank you, it could be a sign that your site has been penalized by either Panda or Penguin
And where are Penguin and Panda to penalize the scraper sites, with their thin content and black-hat webspam?
It is disturbing that a Google-owned proxy is allowed to cache another's content.
Injecting code is far different from securing a service with a noindex tag that protects the greater good of the internet community. I see no legal issue with Google restricting their services to lawful use by preventing their proxies from being indexed and cached. That is the responsible thing to do.
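The noindex approach suggested above can also be expressed as an HTTP response header rather than page markup: a proxy service could attach an X-Robots-Tag header to every proxied response, keeping those pages out of the index and out of the cache. A minimal sketch, assuming Apache with mod_headers enabled on the proxy:

```apache
# Instruct crawlers neither to index nor to cache proxied pages.
Header set X-Robots-Tag "noindex, noarchive"
```

The header form has the advantage of working for any content type, not just HTML documents where a meta robots tag could be inserted.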