turbocharged - 1:10 am on Apr 26, 2013 (gmt 0)
If scraped copies outrank you, it could be a sign that your site has been penalized by either Panda or Penguin.
We just changed the content and pinged Google immediately after posting, yet the Appspot proxy copy was cached first. Searching for a snippet of the new page's text, the client site appears only in Google's omitted results, and its cache still shows the old page. I'm not sure why, but what Google displays for the client page in the SERPs does not match the cache. The Appspot proxy now returns a 405 response when requesting any page on the client's site.
For a proxy to outrank you requires some coordinated effort.
I completely agree. To substantiate this claim, the app's name, which is passed to the log files, specifically references the client's company name in a derogatory manner on a fair number of these Appspot proxy URLs. There is no doubt in my mind that the creation of this duplicate content is intentional and malicious. None of the Appspot proxy URLs I checked have any backlinks.
Probably the reason that Google products are being used in this case isn't that they are favored by Google... it's that they are free.
Indeed, free is appealing to the unethical, and it leaves few tracks for victims to trace. In researching Appspot and Blogspot problems, I found black hatters claiming they can rank Blogspot blogs with just a few link "blasts." They appear split between Blogspot and Tumblr as to which is easier to rank with spam. I don't know where Appspot falls in this mix, but from the complaints I have seen from others with similar problems, I can only assume that anything residing on the Appspot domain ranks quite strongly without any backlinking.
When I get time, I will have to learn more about Appspot and how proxies are created on it. If nothing else, I want to understand what could possibly justify serving an external page without appending a noindex tag. People have been complaining about Appspot proxy hijacking for years; it's a security vulnerability that has harmed many webmasters and could harm many more. Google could and should append a noindex meta tag, and the problem would be solved.
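For anyone unfamiliar with what I mean by appending noindex: the fix is trivial on the proxy side. A proxy serving someone else's content would only need to inject one line into the pages it relays (or the equivalent response header), and Google's own crawler would drop the copies from the index. A sketch of both forms:

```html
<!-- Option 1: meta tag injected into the proxied page's <head> -->
<!-- tells search engine crawlers not to index this copy -->
<meta name="robots" content="noindex">

<!-- Option 2 (equivalent): sent as an HTTP response header instead,
     which also covers non-HTML content like PDFs and images:
     X-Robots-Tag: noindex -->
```

Either one would prevent these duplicate proxy copies from ever appearing in the SERPs, which is exactly why it's frustrating that the platform doesn't enforce it.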
The client rejected our request to bring a 100% dedicated SEO on board at this point. :( Because our hourly rate is cheaper, he expects the same level of service and care for half the price. While I don't mind the work, it takes us away from the design aspects of this project and will notably delay the tasks we were originally hired to do.