Many people think they are being proxy hijacked when they aren't. Lately I've been getting a lot of email asking for help, and it always turns out to be some other SEO trick, not proxy hijacking.
What is a Proxy Hijacking?
You find your content in the Google SERPs, but the link points to a proxy's domain rather than your own site.
When you click that link, your web pages are displayed with the proxy URL in the browser's address bar, or the proxy redirects to your site. Either way, the proxy has technically hijacked your content, since its domain claims ownership instead of yours.
Current State of Proxy Hijacking
I'm still seeing Googlebot crawl via proxy servers, but the resulting proxy-hijacked pages stopped showing up a few months ago. I have code on my server that sends different results when Google crawls via a proxy, which makes those pages easy to spot in the Google SERPs, but these results are no longer being found.
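incrediBILL doesn't post his server-side code, but the standard way to tell a real Googlebot request from a proxy relaying Googlebot's user-agent is a reverse-then-forward DNS check: the requesting IP should reverse-resolve to a googlebot.com or google.com hostname, and that hostname should resolve back to the same IP. Here's a minimal Python sketch of that idea (the function name and the injectable lookup parameters are my own illustration, not his actual implementation):

```python
import socket


def is_verified_googlebot(ip, reverse_lookup=None, forward_lookup=None):
    """Return True only if `ip` reverse-resolves to a Google crawler
    hostname AND that hostname forward-resolves back to the same IP.

    The lookup callables default to real DNS queries but can be
    injected, e.g. for testing without network access.
    """
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or socket.gethostbyname

    try:
        host = reverse_lookup(ip)
    except OSError:
        return False  # no PTR record -> not Googlebot

    # A proxy's IP won't reverse-resolve into Google's crawler domains.
    if not host.endswith((".googlebot.com", ".google.com")):
        return False

    try:
        # Forward-confirm so a spoofed PTR record can't pass the check.
        return forward_lookup(host) == ip
    except OSError:
        return False
```

A request claiming to be Googlebot that fails this check is exactly the "Googlebot via proxy" case described above, and can be served different content or blocked outright.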
Additionally, I checked Google for a few proxies I've been tracking, along with some new ones. Some are no longer indexed in Google at all, and others show only 10 pages indexed in total instead of the thousands of pages returned many months ago.
Proxy Hijacking Summary
Therefore, based on all of the evidence presented, I'm concluding that Google proxy hijacking may be a thing of the past and is no longer a clear and present danger to webmasters.
I think this is no longer a problem, and I welcome anyone who believes they're still seeing proxy hijacking to sticky me so I can review the situation in detail.
Unless I see some compelling stickies, I think this problem is solved, case closed.
Moderator note: As incrediBILL is a moderator, we are making an exception to the Google Search forum prohibition against requests for stickies in this case.
[edited by: Robert_Charlton at 11:44 pm (utc) on April 23, 2008]
I've been a stalwart on the issue of Google Proxy Hijacking for quite some time including lobbying for methods to avoid the problem...
That is a modest understatement, to say the least.
Thanks, incrediBILL, for your constant pursuit of a solution to this most disturbing problem.
I vote for a sticky here.
I find this a particular problem because I often block a directory with robots.txt for a few days or weeks, and only open it up to search engines once any bugs are fixed. However, a proxy server may have gotten there first, and Google still seems to treat the proxy domain as the original source of the content (based on indexing date, I assume) rather than mine.
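For reference, the temporary block described above is just a standard Disallow rule; the directory name here is a made-up example, not the poster's actual path:

```
# hypothetical robots.txt while /newsection/ is still being debugged
User-agent: *
Disallow: /newsection/
```

The catch, as the comment points out, is that robots.txt only stops well-behaved crawlers. A proxy fetches pages on behalf of its users and ignores it entirely, so the proxy's copy can end up indexed before the original ever is.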
Has anyone else observed this?
Thank you very much for your efforts and for sharing your findings with us all the time, incrediBILL. Does it also apply to this apparently new form of hijacking, which we recently discussed at length in this thread [webmasterworld.com]?