
Have Google fixed proxy hijacks? My results are clean!

   
8:53 am on Oct 10, 2007 (gmt 0)

10+ Year Member



Hi,

This morning, while looking to see if there were any extra proxy hijack pages in Google's index for my site, I got a very pleasant shock.

Zero
Nada
Zilch

I've done nothing to ban them, and I couldn't get lookups running on my server, so it's not anything I've done.

Has anyone else noticed that the proxy page totals have disappeared? Could this be down to the authority patent being discussed in the October SERPs changes thread?

[webmasterworld.com...]

Anyone else seeing this?

Vimes.

4:52 pm on Oct 10, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



That is indeed very sweet news. I am not seeing any proxy sites either right now - but there have been many sites affected, so I hope we hear from more people. If Google has indeed fixed this issue, it can only be a good thing for them and for webmasters.
4:55 pm on Oct 10, 2007 (gmt 0)

10+ Year Member



what is the best way to check for proxy hijacks?
5:13 pm on Oct 10, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



You see them when you check a search where your site normally ranks but now there's a proxy server url in your place. So I'd just check your rankings for your best search terms - something many webmasters have as part of their routine.

You can also try inurl:example.com where example.com is your domain. Many, but not all, proxy servers will include your domain in the url somewhere.

5:44 pm on Oct 10, 2007 (gmt 0)

WebmasterWorld Senior Member jomaxx is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Or do a search for a long exact phrase taken from your website. This is always a troubling exercise, because you invariably find multiple matches.
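
If you want to run both checks - tedster's inurl: search above and this exact-phrase search - on a regular schedule, here's a rough sketch that just prints the query urls to paste into a browser. example.com and the phrase are placeholders for your own domain and text, and the -site: exclusion (to keep your own pages out of the results) is my own addition:

from urllib.parse import quote_plus

DOMAIN = "example.com"  # placeholder - your own domain
PHRASE = "a long exact phrase copied from one of your pages"  # placeholder

# the inurl: check, minus your own site's results
queries = ["inurl:%s -site:%s" % (DOMAIN, DOMAIN),
           '"%s" -site:%s' % (PHRASE, DOMAIN)]  # the exact-phrase check

for q in queries:
    print("http://www.google.com/search?q=" + quote_plus(q))
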
5:56 pm on Oct 10, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



There's a difference between Google merely indexing a proxy url and having that url take over or hijack your rankings. If your rankings stay yours, that's all I care about. Otherwise it's like trying to keep every grain of sand out of a beach house.

Still - I'd love to hear other reports. Has anyone seen their previously hijacked positions come back to them?

3:35 am on Oct 11, 2007 (gmt 0)

10+ Year Member



I hope this is the end of it. I've seen a huge increase in traffic this week, which at first I thought was the monthly WMT link update. After seeing this, I'm really looking forward to the next few months.

Losing the 'supplemental' status on these landing pages is just what I needed. I read both of the patents last night, and from what I'm surmising, the authoritative-document patent is addressing these proxy issues.

Anyone have any other ideas?

Vimes.

6:24 am on Oct 11, 2007 (gmt 0)

10+ Year Member



>> Tedster:

This could be related to what I saw on our sites.

It's just that the pages previously hijacked have been removed from the Google index and can't be found at all - at least for now.

The pages that recently disappeared from our sites were 2-3 years old, with significant links both internally and from other quality sites.
They were hijacked about 10 months ago and mostly came back little by little - although GWT never showed them as part of the site anymore, they were cached and crawled.

Now our pages are gone again, and the proxy pages are also gone.
Looks like a no-brainer for Google: remove both the proxy and the hijacked pages, just in case.

I still see some proxies with copies of other sites around, so I'm not sure if this is what is really going on.

I'd love to hear from others and see if anyone has noticed the same thing, or if what we are suffering from, again, is totally unrelated.

7:02 pm on Oct 11, 2007 (gmt 0)

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Sorry to ruin the party but the proxy sites are alive and well.

Having a batch of them disappear is nothing unusual, because many of these proxy sites are fly-by-night and gone before you know it. The value of the typical CGI proxy only lasts until the schools and offices have it blocked; then they need a new domain name.

I see the following indexed for one of my sites, in the #2 and #3 positions, for a special page I feed just to proxy sites so they don't hijack real content:

www.proxy1.org/nph-page.pl/000000A/http/www.mydomain.com/
www.proxy2.com/asd/nph-proxy.pl/000000A/http/www.mydomain.com

...and a few more proxies further down the page, so yes they're still out there and Google is still indexing them.
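
For anyone who wants to try feeding proxies a special page like that, here's a rough sketch of the general idea - this is not Bill's actual setup, and the header test is only a crude stand-in, since many CGI proxies strip or never send these headers (the reverse-DNS test sketched later in this thread is a stronger signal):

from wsgiref.simple_server import make_server

PROXY_HEADERS = ("HTTP_VIA", "HTTP_X_FORWARDED_FOR", "HTTP_FORWARDED")

def looks_like_proxy(environ):
    # crude placeholder test - swap in whatever detection you trust
    return any(h in environ for h in PROXY_HEADERS)

def app(environ, start_response):
    if looks_like_proxy(environ):
        # the "special page": junk the proxy can cache instead of real content
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<html><head><meta name='robots' content='noindex,nofollow'>"
                b"</head><body>Nothing to see here.</body></html>"]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html><body>Real content goes here.</body></html>"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
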

[edited by: incrediBILL at 7:03 pm (utc) on Oct. 11, 2007]

7:15 pm on Oct 11, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Thanks, incrediBill - this is one party where we don't want any premature celebrations.

Can we go one step further on the discussion? Even though Google is still indexing proxy server pages, is there something new in the algo that keeps them from ranking so easily - at least for common searches? I think the recent relative silence on this previously noisy issue is interesting.

7:30 pm on Oct 11, 2007 (gmt 0)

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Google may be making progress, because I easily found a few attempts to jack some WebmasterWorld pages that didn't show up in the SERPs whatsoever. Either they are no longer listing certain notorious proxy sites in the SERPs, or they are filtering specific URL patterns - but the pages are definitely in their index.

Shame I can't cite an example here.

5:14 am on Oct 12, 2007 (gmt 0)

10+ Year Member



With that info, Bill, my fireworks are definitely put back in the box :(

Yes, a huge shift in pages - a 15% drop in page totals for me in the last 24 hours on some DCs. I've noticed a small decrease in traffic also...

Vimes.

12:07 pm on Oct 12, 2007 (gmt 0)

5+ Year Member



Bill,

As I mentioned on Dan's post, one of the main problems now is that the CGI proxy hackers can remove (and are removing) the meta robots noindex tag.

I've been doing some research and I think I came up with a good solution. Here is the basic idea:

1) Use a global shared blacklist of CGI proxies being used to hijack
2) Use detection code that checks every visitor against that BL and blocks CGI hijackers
3) Set up traps that catch the CGI proxies and update the blacklist

The good news is that Project Honey Pot's http:BL already has everything we need. We only need to work with them to create honeypots (traps) designed to catch CGI proxy hijackers.
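
For what it's worth, the lookup side of step 2 is just a DNS query against their published zone. A rough sketch, with the access key as a placeholder:

import socket

ACCESS_KEY = "yourkey"  # placeholder - the key Project Honey Pot issues you

def httpbl_lookup(visitor_ip):
    reversed_ip = ".".join(reversed(visitor_ip.split(".")))
    try:
        answer = socket.gethostbyname(
            "%s.%s.dnsbl.httpbl.org" % (ACCESS_KEY, reversed_ip))
    except socket.gaierror:
        return None  # not listed
    # answer looks like 127.days.threat.type; type is a bitmask
    _, days, threat, vtype = (int(octet) for octet in answer.split("."))
    return {"days_since_seen": days, "threat_score": threat, "type": vtype}

# block or decoy the request when the lookup returns a listed, high-threat IP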

One simple way to create such a trap is to have the honeypot page generate some encoded text containing the IP (and other identifying information) of the attacker. The page must be blocked by robots.txt and include a meta robots noindex, nofollow tag. If a page then comes up in a search for the encoded text, it has to be a CGI proxy.
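
Here's a rough sketch of what the encoded text could look like - purely illustrative, not a spec. Base32 keeps the token in characters that get indexed as a single word, and decoding it later tells you which visitor (IP) leaked the page:

import base64
import time

def make_token(visitor_ip):
    raw = "%s|%d" % (visitor_ip, int(time.time()))
    return base64.b32encode(raw.encode()).decode().rstrip("=").lower()

def decode_token(token):
    padded = token.upper() + "=" * (-len(token) % 8)
    ip, ts = base64.b32decode(padded).decode().split("|")
    return ip, int(ts)

def trap_page(visitor_ip):
    # serve from a URL disallowed in robots.txt, with the meta tag included;
    # legit engines never index it, so any search hit on the token is a proxy
    return ("<html><head><meta name='robots' content='noindex,nofollow'></head>"
            "<body>zz%szz</body></html>" % make_token(visitor_ip))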

[edited by: tedster at 3:01 pm (utc) on Oct. 12, 2007]

2:18 pm on Oct 12, 2007 (gmt 0)

10+ Year Member



Sorry, guys, but no good news.

I don't see any significant changes on the larger scale at all. The good people at Google haven't done anything IMO.

They haven't even dealt with the obvious proxy scripts yet (nph, cgi-proxy, PHP proxy), which you can easily verify for yourself.

[edited by: tedster at 2:49 pm (utc) on Oct. 12, 2007]

3:12 pm on Oct 12, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



There are two topics getting mixed up together here:

1. This thread: Is Google fixing the proxy hijack problem?
2. Related topic: How can we defend against proxy hijacking? [webmasterworld.com]

I re-opened the linked thread where we can discuss our own defensive steps - and this thread can stay focused on Google's progress, or lack of it.

12:22 pm on Oct 19, 2007 (gmt 0)

5+ Year Member



From someone who started hacking long before he started learning about SEO: the idea presented as a solution - banning an IP - is beyond silly, it is stupid. Sorry, but the whole idea of a proxy is that you change IPs or servers OFTEN.
10:05 pm on Oct 19, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



You're right, Flex. Google's hijacking problem is created only when the REAL googlebot crawls a site through a proxy server url. But this point really belongs in the "how can we defend" thread.
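
Briefly, since the details belong over there: that fact points at the defense that doesn't depend on IP lists at all - verify any visitor claiming to be Googlebot with a forward-confirmed reverse DNS lookup and refuse the page to impostors, so the proxy has nothing of yours to hand back to Google. A rough, untested sketch:

import socket

def is_real_googlebot(visitor_ip):
    try:
        host = socket.gethostbyaddr(visitor_ip)[0]  # reverse lookup
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # forward-confirm: the name must resolve back to the same IP
        return visitor_ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False

# if the User-Agent says Googlebot but this returns False, send a 403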

As I said above:

1. This thread: Is Google fixing the proxy hijack problem?
2. Related topic: How can we defend against proxy hijacking? [webmasterworld.com]

I'm not sure what the silence here about Google making any progress really means. Is the issue getting handled?

 
