
Forum Moderators: Robert Charlton & goodroi


302 Redirects continues to be an issue

     
6:23 pm on Feb 27, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 27, 2005
posts:93
votes: 0


recent related threads:
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]



It is now 100% certain that any site can destroy a low- to mid-PageRank site by causing Googlebot to snap up a 302 redirect served by a script (PHP, ASP, CGI, etc.), backed by an unseen, randomly generated meta-refresh page pointing at the unsuspecting site. In many cases the encroaching site actually writes your website's URL into a 302 redirect on its own server. This is a flagrant violation of copyright and a manipulation of search engine robots, geared to exploit and destroy websites and to artificially inflate the ranking of the offending sites.

Many unethical webmasters and site owners are already creating thousands of templated (ready-to-go) "skyscraper" sites fed by affiliate companies' immense databases. These companies, which hold your website's info in their databases, feed your page snippets, without your permission, to vast numbers of the skyscraper sites. A carefully adjusted PHP redirection script that issues a 302 redirect to your site, with an affiliate click checker built into the script, then goes to work. What is very sneaky is the randomly generated meta-refresh page, which can only be detected with a good header interrogation tool.
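For anyone who wants to check what a suspect link actually returns, here is a minimal sketch of what such a header interrogation tool looks for. The function name and the regex are illustrative, not taken from any particular tool:

```python
import re

def inspect_response(raw):
    """Report the status code, any Location header, and any meta-refresh
    target hidden in the body of a raw HTTP response."""
    head, _, body = raw.partition("\r\n\r\n")
    lines = head.split("\r\n")
    report = {"status": int(lines[0].split()[1])}  # e.g. "HTTP/1.1 302 Found"
    headers = {}
    for line in lines[1:]:
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()
    report["location"] = headers.get("location")
    # a meta-refresh page carries its target in content="0;url=..."
    m = re.search(r'url=([^"\'>;\s]+)', body, re.IGNORECASE)
    report["meta_refresh"] = m.group(1) if m else None
    return report
```

A 302 with a Location header, followed by a meta-refresh on the landing page, is exactly the pattern described above.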

Googlebot and MSNBot follow these PHP scripts to either an internal sub-domain containing the 302 redirect or a server-side redirect, and BANG, down goes your site if its PageRank is below the offending site's. Your index page is crippled because Googlebot and MSNBot now consider your home page, at best, a supplemental page of the offending site. The offending site's URL that contains your URL is indexed as belonging to the offending site. The offending site knows that Google does not reveal all links pointing to your site and takes a couple of months to update, so an inurl:yoursite.com search will not be much help in tracing it for a long time. Note that these scripts mostly apply your URL stripped, or without the www, making detection harder. This also causes Googlebot to generate another URL listing for your site, which can be seen as duplicate content. A 301 redirect resolves at least the short-URL problem, relieving Google of deciding which of your site's two URLs to index higher (usually the one with the higher-linked PageRank).
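A sketch of that 301 fix as server-side logic; `www.example.com` is a placeholder for your own canonical hostname, and the function name is made up for illustration:

```python
def canonical_redirect(host, path, canonical_host="www.example.com"):
    """301 any request on the bare domain over to the www host, so the
    engines only ever see one URL for each page."""
    if host != canonical_host:
        return 301, "http://%s%s" % (canonical_host, path)
    return 200, None  # already on the canonical host; serve normally
```

The same rule is usually expressed in the web server's own rewrite config rather than application code; the logic is identical either way.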

Your only hope is that your PageRank is higher than the offending site's. Even that is no guarantee, because the offending site will have targeted many higher-PageRank sites within its system on the off chance that it strips at least one of them. This is reinforced by hundreds of other hidden 301 permanent redirects to PageRank 7 or above sites, again in the hope of stripping a high-PageRank site, which would then empower their scripts to hijack more efficiently. Sadly, supposedly ethical big-name affiliates are involved in this scam; they know it is going on, and Google AdWords is probably the main source of revenue. Though I am sure Google does not approve of its AdSense program being used in such a manner.

Many such offending sites have no e-mail contact, hidden WHOIS data, and no telephone number. Even if you were to contact them, you will find in most cases that the owner or webmaster cannot remove your links from their site because the feeds come from affiliate databases.

There is no point in contacting Google or MSN, because this problem has been around for at least nine months; only now is it escalating at an alarming rate. All sites of PageRank 5 or below are susceptible; if your site is a 3 or 4, be very alarmed. A skyscraper site only needs to create child-page linking to reach PageRank 4 or 5, without the need to strip other sites.

Caution: trying to exclude them via robots.txt will not help, because these scripts can change almost daily.

Trying to remove a link through Google that looks like
new.searc**verywhere.co.uk/goto.php?path=yoursite.com%2F will result in your entire website being removed from Google's index for an indefinite period, at least 90 days, and you cannot get re-indexed within that time.

I am working on an automated "302 rebound" script to trace and counteract an offending site. This script will spider and detect all pages, including sub-domains, within an offending site and blast all of its pages, including dynamic pages, with 302 or 301 redirects. Hopefully it will detect the feeding database and blast it with as many 302 redirects as it contains URLs: in essence, a programme in perpetual motion, creating millions of 302 redirects for as long as it runs. As every page is a unique URL, the script should continue to create and bombard any site that generates dynamic pages through PHP, ASP or CGI redirecting scripts. A skyscraper site that is fed this way can have its server totally occupied by a single efficient spider that requests pages in split seconds, continually, throughout the day and week.

If the repeatedly spidered site is depleted of its bandwidth, it may then be possible to remove it via Google's URL removal tool. You only need a few seconds of a 404 or 403 from the offending site for Google's URL console to detect what it needs: either the site or the damaging link.
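A quick way to confirm that the offending URL is actually answering with a 404 or 403 before filing a removal request. Note that the 404/403 list below reflects the claim in this post, not documented Google behaviour, and the function names are illustrative:

```python
import http.client

# Statuses the URL console reportedly reacts to (per the post above).
REMOVABLE = {404, 403}

def is_removable(status):
    """True if the status code is one the removal tool is said to accept."""
    return status in REMOVABLE

def head_status(host, path="/"):
    """Fetch only the status code of a page via a HEAD request."""
    conn = http.client.HTTPConnection(host, timeout=10)
    try:
        conn.request("HEAD", path)
        return conn.getresponse().status
    finally:
        conn.close()
```

Usage would be something like `is_removable(head_status("example.com", "/goto.php"))`.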

I hope I have been informative and of help to anybody whose hijacked site's natural revenue has been unfairly treated. Also note that your site may never regain its rank, even after the removal of the offending links. Talking to offending site owners often results in their denying that they are causing problems, saying that they are only counting outbound clicks. And they seem reluctant to remove your links... yeah, pull the other one.

[edited by: Brett_Tabke at 9:49 pm (utc) on Mar. 16, 2005]

10:31 pm on Mar 11, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 20, 2003
posts:197
votes: 0


So, someone could try to make a loop of 302 redirections of his URL, where the final content sits at the fifth position.

The best suggestion yet for protecting your site.
It's risky, though: if Google decides to penalize 302s at some point, you'd be in trouble.

This also probably won't fix sites that have already been hijacked.

10:41 pm on Mar 11, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:June 13, 2004
posts:650
votes: 0


This also probably won't fix sites that have already been hijacked.

Actually, it should.
Most of the hijacked sites are still crawled, and as soon as the offending site loses the "content", the real one should come back.

The main question remains whether Google has implemented it.

10:42 pm on Mar 11, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 28, 2002
posts:3444
votes: 1


I have been googlejacked for about 6 months now. I just want to know if your situation is the same; here goes:

Pages indexed dropped 80%, and of course visits also dropped 98%. My PR came back 2 months ago, but there are no changes in the SERPs. Googlebot has also not seen much of my site, even though I have cookies for it :). I also cannot see my site when I add the &filter=0 parameter, so I think I will only see improvement when Googlebot comes back for good.

10:43 pm on Mar 11, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 8, 2004
posts:865
votes: 0


I'm not sure if there is a society of professional webmasters or not, but a credible watchdog group comprised of disinterested industry professionals looking out for our collective interests would be a good idea and could have an impact - a webmasters union of sorts. Remember, none of the engines would make a dime if they did not have FREE access to the collective body of our sites and published online work.

Rollo, in msg #235 of this thread I made a brief reference to something quite similar, although it received no response. Too many people trying to cut down trees in a forest of problems. We can't constantly wait for large corporations and politicians to get around to doing anything about problems they created in the first place. The idea of a union sounds pretty brutal, but so does the idea of years of hard work and your livelihood disappearing in an instant.

It is quite a way to look at things: search engines would go out of business if we blocked them from indexing our sites. How about everyone in the theoretical union blocking all search engines but union-certified ones? And of course, to become a union-certified search engine, you would have to show union members' websites at the top of the listings and not some fly-by-night scraper site. You're right: we're the webmasters, we're the ones creating content, so why are they the ones in charge? What right do they have to show our content under another's name? None.

10:50 pm on Mar 11, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 8, 2004
posts:865
votes: 0


sja65 - It isn't just little sites this is happening to.

Reading your post reminded me of something I had completely forgotten about. About 4 months ago I was looking around at DVD rental sites comparing deals, and I came across one that had some other name but was the Blockbuster website. It wasn't at the top of the SERPs but about 20 down. I spent about 5 minutes scratching my head trying to figure out why Blockbuster would allow someone else to run their site under another name; it seemed very odd. I had completely forgotten about it until I read your post about Amazon, err, "big river" site.

10:51 pm on Mar 11, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:May 14, 2003
posts:164
votes: 0


activeco, while the idea is a good one in principle, it falls down when you consider that it would have no effect on the Googlebot.

While the theory would be to redirect when the bot arrives via a 302 redirect, Google's bot won't have this referrer information when it views the page, and there's no way the server could detect that the link used the 302 method... AFAIK.

10:57 pm on Mar 11, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:June 13, 2004
posts:650
votes: 0


While the theory would be to redirect when the bot arrives via a 302 redirect, Google's bot won't have this referrer information when it views the page, and there's no way the server could detect that the link used the 302 method.

It has nothing to do with the referrer or any other header info.
You simply do your own 302 redirects five times in a row, with the last one serving the content.
If someone does a 302 to you, he can't get through all the 302s in the loop; of course, only if the "five redirections" rule is actually implemented.
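A sketch of the routing logic, assuming such a five-redirection rule really exists in the crawler; the /hop/ paths and function name are made up for illustration:

```python
def serve(path, depth=5):
    """Every page request hops through `depth` internal 302s before the
    real content is served. A hijacker's external 302 would add a sixth
    hop, which a crawler with a five-redirect limit would refuse."""
    if path.startswith("/hop/"):
        n = int(path.rsplit("/", 1)[1])
        if n < depth:
            return 302, "/hop/%d" % (n + 1)
        return 200, "real content"
    # first request for any page enters the hop chain
    return 302, "/hop/1"
```

A direct visitor follows the same five hops, which is where the delay discussed below comes from.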

11:00 pm on Mar 11, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:May 14, 2003
posts:164
votes: 0


Thanks for the clarification.

This isn't something I'm an expert on, but would there be any negative impact on users / bots if they went through a five-stage redirect for every page they accessed?

11:07 pm on Mar 11, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:June 13, 2004
posts:650
votes: 0


Some delay for sure, but I think it is a marginal problem.
11:23 pm on Mar 11, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


Also, if the offending site did its own redirect to the same page your five-deep chain redirects to, then what?
11:24 pm on Mar 11, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:June 13, 2004
posts:650
votes: 0


Please see above about dynamic url changes (#314).
11:34 pm on Mar 11, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 20, 2003
posts:197
votes: 0


activeco: this seems like a very, very good suggestion the more I think about it.

has any testing been done?

11:37 pm on Mar 11, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 8, 2005
posts:118
votes: 0


I don't know if it's related, but after I emailed G last night, Googlebot visited over 300 pages of my site at around lunchtime today. I thought it may have been because of an old sitemap that I uploaded, but the only IP to access that file was mine.

*Crosses Fingers*

11:39 pm on Mar 11, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


I think the only thing Google can do (and should do) about this is treat cross-domain 302s differently.
Sure, there are a lot of people with multiple domains, but they would just have to adapt; at least they have control of their own domains.

That would sure throw a wrench into the spammers'/hackers' works; they ALL have multiple domains with 302s all over the place. It's part of what they do.

11:47 pm on Mar 11, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


If they don't deal with this, then we only need to wait for a few more big names to go down, which is inevitable, and then they will have to deal with it in a hurry.
This 713 message thread spans 48 pages.
 
