
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 467 message thread spans 16 pages: < < 467 ( 1 2 3 4 [5] 6 7 8 9 10 11 12 13 ... 16 > >     
Google's 302 Redirect Problem

 4:17 pm on Mar 25, 2005 (gmt 0)

(Continuing from Google's response to 302 Hijacking [webmasterworld.com] and 302 Redirects continues to be an issue [webmasterworld.com])

Sometimes, an HTTP status 302 redirect or an HTML META refresh causes Google to replace the redirect's destination URL with the redirect URL. The word "hijack" is commonly used to describe this problem, but redirects and refreshes are often implemented for click counting, and in some cases lead to a webmaster "hijacking" his or her own URLs.
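For reference, the two mechanisms described above look roughly like this (all URLs are placeholders): a 302 is an HTTP status line plus a Location header, while a META refresh is ordinary HTML served on a normal 200 page.

```
# A click-tracking script at http://tracker.example.com/out?id=123
# might answer with an HTTP 302 redirect:
HTTP/1.1 302 Found
Location: http://www.example.com/page.html

# ...or it might serve a 200 page whose HTML contains a zero-second META refresh:
<meta http-equiv="refresh" content="0;url=http://www.example.com/page.html">
```

In both cases a browser ends up at www.example.com, but a 302 tells the crawler the move is temporary, which is what tempts Google to keep (and sometimes display) the redirecting URL as the canonical one.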

Normally in these cases, a search for cache:[destination URL] in Google shows "This is G o o g l e's cache of [redirect URL]" and oftentimes site:[destination domain] lists the redirect URL as one of the pages in the domain.

Also link:[redirect URL] will show links to the destination URL, but this can happen for reasons other than "hijacking".

Searching Google for the destination URL will show the title and description from the destination URL, but the title will normally link to the redirect URL.

There has been much discussion on the topic, as can be seen from the links below.

How to Remove Hijacker Page Using Google Removal Tool [webmasterworld.com]
Google's response to 302 Hijacking [webmasterworld.com]
302 Redirects continues to be an issue [webmasterworld.com]
Hijackers & 302 Redirects [webmasterworld.com]
Solutions to 302 Hijacking [webmasterworld.com]
302 Redirects to/from Alexa? [webmasterworld.com]
The Redirect Problem - What Have You Tried? [webmasterworld.com]
I've been hijacked, what to do now? [webmasterworld.com]
The meta refresh bug and the URL removal tool [webmasterworld.com]
Dealing with hijacked sites [webmasterworld.com]
Are these two "bugs" related? [webmasterworld.com]
site:www.example.com Brings Up Other Domains [webmasterworld.com]
Incorrect URLs and Mirror URLs [webmasterworld.com]
302's - Page Jacking Revisited [webmasterworld.com]
Dupe content checker - 302's - Page Jacking - Meta Refreshes [webmasterworld.com]
Can site with a meta refresh hurt our ranking? [webmasterworld.com]
Google's response to: Redirected URL [webmasterworld.com]
Is there a new filter? [webmasterworld.com]
What about those redirects, copies and mirrors? [webmasterworld.com]
PR 7 - 0 and Address Nightmare [webmasterworld.com]
Meta Refresh leads to ... Replacement of the target URL! [webmasterworld.com]
302 redirects showing ultimate domain [webmasterworld.com]
Strange result in allinurl [webmasterworld.com]
Domain name mixup [webmasterworld.com]
Using redirects [webmasterworld.com]
redesigns, redirects, & google -- oh my [webmasterworld.com]
Not sure but I think it is Page Jacking [webmasterworld.com]
Duplicate content - a google bug? [webmasterworld.com]
How to nuke your opposition on Google? [webmasterworld.com] (January 2002 - when Google's treatment of redirects and META refreshes were worse than they are now)

Hijacked website [webmasterworld.com]
Serious help needed: Is there a rewrite solution to 302 hijackings? [webmasterworld.com]
How do you stop meta refresh hijackers? [webmasterworld.com]
Page hijacking: Beta can't handle simple redirects [webmasterworld.com] (MSN)

302 Hijacking solution [webmasterworld.com] (Supporters' Forum)
Location: versus hijacking [webmasterworld.com] (Supporters' Forum)
A way to end PageJacking? [webmasterworld.com] (Supporters' Forum)
Just got google-jacked [webmasterworld.com] (Supporters' Forum)
Our company Lisiting is being redirected [webmasterworld.com]

This thread is for further discussion of problems due to Google's 'canonicalisation' of URLs, when faced with HTTP redirects and HTML META refreshes. Note that each new idea for Google or webmasters to solve or help with this problem should be posted once to the Google 302 Redirect Ideas [webmasterworld.com] thread.

<Extra links added from the excellent post by Claus [webmasterworld.com]. Extra link added thanks to crobb305.>

[edited by: ciml at 11:45 am (utc) on Mar. 28, 2005]



 4:16 am on Apr 19, 2005 (gmt 0)

GG thanks for input - very helpful.

Regarding canonical page identification:

Our site is very large and spread over several domains, and we've had serious canonical problems recently (we think we have fixed them with 301s).

Should we consolidate under one domain to make it easier to be spidered correctly?


 4:17 am on Apr 19, 2005 (gmt 0)

Yeah, I am confused about the reinclusion request suggestion too. My website is still in the index: the home page is there with title and description, PR7; the internal pages are indexed by URL only. Googlebot grabs the index page every day, and even grabbed an internal page and indexed it with title/description last week. Yes, the PageRank declined last Sept (to PR0, as I mentioned above), but it increased in Dec.

I went ahead and sent a reinclusion request. I imagine I will be told the site is indexed, but I guess it won't hurt.



 4:19 am on Apr 19, 2005 (gmt 0)

howiejs - and others - I believe the correct procedure these days to get word to Google for a reinclusion request is to use the form at www.google.com/support/ with a subject of "reinclusion request". GG, please correct me if I'm wrong.


 5:05 am on Apr 19, 2005 (gmt 0)

howiejs, the addurl form is not the same. Start at google.com/support . We recently revamped our support infrastructure to start more with web forms than with emails. I think the url you want is [google.com...]
and then put "Reinclusion request" in the subject line. Give your site's name and describe any circumstances that you think might have led to a spam penalty, and why they no longer apply. crobb305, I would follow this procedure too.

arubicus, I know that we've been bringing new tools online to trace spam. joeduck, in general I'd say that using one domain rather than many is a good idea, all other things being equal. There's less chance of mix-ups that way.


 5:10 am on Apr 19, 2005 (gmt 0)

The key problem with every instance of hijacking I've seen is that PageRank is ignored in judging the canonical page.

That's because you are looking at toolbar PageRank and not what they are really using internally. I think toolbar PR doesn't subtract for spam penalties.


 5:10 am on Apr 19, 2005 (gmt 0)

GG thanks for all the info. It's been a long time out for many of us (11 months in my case). I think the patience and insistence of all of us to continue to build content speaks volumes. Re-incl request sent.


 8:14 am on Apr 19, 2005 (gmt 0)


Thanks for your feedback.

It is all very well for the site owners who read WebmasterWorld to use the suggested form, but I assume that in the medium to long term (hopefully relatively short) the use of this form will not be necessary, as Google will be able to pick up the disappeared/affected sites.

Not everyone reads WebmasterWorld - they should, but they don't. :)


 8:40 am on Apr 19, 2005 (gmt 0)

"Because you are looking at tool bar page rank and not what they are really using internally"

Nonsense. That has nothing to do with it.


 9:01 am on Apr 19, 2005 (gmt 0)

GG - "arubicus, I know that we've been bringing new tools online to trace spam."

Whoo hooo! Google may have converted me to the dark side of the SERPS. Takes a bow :) Just kidding.


 9:40 am on Apr 19, 2005 (gmt 0)

Googleguy - That's a perfect response; it's nice to hear that you are working on the problem. I also think most of us know that inurl: / allinurl: searches will include other domains; if not, we will tell them here.

The sad part in my situation is that I tried to remove the supplemental results with robots.txt and the removal tool, but in that process I mistakenly left a bare
User-agent: googlebot
Disallow: /

with no specific page after the /, so now the whole site is gone from Google. Is there anything I can do to get it back within the 90 days? Thanks.
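For anyone following along, the difference the poster describes comes down to one path in robots.txt. These two fragments are alternatives, not one file, and the filename is hypothetical:

```
# Intended: ask Googlebot to skip only the one duplicate page
User-agent: googlebot
Disallow: /duplicate-page.html

# Actually submitted: a bare slash, which disallows every URL on the site
User-agent: googlebot
Disallow: /
```

Fed to Google's URL removal tool, the second form queues the entire site for removal, which matches what the poster saw; the first form would have removed only the named page.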


 12:13 pm on Apr 19, 2005 (gmt 0)

Google Guy

Just a suggestion - wouldn't it be a good idea to have a request form to get the status of a website?

One that would tell the website owner whether he has been issued a penalty, and why.
It would be nice if one could see that a website linking back has caused a problem, or that the owner has made a coding mistake.

I am sure Google and other SE's can do this.
I am also pretty sure few would mind paying a small fee for such a service.

The fee would make it worthwhile for Google to go through the trouble of checking a website's ranking problem.

Everyone wins this way - what do you think?



 12:33 pm on Apr 19, 2005 (gmt 0)

wouldn't it be a good idea to have a request form to get the status of a website

It would be easy to abuse the system. First you SEO a site until it falls off the SERPs. Then you use the request form to see what tripped the spam trigger. You adjust those problems and file a reinclusion request. Trial and error until you have found the optimal settings.

Google tries to keep its ways of detecting spam secret. With such a status form it would be very easy for SEOs to reverse-engineer the algorithms and overload the SERPs with junk sites.


 1:56 pm on Apr 19, 2005 (gmt 0)

I'm sure google will know how to deal with that.

Small problem I think.



 2:02 pm on Apr 19, 2005 (gmt 0)

I was talking to a friend of mine about this and we decided to setup a test.

He has set up a 302 redirect from his site to one of mine.

This will be a good test to see what happens.

Any ideas on link code would be great to be sure we are doing it right.

Any input will be welcome.



 2:04 pm on Apr 19, 2005 (gmt 0)

no, that's a very valid point. People would just keep pushing the envelope until it snapped.

their anti-spam algo would be cracked in a month.


 2:15 pm on Apr 19, 2005 (gmt 0)

Hi GoogleGuy,

Thanks for posting in this thread. We've felt pretty abandoned lately without your input.

I checked my own site with the site: command for 302s and they do appear to be gone. However, those same URLs are still in the allinurl: results, and that is ALL that ever appears there (except for links to two of my pages), even though I have over a thousand links coming into my site (according to Yahoo).

I manage 27 websites. I checked several that had been hit by these redirects, and while the redirects have disappeared from the site: command, as you said, the SAME URLs are still in the allinurl: results just like they were before.

They may be harmless tracking links, but how are we to know that if we're not programmers?


 3:50 pm on Apr 19, 2005 (gmt 0)


You answered "yes" to larryhatch's question "Will the 302-jackers be derated if not penalized?".

Is there now a way to tell a hijacker link from a legitimate redirect link from another site? It's my understanding that some of the current directory scripts use 302 redirects for tracking purposes when linking to other sites. I'm worried that sites using these scripts will now be seen as hijackers and get penalized.


 3:54 pm on Apr 19, 2005 (gmt 0)

GG good to see you back...

"We changed things so that site: won't return results from other sites in the supplemental results."

Can we take that to mean that the URLs are really gone from the index now, or simply that we can no longer use a site: search to check for them?


 4:50 pm on Apr 19, 2005 (gmt 0)

Just use inurl: - it is the one that told me where the hijackers are. The mention that they should be there is of no consequence. The day that the hijackers showed up in inurl: is the day that my site fell out of the SERPs. It is the second time that I have observed this phenomenon. If I were you I would watch inurl: to catch the hijackers, because that is where they show up. Although removing them doesn't seem to help, I am sure that it can't hurt.


 4:55 pm on Apr 19, 2005 (gmt 0)

Lorel, happy to try to help. I've got a two-hour meeting and then another back-to-back, so it may take me a while to circle back. esllou gave a good answer to vincentg's question.

Again, if you still see any problems please report it at google.com/support with the subject line of "canonicalpage" so that the engineers can investigate remaining reports. If your own site isn't doing that well, consider doing a reinclusion request as well.


 6:06 pm on Apr 19, 2005 (gmt 0)

GoogleGuy, in situations where the cache shows a different domain than the correct one, do you consider this to be the result of hijacking?


 6:50 pm on Apr 19, 2005 (gmt 0)

Hi Googleguy

Glad you are still active in this thread - as your main posting times are after work, people in the UK and Europe tend to miss out on some of your more insightful answers :)

I have some sites which have dropped out of the SERPs; they have PR0 on the non-www while high PRs on the www - I have redirected all the non-www to the www.

The sites are not getting crawled very well though, and I have some www home pages go to URL-only listings. I have seen you mention that if Google picks the wrong canonical URL it can lead to crawling problems. (Ages ago, I think in the Florida or Dom or Esmeralda threads.)

I have put the 301 redirect in place from the non-www to the www - should this resolve the problem, or should I send a reinclusion request and advise others in similar situations to send one as well?

Don't want to swamp your engineers if it is just a case of patience.
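A quick way to confirm that a non-www to www redirect like the one described above really answers 301 rather than 302 is to request the bare-domain URL without following redirects. A minimal sketch in Python (the domain shown is a placeholder):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so the raw 3xx status is surfaced."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # makes urllib raise HTTPError carrying the 3xx code

def redirect_status(url):
    """Return (status_code, Location header or None) without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        return resp.getcode(), None  # no redirect at all (e.g. a plain 200)
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")

# Example with a placeholder domain: a correct canonical setup should give
# a 301 pointing at the www version for the non-www homepage.
# print(redirect_status("http://example.com/"))
```

A result of 301 with a Location on the www hostname confirms the permanent redirect is in place; a 302 here would reintroduce exactly the canonical ambiguity this thread is about.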


 7:10 pm on Apr 19, 2005 (gmt 0)

GG could you please clarify how to get through with messages about canonical issues. Thanks!

I tried to send via Google Support with "canonical page" in the subject line and just got this automatic reply:

Thank you for your note. This is an automated reply to your inquiry about your site's inclusion in the Google search results.

When webmasters write to us that their site has fallen out of our search results ...


 7:55 pm on Apr 19, 2005 (gmt 0)

I have been having the same problems with split PR and no titles or descriptions on most pages. (no rankings)

(I did the 301 to the www version 3 weeks ago.)

I sent the reinclusion request as GG says and got the same response as above

Thank you for your note. This is an automated reply to your inquiry about your site's inclusion in the Google search results.

When webmasters write to us that their site has fallen out of our search results ...

did anybody else try this and get a different response?


 8:24 pm on Apr 19, 2005 (gmt 0)

GoogleGuy, thanks for your advice.

I had some hijacker's redirects removed a couple of weeks ago. The links and the caches now display a standard error page, without any reference to my site. Can you tell me why they still show up in an inurl:mysite.com search?


 8:34 pm on Apr 19, 2005 (gmt 0)

I think the answer is simply that Google hasn't fixed the underlying problem, they haven't cleaned the site and page data actually in the database itself; instead they have simply elected to remove the obvious wrong entries from the list of results that they show to the public at the time the results are generated. In other words, perhaps they've just painted over the cracks.


 8:54 pm on Apr 19, 2005 (gmt 0)

The only way I can see to get beyond this is to have a randomly generated paragraph at the bottom of each page to avoid the duplicate-content penalty. Kind of a tacky way to have the site look, though.

For those wondering how long it takes to get back after the dupe penalty is gone: my site came back a couple of weeks after I added content to the home page. I had a total of about 45 days out of the search. It may not be the same in all cases, though.

And oddly enough, the 302 redirect URL does show up in an inurl: command even though it doesn't have my domain anywhere in the URL.

I'm out of the woods for now but would like to have a way to be sure it doesn't happen again.


 10:13 pm on Apr 19, 2005 (gmt 0)

Dayo_UK, I would go ahead and send a reinclusion request. joeduck/TheET--that's an early response from a computer just to let you know that we got the message. Then that goes into a support queue which someone will check to see if a site had spam penalties and whether it's clean now.

Vec_One, my hunch is that we just haven't recrawled those urls to see that they're a 404 yet. I'd give them a few days to drop out (assuming that trying to fetch the page now gives a true 404 error). g1smd, we have changed our heuristics for 302 redirects, so this isn't just a superficial change. Kirby/Emmett, I'd love to hear details about the sites you mention. If you could submit the sites in question to google.com/support with canonicalpage in the title and include "Kirby" or "Emmett" so that I can recognize it, I'd like to ask someone to check those two cases out.


 10:31 pm on Apr 19, 2005 (gmt 0)

"we have changed our heuristics for 302 redirects"

Thanks, GG that's what I wanted to read.


 10:42 pm on Apr 19, 2005 (gmt 0)

>> remove the obvious wrong entries from the list of results

Imho, that would be a very sound first move, as it immediately removes the worst kind of deceptive redirects (scams, phishing, drive-by installs, etc.). If they can't pretend to be something else, they won't get anywhere.

However, for most webmasters those "extreme sites" are probably the least worry. Instead it's their own sites that are at stake. Also, a lot of webmasters who do not read WebmasterWorld have been harmed by this as well. These people have no idea what hit them, and they wouldn't know what a reinclusion request was if they saw one. Also, they would think it odd to ask for reinclusion when they never asked for removal in the first place. Imho, if I told this to my clients they would simply think I had gone crazy.

Even if good new filters/rules have been implemented, I don't think we will see much effect on sites that are already hit until an update is done in which those links/URLs are treated significantly differently than they used to be. Perhaps it will weed itself out in the rolling updates, but that would take a long time, during which people will continue to experiment with all kinds of stuff, so I've got a feeling that a full update is the best option.

Of course I could be very wrong about this, but I do agree that those links should never have been classified as pages in the first place, so if the root of the problem persists (only hidden this time) it is very likely that the symptoms will persist as well.


Just read msg #150. It doesn't really change my opinions above, but I should add that I wrote this post before I saw it.

Anyway, the most interesting question to me is: Are sites starting to return? Has anyone seen their sites come back already?


 11:13 pm on Apr 19, 2005 (gmt 0)

Well, a site:www.dmoz.org search has gone from 22,000 scraper pages last week to zero a few days ago.

At the same time, site:dmoz.org says that there are 11 million results. The real site has only 600,000 categories, 600,000 Category Charters, and a few thousand informational pages - only 1.2 million real pages in all. However, yesterday you couldn't get beyond 953 results; today you can't get past 584.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved