
Forum Moderators: Robert Charlton & goodroi


Further Google 302 Redirect Problems

     
2:05 pm on May 9, 2005 (gmt 0)

Senior Member

joined:Dec 29, 2003
posts:5428
votes: 0


(Continued from Google's 302 Redirect Problem [webmasterworld.com])


Google victim of redirect too ;):
Search for "Google" and [desktop.google.com...] shows first. If you click, [desktop.google.com...] redirects to Google.com

[edited by: ciml at 4:35 pm (utc) on May 9, 2005]

2:01 am on May 23, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


I have no idea why that happens - maybe at odd times Google is pulling your listing from the supplemental index for some reason.
11:36 pm on May 24, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Sept 1, 2003
posts:91
votes: 0


Google gets hijacked themselves:

[google.com...]

11:54 pm on May 24, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 28, 2002
posts:3468
votes: 18


I think it's fitting that Google got hijacked through their own ignorance on this topic. Still, many who have removed another site with the removal tool are nowhere to be seen in the SERPs, because of a possible duplicate filter.
3:54 am on May 25, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


check out the related links on that one
11:50 am on May 26, 2005 (gmt 0)

Senior Member

joined:Dec 29, 2003
posts:5428
votes: 0


They "fixed" that, but search again [google.com...] and notice how a 302 link is ranking 3rd (minus the indent), when plenty of other sites mention AdSense in content, title, URL, etc. It's a 302 link that took people to the AdSense site. NO content is seen, but Google apparently thinks it's the AdSense site.

GoogleGuy has some explaining to do ;). The 302 problem is not fixed, despite what they said.

12:02 pm on May 26, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 6, 2005
posts:1858
votes: 106


Google 302 redirect Problem in Danish (Denmark) press

Comon, the daily Danish net news site, yesterday (25-05-2005) published an interview with the Danish SEO Mikkel deMib Svendsen, in which he explained the problem and its dimensions, and gave a few examples to illustrate the consequences of 302 redirect hijacking.

The article is in Danish, but if you can read Swedish or Norwegian then you can enjoy the article too... of course :-)

[comon.dk...]

Enjoy!

3:03 pm on May 26, 2005 (gmt 0)

New User

10+ Year Member

joined:Feb 28, 2005
posts:20
votes: 0


Would using a base href tag solve the 302 redirect hijacking problem?
4:18 pm on May 26, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:June 2, 2003
posts:113
votes: 0


Sorry, but I'm trying to follow this thread and I have a basic question that I must have missed along the way: how do you know that having both non-www and www on your site is actually a problem that requires you to do a 301 redirect of non-www to www?
7:03 pm on May 26, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


If you serve the same content both as domain.com and as www.domain.com then you are supplying duplicate content. Google will list one and delist the other, randomly, for each page.

That is, some pages will be listed as belonging to domain.com, whilst other pages will be listed as belonging to www.domain.com. That immediately shows a problem: if www.domain.com/page1.html links to www.domain.com/page2.html, but for page 2 it is actually domain.com/page2.html that is listed, then it isn't getting any PageRank from the page1 entry, is it?

Listings will be unstable in the SERPs and pages will drop in and out at random. Additionally, many of the pages will show as URL-only listings rather than fully indexed. None of these things helps your site.

The redirect will fix things. It takes about 4 to 6 weeks or so. It is very easy to set up, especially so if you are using Apache servers. It must be a 301 redirect.
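For reference, here is a minimal sketch of what that 301 can look like in an Apache .htaccess file. This assumes mod_rewrite is enabled and that www is the canonical version; example.com is a placeholder for your own domain.

```apache
# Hypothetical .htaccess sketch: send every non-www request to the
# www hostname with a permanent (301) redirect.
# Assumes mod_rewrite is enabled; substitute your own domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag is the important part: without it, RewriteRule issues a 302 by default, which is exactly the status code this thread is about avoiding.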

6:20 pm on May 27, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


how do you know that having both non-www & www on your site is actually a problem that requires you to do a 301 redirect of non-www to www?

Use a server header checker. Type in both versions of your URL; each should return 200 or 301, never 302.
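If you'd rather do that check yourself, here is a minimal sketch using Python's standard library. It issues one request per hostname and reports the raw status code without following redirects; example.com is a placeholder for your own domain.

```python
# Minimal server header check (sketch): report the raw status code
# for each hostname variant, with redirects NOT followed, so a 302
# cannot be masked by the page it redirects to.
import http.client

def fetch_status(host, path="/"):
    """Return the HTTP status code of one request; redirects are not followed."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("HEAD", path)
    status = conn.getresponse().status
    conn.close()
    return status

def canonicalization_ok(status):
    """200 (canonical host) or 301 (permanent redirect) are fine; 302 is the red flag."""
    return status in (200, 301)
```

Calling fetch_status("example.com") and fetch_status("www.example.com") should each give a code for which canonicalization_ok returns True; a 302 on either is the problem discussed in this thread.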

Could someone who reads Danish please summarize for us what the google rep said about this?

7:43 pm on May 27, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 6, 2005
posts:1858
votes: 106


Reid

>Could someone who reads Danish please summarize for us what the google rep said about this? <

In short ;)

"You can't always trust Google search engine. A defect in Google makes it possible to manipulate search results and hijack others websites. But Google dismiss the existence of the problem and refuse to do anything about it.
Several Danish firms, which wish to remain anonymous, have experienced that their websites have been hijacked at Google."

In the article, Danish SEO Mikkel deMib Svendsen explained the 302 redirect problem in the same manner as it's explained throughout the threads at the forums of webmasterworld.com.

8:18 pm on May 27, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 15, 2003
posts:2412
votes: 5


>> Could someone who reads Danish please summarize for us
>> what the google rep said about this?

Uhm.. I think there's been a slight misunderstanding here. Mikkel is definitely not a Google rep, he's on the other side of the table *lol*

No Google rep has said anything about this, except for GoogleGuy in a Slashdot thread a while ago.

8:35 am on May 28, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


Danish SEO Mikkel deMib Svendsen where he explained the problem and its dimensions in addition to giving few examples to illustrate the consequences of 302 redirect hijacking.

Sorry, I thought it said CEO.

So who is Mikkel?
bad-google-monetized-the-web?
google-is-watching-you? or
google-is-broken?

8:55 am on May 28, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 6, 2005
posts:1858
votes: 106


Reid

>So who is Mikkel? <

A very respected and popular SEO specialist in Denmark. He is active on another popular forum, writing under his own name. I haven't had the pleasure of meeting him, though he lives only a few minutes' drive from where I live.

9:54 pm on May 31, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:June 2, 2003
posts:113
votes: 0


Reid, regarding msg#100: both non-www and www return 200 OK codes on a server header check. So this still leaves a few unanswered questions:

When do you know for sure that having both non-www and www is really a problem in google (granted its probably different for each site)?

Has Google fixed this on their own, or will they?

Does google suggest we do this?

Is it a duplicate content penalty issue?

Is it a diluting of PR issue?

Is it a preventative measure to avoid the 302-hijack issue?

Can it hurt you in any way even if you are not affected? I've read that Google mis-indexes pages with 301s.

How will the other big guys treat the 301 changes?

4:19 am on June 1, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


Soquinn
Can it hurt you in any way even if you are not affected?

I know that some webmasters did have their site listed twice (www and non-www) because there was a 302 redirect. This problem came up when we were using the removal tool to remove 302 hijacks that were showing up right within site: results.
After that, a large dot-com was dropped from Google and it was 'news'. Google made a statement that it should be a 301 from non-www to www (or vice versa), and not a 302.
This created a frenzy at WW among webmasters who were being affected by '302 hijacks'. They found that they too had a 302 from non-www to www, so they fixed it.
But they still had these duplicate results, so instead of waiting it out, they tried removing the non-www version using the Google removal tool. They succeeded in removing themselves entirely from Google.
Then another 'tweak' came down the pipe and 301s started misbehaving.
For me it was some URLs which I had deleted a year before and thought were long dead and gone. I had 301s pointing any stray requests to a similar page.
Well, these old pages were suddenly resurrected in the index, with the 'similar page' as a cache. Duplicate content. I nuked those.
Other webmasters also commented on old domains reappearing due to 301s.
So in retrospect I have not encountered any test cases for the non-www vs. www issue, other than watching others go through it at WW. But I would say: look, think, use a few different bot simulators, then look and think again if you are having a problem with this (with no obvious 302 in sight).
And above all, be very patient, because after any tweak you make it could take 6 weeks to see it work.

4:44 am on June 1, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


...and another thing.
The server header checker itself may not be showing the 302 redirect, but rather a 200 after the redirect has been followed, so try a few different header checkers too.
One tool I would try on this is a code browser, which will show any redirect. It won't show up on all header checkers.
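The point about checkers that silently follow redirects can be made concrete with a small sketch: walk the chain one hop at a time and record every status code, so an intermediate 302 can't hide behind a final 200. The fetch function here is injected (URL in, status and Location out), so it could be backed by http.client in real use; the URLs in the demo table are purely illustrative.

```python
# Sketch: follow a redirect chain hop by hop, recording each status
# code, so an intermediate 302 is visible even if the final hop is 200.
# `fetch` maps url -> (status, location) and is injected so it can be
# a real HTTP call or, as below, a fake table for illustration.
def redirect_chain(url, fetch, max_hops=10):
    chain = []
    for _ in range(max_hops):
        status, location = fetch(url)
        chain.append((url, status))
        if status in (301, 302, 303, 307) and location:
            url = location  # keep walking the chain
        else:
            break
    return chain

# Fake fetcher standing in for real HTTP requests (hypothetical URLs):
fake_web = {
    "http://example.com/": (302, "http://www.example.com/"),
    "http://www.example.com/": (200, None),
}
chain = redirect_chain("http://example.com/", lambda u: fake_web[u])
# The 302 on the first hop shows up in `chain` even though the
# destination answers 200, which is what a naive checker reports.
```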
9:07 pm on June 2, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:June 2, 2003
posts:113
votes: 0


Reid, check out GoogleGuy's post msg 7:

[webmasterworld.com...]

Looks like the 301 from non-www to www (or vice versa) is recommended for consistency either way, and could be interpreted as a fix or a preventative measure. He also mentions:

I've been aching for a long time to mention somewhere official that sites shouldn't use "&id=" as a parameter if they want maximal Googlebot crawlage, for example. So many sites use "&id=" with session IDs that Googlebot usually avoids urls with that parameter, but we've only mentioned that here and on a few other places.

This is new to me, so I'm just wondering: if you have a script that uses that format, can you change it, and if so, to what? Is it just the literal string "id" that is the problem, or the way the script itself works?

10:23 pm on June 2, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


I only use those for affiliate links, and I block Googlebot in robots.txt from following them. It seems to crawl my pages with those links on them fine.
11:13 am on June 4, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


I think what he's saying here is that &id= is so commonly used for session IDs that Googlebot will actually flag the string "&id=" and not follow it.
As for alternatives, they are still dynamic URLs, which are OK (but make sure they work, because dynamic URLs can trip up bots). The key point: if you are using scripts that should NOT be indexed as pages, then disallow them, because if you don't, Googlebot will likely treat them as "pages".
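A hypothetical robots.txt fragment along those lines might look like this. The /cgi-bin/ path and the wildcard pattern are purely illustrative, and note that wildcard matching in Disallow is a Googlebot extension, not part of the original robots.txt standard.

```
# Hypothetical robots.txt sketch: keep Googlebot away from script
# URLs that should not be indexed as pages. Paths are placeholders.
User-agent: Googlebot
Disallow: /cgi-bin/
Disallow: /*&id=
```

Other crawlers that do not support the wildcard extension will ignore the last line, so path-based Disallow rules are the safer baseline.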
9:37 am on June 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 28, 2002
posts:3468
votes: 18


I'm not sure, but like all of you I was also hijacked and had troubles with the Google 302 bug. Over the last 3-4 days, though, I have seen Googlebot back, spidering maybe 5-10% of the site. Before, I had only 5% (130 pages) of my site indexed; now there are 320 pages indexed. I HOPE this could be the start of a reindexing of the site.

What about you: have you seen Googlebot again, or anything else?

3:08 pm on June 5, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 8, 2005
posts:146
votes: 0


Magic... In the last day or so, I have seen all other sites disappear from our listing when using the "allinurl" command. Now it is back to just URLs from our site. Hmmm... Has Google just made these sites not appear? Or are they really "gone"? Meaning, they have no effect on rankings or duplicate penalty problems. When I used to look at the cache for these sites, they would show an exact duplicate of our homepage.
5:20 am on June 6, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


Magic... In the last day or so, I have seen all other sites disappear from our listing when using the "allinurl" command. Now it is back to just URLs from our site. Hmmm... Has Google just made these sites not appear? Or are they really "gone"?

We are in the middle of an update.
URL-only means that Google is aware of the page but has not indexed it yet.
If it loses its description, it has likely already been re-crawled and the index is updating. Just wait it out.

Great news, Zeus. Hopefully this update will be like riding a tsunami.

5:54 am on June 6, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Oct 6, 2004
posts:216
votes: 0


Since my sites were penalized (for whatever reason), I have seen Googlebot spider a little here and there, but not like it used to before the sites were penalized.

I did have a new site come out of the sand this update. It took about 6 months. This is the first new site I have built to come out of the sandbox, so I am happy for that. I do have new sites that are older than 6 months that are still getting nothing from Google.

The only thing I did differently for this new site is that I never promoted it at all. No link campaign, no swapping links, nothing. I did link to it from the home page of one of my PR5 sites, but that's it.

Also, I deliberately didn't SEO it very well, to see if it would make any difference in ranking. Basically I just looked at the top 10 websites on Google for a keyword, wrote down everything the top-ranking sites were doing, then took the most common things and applied them to the new site.

For example, most of the top-ranked sites had their keyword in bold tags instead of H1 tags. I also blocked all robots from the site except Yahoo, MSN, and Googlebot. I don't know if that had anything to do with it.

But I do think my lack of SEO kept my pages ranking low in Yahoo, so that I wouldn't end up on thousands of scraper sites and have Google penalize my site for gaining link popularity too fast. All of my other new sites are ranking #1 in Yahoo but still sandboxed in Google.

I seriously think Google is penalizing for gaining links too fast. I also think they penalize if your site is down when they crawl it, and if you make site-wide changes. I also think they are penalizing for only paying for a year on your domain name, but I can't prove that one.

Google is way too sensitive these days. I think the only way to make it with Google is to block all robots except for Google and then slowly work on trading links.

9:14 am on June 6, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 28, 2002
posts:3468
votes: 18


Reid - well, now I'm not so sure it's great news. Today I did a site:mydomain.com search and there were only 37 pages, all supplemental. Then I clicked the "include omitted results" link and it said 681 pages, which WOULD have been good news, but I was only able to see 2 pages, and A LOT of those were supplemental results from Nov. 2004. So now I'm not really sure if I should be happy or just go back to my googlebug cave.
9:58 am on June 6, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 28, 2002
posts:3468
votes: 18


One thing more: it seems to change between the low count and, sometimes, the new 320-page index, which is still very low, but better than it has been for quite some time.
5:39 pm on June 6, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 16, 2004
posts:693
votes: 0


One thing more: it seems to change between the low count and, sometimes, the new 320-page index, which is still very low, but better than it has been for quite some time.

You are just picking up different data centers. A little longer and the new, better results should propagate to all of them.
7:52 pm on June 6, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 28, 2002
posts:3468
votes: 18


Reid - yes, I know it's different data centers, but it just scared me a little because I have never seen so few indexed pages, so it could be the new version of Google. But I also have not seen Googlebot for 2 days, so I will go back to my negative mood again. I would have loved to come to the WW Conference, but I have promised myself: not before my main site is back.
10:05 am on June 7, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 28, 2002
posts:3468
votes: 18


Today there are 527 pages indexed, but it looks like most of the extra pages are pure supplemental pages. WHY don't they update their supplemental database? Or is that also one of their ways to add "sites" to their index, like all those 302 links that are also counted as sites?
11:06 pm on June 13, 2005 (gmt 0)

Junior Member from US 

10+ Year Member

joined:Mar 16, 2004
posts:75
votes: 0


This whole 302 redirect mess has me totally confused.

From everything I've read, it sounds like the 302 exploit hasn't been cleanly addressed. At the same time every outbound link on Yahoo is a 302 and Google is using 302s for every search result on Google.se...

So, what's the current deal with 302s? Is damage still being done by sites using 302s?

This 126 message thread spans 5 pages.