GoogleGuy has some explainin' to do ;). 302 is not fixed, despite what they said.
Comon, the Danish daily IT news site, yesterday (25-05-2005) published an interview with the Danish SEO Mikkel deMib Svendsen, in which he explained the problem and its scale and gave a few examples to illustrate the consequences of 302 redirect hijacking.
The article is in Danish, but if you can read Swedish or Norwegian then you can enjoy the article too... of course :-)
That is, some pages will be listed as belonging to domain.com, whilst other pages will be listed as belonging to www.domain.com. That immediately shows a problem: if www.domain.com/page1.html links to www.domain.com/page2.html, but it is actually domain.com/page2.html that is listed for page 2, then page 2 isn't getting any PageRank from the page1 link, is it?
Listings will be unstable in the SERPs and pages will drop in and out at random. Additionally, many of the pages will show as URL-only listings rather than fully indexed entries. None of this helps your site.
The redirect will fix things, though it takes about 4 to 6 weeks to take effect. It is very easy to set up, especially if you are using an Apache server, and it must be a 301 redirect.
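If it helps, here is a minimal sketch of the sort of .htaccess rules I mean, assuming mod_rewrite is available and you want www to be the canonical version (swap the hostnames for the opposite choice; domain.com is just a placeholder):

  RewriteEngine On
  # Permanently (301) redirect any non-www request to the www hostname
  RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
  RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]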
How do you know that having both non-www and www on your site is actually a problem that requires a 301 redirect of non-www to www?
Use a server header checker. Type in both versions of your URL; it should return 200 or 301, never 302.
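If you don't have a header checker handy, a quick Python sketch does the same job (standard library only; domain.com is a placeholder):

  # Print the status each hostname variant returns for "/"
  import http.client

  for host in ("domain.com", "www.domain.com"):  # placeholder hostnames
      conn = http.client.HTTPConnection(host)
      conn.request("HEAD", "/")
      resp = conn.getresponse()
      # 200 = served directly, 301 = permanent redirect (fine), 302 = trouble
      print(host, resp.status, resp.getheader("Location"))
      conn.close()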
Could someone who reads Danish please summarize for us what the Google rep said about this?
>Could someone who reads Danish please summarize for us what the Google rep said about this? <
In short ;)
"You can't always trust Google search engine. A defect in Google makes it possible to manipulate search results and hijack others websites. But Google dismiss the existence of the problem and refuse to do anything about it.
Several Danish firms, which wish to remain anonymous, have experienced that their websites have been hijacked at Google."
In the article, Danish SEO Mikkel deMib Svendsen explained the problem of 302 redirects in the same manner as it is explained in the threads on the webmasterworld.com forums.
Uhm.. I think there's been a slight misunderstanding here. Mikkel is definitely not a Google rep; he's on the other side of the table *lol*
No Google rep has said anything about this, except for GoogleGuy in a Slashdot thread a while ago.
>an interview with the Danish SEO Mikkel deMib Svendsen, in which he explained the problem and its scale and gave a few examples to illustrate the consequences of 302 redirect hijacking <
Sorry, I thought it said CEO.
So who is Mikkel?
When do you know for sure that having both non-www and www is really a problem in Google (granted, it's probably different for each site)?
Has Google fixed this on their own, or will they?
Does Google suggest we do this?
Is it a duplicate content penalty issue?
Is it a dilution of PR issue?
Is it a preventative measure to avoid the 302 hijack issue?
Can it hurt you in any way even if you are not affected? I've read that Google mis-indexes pages with 301s?
How will the other big guys treat the 301 changes?
>Can it hurt you in any way even if you are not affected? <
I know that some webmasters did have their sites listed twice (www and non-www) because there was a 302 redirect. This problem came up when we were using the removal tool to remove 302 hijacks that were showing up right within site: results.
After that a large dot-com was dropped from Google and it was 'news'. Google made a statement that it should be a 301 from non-www to www (or vice versa), not a 302.
This created a frenzy at WW among webmasters who were being affected by '302 hijacks'. They found that they too had a 302 from non-www to www, so they fixed it.
But they still had these duplicate results, so instead of waiting it out, they tried removing the non-www version using the Google removal tool. They succeeded in removing themselves entirely from Google.
Then another 'tweak' came down the pipe and 301s started misbehaving.
For me it was some URLs I had deleted a year before and thought were long dead and gone. I had 301s pointing any stray requests to a similar page.
Well, these old pages were suddenly resurrected in the index, with the 'similar page' as the cache. Duplicate content. I nuked those.
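For reference, the kind of rule I'm talking about is a one-liner in Apache config (the file names here are just placeholders):

  # Permanently redirect a deleted page to its closest surviving equivalent
  Redirect 301 /old-deleted-page.html http://www.domain.com/similar-page.html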
Other webmasters were also commenting on old domains reappearing due to 301s.
So in retrospect I have not encountered any test cases for the non-www vs. www issue, other than watching others go through it at WW. But I would say: look, think, run a few different bot simulators, then look and think again if you are having a problem with this (with no obvious 302 in sight).
And above all, be very patient, because after any tweak you make it could take 6 weeks to see it work.
Looks like the 301 from non-www to www (or vice versa) is recommended for consistency either way, and could be interpreted as a fix or a preventative measure. He also mentions:
I've been aching for a long time to mention somewhere official that sites shouldn't use "&id=" as a parameter if they want maximal Googlebot crawlage, for example. So many sites use "&id=" with session IDs that Googlebot usually avoids urls with that parameter, but we've only mentioned that here and on a few other places.
This is new to me, so I'm just wondering: if you have a script that uses that format, can you change it, and if so, to what? Is it just the word "id" itself, or the way the script uses it?
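For what it's worth, the workaround people usually describe is renaming the parameter in the script and 301-ing the old URLs so you don't lose what is already indexed. A rough Apache sketch (script.php and the new parameter name "item" are made up for illustration; other query parameters are dropped in this simple version):

  RewriteEngine On
  # Permanently redirect old ?id=123 URLs to the renamed parameter
  RewriteCond %{QUERY_STRING} (^|&)id=([^&]+)
  RewriteRule ^script\.php$ /script.php?item=%2 [R=301,L]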
What about you, have you seen Googlebot again, or anything else?
Magic... In the last day or so I have seen all other sites disappear from our listings when using the "allinurl:" command. Now it is back to just URLs from our site. Hmmm... Has Google just made these sites not appear, or are they really gone?
We are in the middle of an update.
URL-only means that Google is aware of the page but has not indexed it yet.
If it loses its description, it has likely already been re-crawled and the index is updating. Just wait it out.
Great news Zeus - hopefully this update will be like riding a tsunami.
I did have a new site come out of the sand this update. It took about 6 months. This is the first new site I have built to come out of the sandbox, so I am happy about that. I do have new sites that are older than 6 months and still getting nothing from Google.
The only thing I did differently for this new site is that I never promoted it at all. No link campaign, no swapping links, nothing. I did link to it from the home page of one of my PR5 sites, but that's it.
Also, I deliberately didn't SEO it very well, to see if it would make any difference in ranking. Basically I just looked at the top 10 websites on Google for a keyword, wrote down everything the top-ranking sites were doing, and then applied the most common things to the new site.
For example, most of the top-ranked sites had their keyword in bold tags instead of H1 tags like most pages. I also blocked all robots from the site except Yahoo, MSN, and Googlebot. I don't know if that had anything to do with it.
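For anyone curious, the robots.txt I mean looks roughly like this (the Yahoo and MSN user-agent tokens, Slurp and msnbot, are from memory, so double-check them):

  # Allow Google, Yahoo and MSN; block everyone else
  User-agent: Googlebot
  Disallow:

  User-agent: Slurp
  Disallow:

  User-agent: msnbot
  Disallow:

  User-agent: *
  Disallow: /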
But I do think my lack of SEO kept my pages ranking low in Yahoo, so I didn't end up on thousands of scraper sites, which would have caused Google to penalize my site for gaining link popularity too fast. All of my other new sites rank #1 in Yahoo but are still sandboxed in Google.
I seriously think Google is penalizing for gaining links too fast. I also think they penalize if your site is down when they crawl it, and if you make site-wide changes. I also think they are penalizing for only paying for a year on your domain name, but I can't prove that one.
Google is way too sensitive these days. I think the only way to make it with Google is to block all robots except Google's and then slowly work on trading links.
One thing more: it seems to change from the low count to the new 320-page count at times, which is still very low, but better than it has been for quite some time.
From everything I've read, it sounds like the 302 exploit hasn't been cleanly addressed. At the same time, every outbound link on Yahoo is a 302, and Google uses 302s for every search result on Google.se...
So, what's the current deal with 302s? Is damage still being done by sites using 302s?