Forum Moderators: Robert Charlton & goodroi


Site Disappeared After 301 Redirect

301 redirect question

         

robdawg

9:10 pm on Jun 12, 2005 (gmt 0)

10+ Year Member



We run Resin, and we had to write a servlet to redirect the non-www to the www, so at first I didn't know if it returns a 301. If you type in the non-www, it displays the www version, but what I can't figure out is why the domain totally disappeared. Is that normal? The site still has a cache in Google, and the spider came yesterday. Should I just wait it out? Any ideas? I've used WebBug and everything returns the 301 and 200 correctly. I'm stumped!
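For anyone writing this by hand, the decision the servlet has to make is small enough to sketch in a few lines. This is language-agnostic logic shown in Python, and "www.example.com" is a placeholder for your canonical hostname:

```python
# Sketch of the non-www -> www decision a redirect servlet (or any
# handler) would make. "www.example.com" is a placeholder domain.
def canonical_redirect(host: str, path: str,
                       canonical_host: str = "www.example.com"):
    """Return (status, location) if a redirect is needed, else None."""
    if host.lower() == canonical_host:
        return None  # already canonical: serve the page with a 200
    # Permanent (301) redirect so search engines consolidate the hosts
    return (301, f"http://{canonical_host}{path}")
```

In a servlet you would send this with `response.setStatus(301)` plus a `Location` header, then confirm with a header checker that the non-www URL really returns a 301 and the www URL a 200.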

robdawg

6:18 pm on Jun 13, 2005 (gmt 0)

10+ Year Member



Any help would be great!

bumpski

12:38 am on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm afraid I don't have any help, but I recently put in the 301 redirect as well, and coincidentally most of my pages have gone URL only (80% at one point, now 70% URL only, or "partially indexed"). I saw no evidence of Googlebot ever seeing a 301 redirect response, but I do know the code worked.
So many people are reporting site URL-only problems that so far I'm going with coincidence.
If your site's URLs in Google were not correct, or not as desired, I could see how you could have a big problem. In my case Google showed all my pages as "www.", which is what I desired and redirected to, so it should have had little impact. My home or root page was the only page appearing at both the www and non-www URL; it's a pretty important page, and both copies are still indexed at both addresses after many crawls.
Unfortunately, the default logging for most web servers doesn't show the domain! This might be changeable in .htaccess, but it could mess up statistics reports.
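For Apache users, one way to see which hostname was actually requested is a custom log format that includes the Host header. This is an illustrative config sketch only (the original poster runs Resin, not Apache); adjust names and paths to your own setup:

```apache
# httpd.conf or a vhost block: %{Host}i logs which hostname
# (www or non-www) the visitor actually requested
LogFormat "%{Host}i %h %l %u %t \"%r\" %>s %b" vhost_common
CustomLog logs/access_log vhost_common
```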

I think the last time Google crawled my site with a non-www URL was April 20th. Oddly enough, I see evidence of this same quirk on another of my sites; I'm wondering what happened on or about April 20th.

Hope this helped a little.

robdawg

3:20 am on Jun 17, 2005 (gmt 0)

10+ Year Member



Anyone with any ideas?

Stefan

3:29 am on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Perhaps your most valuable incoming links were to the non-www version?

Using the www is only because this seems to be the version most users are familiar with; one could redirect to the non-www version just the same, if that is what most of the incoming links point to. All these threads on 301s lately, and the supposed need for a www, have maybe got some people a little confused about the reasons for it. One just wants consistency; there's no inherent need for a www subdomain.

Have a look at your backlinks, with special attention to the PR of the page they're coming from, and the URL version. If the best ones are without www, then reconsider your redirect choice.

jdMorgan

3:36 am on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you install a 301 redirect, be sure to check it here [webmasterworld.com]. If it's returning anything but a 301, delete it!

The same goes for 403, 404, and 410 responses using custom ErrorDocuments -- It's a good idea to check them.

I second what Stefan mentioned. Don't rely solely on a 301; try to get as many incoming links corrected as possible.

Jim

digitalv

3:38 am on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



So does PR forward with a 301?

SEOtop10

3:57 am on Jun 17, 2005 (gmt 0)

10+ Year Member



Yes, a 301 redirect forwards the value transferred by a link.

I strongly recommend that you check both the www and non-www URLs with the server header check utility. A client recently set up a 301 redirect for the non-www successfully; however, his www version then returned a 302 redirect, which is a fix worse than the problem itself.

robdawg

5:11 am on Jun 17, 2005 (gmt 0)

10+ Year Member



Thanks, guys. Three days after we instituted the 301, our site totally disappeared from the SERPs. It can't be found at all, not even with site:www. We still have a cache in the Google Toolbar, but I can't figure out why we can't be found; I hope we're not banned. We did the 301 because Google showed 3x the pages we actually had, with lots of URL-only links. We were told the best way to fix that is a 301. Now we're gone.

jdMorgan

5:31 am on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This is one of the few situations where the "Submit your site to Google" page is useful. Step back, evaluate your incoming links as described above, check that the correct server headers are being returned, and get some of those inbound links updated. Then submit your main URL to Google once.

The best time to add a non-www to www redirect is before the site goes live. Combined with inconsistent incoming links or (even worse) inconsistent internal linking, there is potential for problems like this. Unless you've used a lot of questionable SEO on your site, this situation should resolve itself, but it will take time. Don't reverse course, or that time will be extended. If this was the right decision for the long term, then the payoff will be worth the wait.

I suffered the loss of 50% of my traffic for several months because of changes like this on one site. Now the number of visits I lost during that time period amounts to only one or two percent of its current traffic levels. So, it's a matter of perspective.

Jim

robdawg

6:21 am on Jun 17, 2005 (gmt 0)

10+ Year Member



I'll try to review as many of the incoming links as possible, but we currently have 22,000 according to Yahoo and 4,000 by MSN. Is it normal for a site to just disappear after 4 years of steady rankings? We are no longer in Google's index. We are a top 5 company in our industry in the US and Canada, so I can't see how we got killed like this.<snip>
Rob

[edited by: lawman at 4:59 pm (utc) on June 17, 2005]

roldar

6:33 am on Jun 17, 2005 (gmt 0)

10+ Year Member



I don't know if the site in question is the domain of your profile email address, but half the links on that particular site say "Template Error...text2_sitemap.html does not exist. Halted."

I would imagine that on such a site those pages would be your SE fodder, since spiders can't use search forms.

sit2510

12:01 pm on Jun 17, 2005 (gmt 0)

10+ Year Member



>>> After we instituted the 301 3 days later our site totally disappeared from the serps. It can't be found at all no site:www

I had a look at site:www.yourdomain.net (from your e-mail profile); there are over 69,000 pages indexed by Google. Just 3 days after you put in the 301 is too short for it to be the issue. IMHO, it is a coincidence.

bumpski

12:28 pm on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Of course, almost 20,000 of the pages are URL only. That could be normal. But the URL-only pages may still be in the SERPs for Title content, though not for page Content.

This is something new I think I'm seeing on my sites: URL-only pages that don't show a title in the SERPs but seem to still show up in the SERPs for keywords in their Title.

I have a unique string in all my titles, and nowadays some URL-only listings are shown even when I search for that unique string in the Title. Just an odd tidbit.

surfer67

1:54 pm on Jun 17, 2005 (gmt 0)

10+ Year Member



I have the same problem. When I do "site:mydomain.com", most of my pages show up with just the URL; there is no title or description showing. If I do a Google search for keywords in the title, the URL shows up in the search results with no title or description. This has been going on since April, and my traffic has been steadily declining to the point where I have now lost approx. 40% of my Google traffic.

Any ideas?

robdawg

3:25 pm on Jun 17, 2005 (gmt 0)

10+ Year Member



Hey, Surfer, thanks for the reply. I don't have a clue why URL only is a problem with G right now.

surfer67

3:32 pm on Jun 17, 2005 (gmt 0)

10+ Year Member



Neither do I. My site is 5 years old and has always ranked well with Google.

Do you suppose it has anything to do with server downtime? Although I have an excellent web host, a couple of months ago they experienced a network outage that lasted about 20 hours. I wonder if that had an impact.

bumpski

4:09 pm on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Could it be that your jsvt subdomain looks like it's cloaked? Not that it is.

This is as good a time as any to implement a Google "sitemap.xml" file. Some have claimed their site returned from oblivion shortly thereafter. (It didn't seem to help me, though; I'm still 45% URL only.)

Reid

4:25 pm on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



After changing low-level settings like this, a site can go URL only. I would guess that Google just needs to respider the site; after that, it should rebound to its former glory. This could take some time, though, especially if the site is dynamic.

robdawg

4:34 pm on Jun 17, 2005 (gmt 0)

10+ Year Member



Actually, the site can't be found at all; it was delisted from G after the 301 was instituted. But I digress. The jsvt subdomain hosts our virtual tours, so I don't believe that is a problem.

Reid

4:35 pm on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



robdawg - I found 2 serious problems snooping around that site.

1. A META robots "revisit after 30 days" tag. This could be slowing down the process of re-indexing.

2. Serious errors in the W3C validator: messed-up META tags.

theBear

4:40 pm on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



And there is no working 301 redirect at this time. Both forms return a status 200 when run through a header checker.

In addition, the IP address is open and returning a 404, but that may not be a problem. You might want to run the IP address through a header checker; there may be a 302 involved, depending on how things are set up.

Good luck.

robdawg

5:09 pm on Jun 17, 2005 (gmt 0)

10+ Year Member



The head of our IT dept. just told me that the 301 was taken off to see if any changes happen for a week or so. We don't run Apache, so we want to make sure we do it right and don't cause further damage.

surfer67

5:16 pm on Jun 17, 2005 (gmt 0)

10+ Year Member



Only my dynamic pages, i.e. pages such as "page.asp?ID=38", are URL only. I wonder if this may have had an impact.

Anyhow, I did do an ISAPI rewrite on all my dynamic pages, which now have a "page-38.asp" format. I'm hoping this will resolve the issue. I do have over 3,000 pages, so this may take some time.
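The mapping surfer67 describes is a simple one-to-one rewrite. The actual rule lives in the ISAPI_Rewrite config on the server; the pattern below is an illustrative sketch of the equivalence only (the "page" name is taken from his example):

```python
import re

def pretty_to_query(path: str):
    """Map the rewritten form back to the dynamic one,
    e.g. /page-38.asp -> /page.asp?ID=38."""
    m = re.fullmatch(r"/page-(\d+)\.asp", path)
    return f"/page.asp?ID={m.group(1)}" if m else None
```

If both forms stay reachable, the old query-string URLs should ideally 301 to the new ones so the index doesn't carry duplicates.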

Reid

8:25 pm on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



surfer - I think the second format is better (no parameters). Just check those pages with the WW header checker, robot sim, and HTML validator (for header errors, doctype, etc.), and see if Googlebot is requesting them. If it keeps asking for the same one(s) repeatedly, then it can't crawl them; that should narrow the problem down, if not fix it.

In the headers there are a few things to look at:
Date: the same as the last time you uploaded the file
any robots directives
status code: 200, 404, etc.
Base: the location URL of the file

These are common areas that I pay a lot of attention to for spiderability (is that a new word?).
Not to say that there couldn't be other problems showing up in the header info.
I like that Poodle Predictor tool because it shows the complete header, including Meta, all on one page. But for the return code (or other problems) you might want to try the WW header checker, just to make sure you are not bypassing something with poodlebot (a redirect). I'm not sure how foolproof it is for that.

It'd be good to hear a little tutorial from someone who knows how to read header info for other possible spiderability problems you could locate from that data, other than what I mentioned already.

Reid

8:36 pm on Jun 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



...and I forgot one thing to check: robots.txt.

If you don't have one, then it's not a spiderability issue.
Validate it with the WW tool.
Look at it with "the file(s) that can't be crawled" in mind and see if there is any reason they are being disallowed.
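As an illustrative sketch (the paths here are hypothetical), the thing to watch for is that Disallow rules match by prefix, so a single broad rule can silently block a whole section:

```
# robots.txt at the site root
User-agent: *
Disallow: /cgi-bin/
# A rule like "Disallow: /page" would also block /page-38.asp --
# prefixes match, so check each Disallow against the URLs that went URL only.
```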