| 1:56 pm on Oct 25, 2005 (gmt 0)|
One small point about using a 301: if you are making other changes to the site, you may wish to wait until those changes have been spidered before you add the 301. We had a problem with duplicate content for a .com and a .co.uk site; we lost about half our Google referrers over a two-week period, with the .com pages just disappearing. We decided to change all the URLs in the site to absolute URLs using the .com domain name; we also added a 301 from the .co.uk to the .com, and we set up a little script to monitor which domains Google was spidering. To our chagrin we found the googlebots were only spidering the .co.uk domain and not picking up the new pages at all; they were just getting the redirect URLs and leaving. To make a long story short, we took out the 301, and over the next two weeks Google respidered all the pages (>50,000) and we completely recovered in the index. Now we can put the 301 back and hopefully the problem will not recur.
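For anyone setting up a similar domain-level redirect on Apache, a minimal .htaccess sketch might look like the following (this is an illustration only -- the poster doesn't say what server they use, mod_rewrite is assumed to be available, and the domain names are placeholders):

```apache
RewriteEngine On
# Send every request on the .co.uk host to the same path on the .com host
RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Because the path is captured and appended, each old URL maps to its exact counterpart on the new domain rather than everything landing on the homepage.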
| 2:24 pm on Oct 25, 2005 (gmt 0)|
I appreciate your sharing your problems, but my question is whether what I am seeing (no backlinks for non-www URLs) is the norm now, or does this reflect a problem of some sort?
As I said, the 301 redirects have been in place for a very long time and previously Google saw both www and non-www as the same (ie showed the same backlinks for both)
| 4:09 pm on Oct 25, 2005 (gmt 0)|
Tomapple, I just realized I have the same thing, but I don't know if it's a problem. I think the only one who could answer this is GoogleGuy or Google themselves.
301 update for me,
My site is now showing a PR5 for the homepage and many interior pages but as of today still not appearing in the SERPS.
The old domain is still in the supplement index.
Index count is 13,300
Gbot is putting the smack down on me a couple of times a day.
My HTTPS pages are showing a grey PR bar.
My allinanchor searches are mostly in the top 10 (what does this mean anyway... is this supposed to represent your actual position?).
My only Google traffic is from images!
| 4:42 pm on Oct 25, 2005 (gmt 0)|
Modemmike, thanks for the insight regarding your 301 issues.
All of the sites with 301s are still showing in the SERPs. With traffic being more or less the same as it was 2 weeks ago (pre Jagger). Only one of the sites has seen a reduction in Google traffic (down about 43% from pre Jagger).
At this time I am just trying to determine why the non-www URLs no longer show any backlinks (and of course if this issue is going to result in further problems down the road).
| 4:46 pm on Oct 25, 2005 (gmt 0)|
Here's a question for ya:
Say you have a high-ranking website with thousands of backlinks, and you buy out another website/company and take possession of it. Would having that site 301 redirect to your high-ranking website cause problems, or even land you in the sandbox again!?! Maybe it would be better to just post "our new link is at:" followed by the link to your high-ranking website? Google needs to fix this 301 problem.
| 5:44 pm on Oct 25, 2005 (gmt 0)|
How do you recommend setting up the 301? I set up some 301s on a domain, page-per-page redirects, for sites we wanted to get out of the Google and Yahoo indexes and replaced with our main site. But we have smaller domains that never performed well and have no ranking. Can I just place a 301 on the entire domain and redirect it to a relevant section index? The sites have no real importance and I want to get it done without any possible penalties. Which is safer: page redirects or domain-wide?
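To illustrate the difference between the two approaches on Apache (the domain, paths, and section name here are hypothetical; an IIS setup like the poster's would use its own redirect mechanism):

```apache
# Page-by-page: each old URL maps to a specific new URL
Redirect 301 /old-page.html http://www.example.com/widgets/new-page.html

# Domain-wide: send every request on this host to one section index
RedirectMatch 301 ^/.*$ http://www.example.com/widgets/
```

Page-by-page preserves per-page relevance but requires maintaining a list; the domain-wide rule is one line but funnels all old URLs to a single target.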
| 8:24 pm on Oct 25, 2005 (gmt 0)|
pteam, Google is working on new 301 code according to Matt Cutts blog:
|we may be due to replace the code that handles that in the next couple months or so. If it's really easy for you to wait a couple months or so, you may want to do that; it's always easier to ask crawl/index folks to examine newer code than code that will be turned off in a while. |
I would not risk another redirect for any reason so I would just place a link to the new site and 404 all the pages.
marty98, if you are running ASP I might have some code for you on my personal site that would be of help... sticky mail me for the URL. To answer your question, however: I don't think either method is safer than the other, but consider that 301'ing a single page at a time would only affect that page. Again, it appears Google is working on new 301 code, so I would wait until that is in place before doing anything else.
| 1:40 pm on Oct 26, 2005 (gmt 0)|
In a previous post (inserted below) I mentioned that the 301 redirects we have in place are pointing to http://www.example.com versus http://www.example.com/
Is it possible that the lack of the "/" could be the problem?
Here is my situation.
Several long-existing domains with 301 redirects from the non-www URLs to the www URLs (redirects have been in place for about 2 years).
Backlink checks for www and non-www URL used to always show the same number.
Now a [link:example.com] search for all of them shows zero (0) backlinks.
They are all hosted on IIS servers and the 301 redirects are to http://www.example.com and not to http://www.example.com/
Any insight into this?
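For comparison, on Apache the non-www to www redirect with an explicit trailing slash on the root target would look something like this (the poster's sites are on IIS, so this is only an illustration, and the domain is a placeholder):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# For a request to the bare domain, $1 is empty, so the target
# becomes http://www.example.com/ -- with the trailing slash
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]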
| 6:47 pm on Oct 26, 2005 (gmt 0)|
modemmike said: I have always speculated whether .html got better ranking... your post seems to confirm that... with IIS you can make .html pages process as ASP, so I wonder if it's worth the trouble.
Our programmer wants to use ASP on our retail site product pages, rather than the existing HTML, mainly because he wants to handle the basket function without the need for a cookie (there are some other reasons, but I don't have a great understanding of programming issues). But I do know it is very disruptive to Google results when page URLs change.
Would appreciate it if you could expand on your above comment, or point me in the right direction for getting more info.
| 10:04 pm on Oct 26, 2005 (gmt 0)|
With IIS, right-click the domain name, click Properties, click the Home Directory tab, click Configuration, click Add; the executable is C:\WINDOWS\system32\inetsrv\asp.dll, and the extension is .htm or .html (or add both)... this should force IIS to process .htm files as ASP. I will sticky you a link to my personal site that has screen captures of the process.
With Apache, using an .htaccess file you can try the following (note that Apache has no built-in ASP support, so these directives only take effect if a third-party ASP handler is installed and mapped to that MIME type):
AddType application/x-httpd-asp .html
AddType application/x-httpd-asp .htm
I haven't tested these examples fully but this will get you on the right track.
Some control panels will also let you do this sort of thing... I believe it's considered mime types...
Let us know how it works out for you...
| 10:11 pm on Oct 26, 2005 (gmt 0)|
Thanks very much for that - will let you know how it goes.
| 1:44 pm on Oct 27, 2005 (gmt 0)|
Has anyone had problems with 301 redirects going to http://www.example.com (no /) as opposed to [www.example.com...] (with /) impacting backlinks?
| 8:52 pm on Oct 27, 2005 (gmt 0)|
Would really appreciate some feedback on my 301 redirect issue. We are not moving domain names, not using subdomains, not dealing with non-www redirects. What we are doing is launching a brand new site. The whole structure of the site is staying the same. The only change is that the URLs will be going from /this_is_the_page.php to /this-is-the-page.php
From my experience I would say that this should go over pretty easily if we just 301 the old underscore pages to the new matching hyphen pages.
Can anyone say if they see a problem with this? Or how they would do it? The site has a very high PR, and I just think that if we let the old pages 404 and die out while waiting for the new hyphenated pages to index, we could really set ourselves back.
Let me know - thanks
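One way to handle the underscore-to-hyphen switch in bulk is a rewrite rule like the sketch below (assuming Apache with mod_rewrite; the poster's site runs PHP, and the pattern is generic rather than tied to any specific page names). It replaces one underscore per pass, and the rule simply fires again on the redirected URL until no underscores remain, so a URL with several underscores produces a short chain of 301s:

```apache
RewriteEngine On
# Replace the first underscore in the path with a hyphen and issue a 301;
# the rule re-fires on each redirected request until no underscore is left.
RewriteRule ^([^_]*)_(.*)$ /$1-$2 [R=301,L]
```

If the set of URLs is small, an explicit `Redirect 301` line per page avoids the multi-hop chain entirely.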
| 9:22 am on Nov 5, 2005 (gmt 0)|
4 months down the track?
All Google rankings are back after 301ing from .com to .com.au.
Incoming links are not pointed at the new site; they are still pointing at the old site.
| 10:07 pm on Nov 5, 2005 (gmt 0)|
Check [184.108.40.206 ] for latest Jagger3 results.
I don't see any fixes for www vs. non-www and for the ancient supplemental results beyond listing fully indexed pages first and URL-only listings last.
| 4:37 am on Nov 6, 2005 (gmt 0)|
I'm not sure if this is the same situation, but I have 20 or 30 pages that are now listed in my .htaccess file where I used underscores and changed from /gift_baskets_canada_basketname.asp to /gift-baskets-canada-basketname.asp by using a 301 redirect.
It's a 5 year old site with good rankings...am I risking my serps by doing this?
BTW, how long do I need to keep a 301 redirect in my .htaccess before I can remove it? 99% of my IBLs are to my domain (http://www.grenvillestation.com/) and not subpages. I also use Google Sitemaps regularly.
| 8:17 am on Nov 6, 2005 (gmt 0)|
You keep it running until no search engine lists the "wrong" URL, and (by looking at your logs) you see no visitors arriving there from bookmarks or from other sites.
| 12:28 pm on Nov 6, 2005 (gmt 0)|
I guess if you were acquiring IBLs for interior pages, then you'd have to deal with getting those changed as well...
Is 20 or 30 301 redirects a lot? What's the most anyone's ever seen (without it being a complete site change to another domain)?
| 3:05 pm on Nov 6, 2005 (gmt 0)|
Most of you are talking about a 301 redirect from an old domain to a new domain...
What about using a 301 redirect for redirecting old pages (that are still indexed) to new pages with similar content?
Is it better to do so, or should I delete them and use 404, or maybe just leave it as it is?
Thanks in advance,
| 5:45 pm on Nov 6, 2005 (gmt 0)|
If the "old" pages are listed in search engines, have incoming links, and attract visitors then use the 301 redirect to get the page out of the search results, while still channeling visitors to the correct page for all of the time that you still have entries in search engine results and links from other sites. When the search engine results have indexed the new pages and you have updated all your incoming links to point to the new pages, check to see if any visitors still arrive via the redirect. As soon as there are none, you can remove the redirect.
| 7:19 am on Nov 7, 2005 (gmt 0)|
Thank you for your answer g1smd,
Is that correct for all search engines? G/Y/MSN?
| 8:19 pm on Nov 7, 2005 (gmt 0)|
Yes. Keep the redirect in place until all search engines list the new URL, and no real visitors are coming in via the old URL.
| 8:46 pm on Nov 7, 2005 (gmt 0)|
Is it ok to have 50 redirects? What about 1000? From an SE perspective, does having too many at some point negate the benefit of the redirect?
| 8:51 pm on Nov 7, 2005 (gmt 0)|
Server performance may start to slow, if you have a massive list of redirects.
I have no idea as to how many might be too many - I haven't seen anyone ever post to say that they had that particular problem.
| 12:56 am on Nov 8, 2005 (gmt 0)|
Today I had my service provider set up a 301 permanent redirect in IIS from [domainname.com...] to [domainname.com...] to avoid the duplicate content issue.
I just checked the server headers using:
and it is returning this:
#1 Server Response: [domainname.com...]
HTTP Status Code: HTTP/1.1 301 Error
#2 Server Response: [domainname.com...]
HTTP Status Code: HTTP/1.1 200 OK
Is the #1 Server Response okay, i.e. "301 Error"?
Should it not be:
#1 Server Response: [domainname.com...]
HTTP Status Code: HTTP/1.1 301 Moved Permanently
As you can see from the #2 Server Response, the redirect is working okay (200 OK).
Thanks for your help!
| 6:17 am on Nov 8, 2005 (gmt 0)|
bwprice, yes, it should read "HTTP/1.1 301 Moved Permanently". I used to do the redirect via my server in IIS but moved to a code approach, and you may want to as well. I can sticky you some ASP, or do a search for whatever flavor of code you are using.
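For reference, the raw response for the redirected request should look roughly like this (the status line is per HTTP/1.1; the Location host is a placeholder -- "301 Error" in the checker's output is most likely just that tool's odd labeling, since the status code itself is what the engines read):

```http
HTTP/1.1 301 Moved Permanently
Location: http://www.domainname.com/
```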
| 11:47 am on Nov 9, 2005 (gmt 0)|
I'm having a similar problem to tomapple.
I have a very well established site, rankings were fine; noticed I had [site.com...] and [site.com...] home pages listed in Google.
I set up a 301 to redirect http:// to [www...] on Nov. 6. I used some code in another thread on Jagger as I was under the impression this "double" listing could indicate trouble ahead if I didn't fix it.
The redirect is working fine, correct header responses etc, but I'm now noticing:
is returning 0
returns the usual number of backlinks.
Added to this, when I search for my site name (not domain), I'm presented with my DMOZ title on the link and the DMOZ description, then one of my sub-pages underneath that (with correct page title and description).
If I search on my domain name, I only get the http:// listing - no fresh tag either. This is happening on most DC's.
Rankings have plummeted for most terms. For those of you who've done this type of redirect before, should I reverse the .htaccess fix, or is this normal behavior and I should just wait it out? Is it the same sort of wait as on a usual single-page redirect?
I've performed 301's before for single pages, but this is a different scenario, behaving totally differently and I'm a little concerned. I wasn't expecting my [site.com...] listing to get buried/virtually disappear, just to get rid of the [site.com...] listing. Any advice would be appreciated.
BTW, Googlebot is very active on the site at the moment.
| 12:31 pm on Nov 9, 2005 (gmt 0)|
Having done this redirect for many sites over the years, I find that Google usually correctly lists the www pages within a week or so, any URL-only ones quickly picking up the title and description, but Google takes much longer to phase out the old non-www results (and never gets rid of any supplemental pages).
Try and find any major incoming links that point to the wrong version and get them amended.
Make sure that your internal linking points to the correct one. Add the <base> tag if necessary - especially to the root index page.
If you link to a folder anywhere on your site, make sure that the URL ends with a trailing / on the link. This is VERY important. Check your site with Xenu LinkSleuth and look for any irregularities.
I would suggest to see where you are at the end of the month...
| 1:24 pm on Nov 9, 2005 (gmt 0)|
Thanks for your response. So a short-term general site-wide ranking drop is not unusual when applying the .htaccess redirect code you recommended for http:// to [www...], even on a well-established site?
| 1:53 pm on Nov 9, 2005 (gmt 0)|
--and never gets rid of any supplemental pages
You're right on target about that one. I have a part number for my widgets that is unique to my site (well, unless the page has been scraped, so it also shows on scraper sites).
If I do a Google search for that part number, results are shown in the following order:
1. A listing for a page that used to list the item. This page was 301'd in March, and Googlebot has revisited it at least 7 times already; it shows as a supplemental page.
2. A page that had this part number, later removed. This page was removed from the index using the Removal Tool in February due to duplicate content; after coming back into the index as supplemental, it has been revisited twice already, where Gbot got a 410 code.
The URI is still there.
3. A message that says: 'In order to show you the most relevant results, we have omitted some entries very similar to the 2 already displayed.
If you like, you can repeat the search with the omitted results included.'
Now if I expand the message:
The first 2 stay the same.
Then comes #3, a non-www version of the page with a cached date of LAST NOVEMBER. The page has returned 301 for the past 9 months and has been revisited at least 3 times by Gbot.
4. A version of the same page with a cached date in June.
The problem with this one is that there was a link to it from a scraper site with 4 Ws, like htt*://wwww.mysite.tld/mypage.cfm -- a supplemental result. If you click on it, the page returns 410.
And then, finally, my beloved widget page as the current version, with a cached date of a couple of days ago.
Notice: pre-Jagger this page ranked #1 out of 2 million or so, whereas now it's at #80 or so.
Interesting. Should I stay or should I go?
| 2:39 pm on Nov 9, 2005 (gmt 0)|
Hmm, same old story: you added the redirects in March, and you see stuff cached from the previous November. I see that a lot.
The way forward? Google to clean up their supplemental index (post #400 [webmasterworld.com]), or in the continuing absence of that, for you to add the <base href="http://www.domain.com/"> tag to all of your pages.
If you use relative linking the <base> tag will need to contain the full URL for the page, not just the domain.
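For example, a page with relative links would carry something like this in its head (the domain and path here are placeholders -- use the full canonical URL of each page):

```html
<head>
  <!-- Relative links on this page now resolve against the www URL,
       so spiders can't attribute them to the non-www host -->
  <base href="http://www.example.com/widgets/blue.html">
</head>
```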