| 11:04 am on May 26, 2005 (gmt 0)|
The usual wisdom is yes, redirect the non-www version; however, people have been reporting problems with Google's handling of 301s recently.
I'd love to hear what experts say.
| 5:04 pm on May 26, 2005 (gmt 0)|
You can have problems with duplicate content if you allow Google to have both versions in the index.
But don't try to remove the wrong version with the URL Console - it removes both versions for six months.
The right solution is a 301 redirect from one version to the other, and Google usually merges them well, but it takes anywhere from a few days to a few weeks. It's essential to leave at least one inbound link to the wrong URL version to ensure Google will crawl the redirect; otherwise the wrong version stays in the index for a very long time.
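On Apache, one common way to set up that 301 is at the virtual-host level. This is a minimal sketch, assuming name-based virtual hosts and that www.example.com (a placeholder for your own domain) is the preferred version:

```
# Hypothetical sketch: answer requests for the bare domain with a
# permanent (301) redirect to the www hostname. "Redirect permanent"
# matches by prefix, so the rest of the requested path is preserved.
<VirtualHost *:80>
    ServerName example.com
    Redirect permanent / http://www.example.com/
</VirtualHost>

<VirtualHost *:80>
    ServerName www.example.com
    # ... normal site configuration goes here ...
</VirtualHost>
```

With this in place, a request for http://example.com/page.html should come back as a 301 pointing at http://www.example.com/page.html.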
| 5:36 pm on May 27, 2005 (gmt 0)|
bwprice is asking whether he should stop his server from accepting requests without the normal 'www' before the domain.
Most people include the 'www' prefix when they type a domain name. But many web-savvy people just type the domain without it, and when I type in a domain without the prefix it annoys me when I can't access the site and have to go back and type 'www' for the billionth time.
I don't know of any ill effects of using the domain name without the 'www', so I would advise you to let your visitors access your site with it.
|Smashing Young Man|
| 9:04 pm on May 27, 2005 (gmt 0)|
I 301 to the non-www version of my site. What is the point in the "www" anyway? What purpose does it serve except to add an extra bit of typing to the URL? I guess there is a bit of web history behind that somewhere. Anyone care to enlighten me? :)
| 9:11 pm on May 27, 2005 (gmt 0)|
|I guess there is a bit of web history behind that somewhere. |
The "www" was conventionally used to indicate the *web* access point of a server (www.example.com), as opposed to the ftp access point (ftp.example.com), the gopher access point (gopher.example.com), or the access points any of a number of other possible services.
| 5:00 am on May 28, 2005 (gmt 0)|
Or there could be separate servers for the different services. If you use different domain names for each service, you can put them on the same server, or on separate individual servers, or you can group them on servers in any number of ways, and it all looks the same to whoever is accessing those services. When you move a service from one server to another, just change your DNS configuration.
www.example.com (Web service)
mail.example.com (e-mail service)
news.example.com (Usenet newsgroups)
ftp.example.com (FTP service)
ns.example.com (DNS name service)
| 11:40 am on May 28, 2005 (gmt 0)|
I have a site that is indexed by google with and without www. There were a couple of links pointing to the one without www, but I had them all changed, so they now point to the www-version.
Is there any potential danger if I don't do a redirect? Will the other version eventually disappear from the index? At the moment the indexed entry for the other version only shows the title but no description, and the PR is 0.
Reason for asking is that I basically don't know how to do a redirect and it would probably take me ages to work out how to do it (site is on an NT server).
| 12:29 pm on May 28, 2005 (gmt 0)|
I would look at your server config file to see how your server handles the requests. There should be a ServerName entry that uses either the www or the non-www form; if it is set to the www version, Apache issues a 301 for you when it gets a request for the non-www version (assuming, of course, that you are running an Apache server).
Right now Google has been handling domain.com and www.domain.com as two separate sites, so if you are using relative directory/page links you will end up with two duplicated sites in Google's eyes.
| 1:46 pm on May 28, 2005 (gmt 0)|
I've always had the non-www, with no problems at all. Occasionally, a www-version of a page appears from somewhere, because others often add www to the link, but Google doesn't seem to care.
On balance, I think saying "www", which has more syllables than "World Wide Web", merely wastes talking time.
I'm pretty sure it's redundant, like the l in html
| 3:41 pm on May 28, 2005 (gmt 0)|
Thanks Marval & Brian. My host told me that for the NT server I would have to do an asp-redirect (doesn't mean anything to me).
Marval: Do you mean I should change my internal links somehow (full URL?) and that would stop Google from seeing two sites?
| 1:22 am on May 29, 2005 (gmt 0)|
|Do you mean I should change my internal links somehow (full URL?) and that would stop Google from seeing two sites? |
It won't hurt and might help. The terminology involved is "relative" links, meaning those without the http:// at the front of the <a href=""> value, and "absolute" links, meaning those with the full URL in the anchor.
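To make the distinction concrete, here is a small sketch (www.example.com stands in for your own preferred hostname):

```
<!-- Relative link: reuses whatever hostname the visitor or bot arrived on,
     so a crawl that started on example.com stays on example.com -->
<a href="/about.html">About</a>

<!-- Absolute link: always points at the preferred www hostname -->
<a href="http://www.example.com/about.html">About</a>
```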
My own thoughts on this, derived from problems I had in the past, are that if a SE bot comes in on the non-preferred version even once (such as example.com rather than www.example.com) because of an incoming link in the wrong form, it can follow the relative links right through the entire site (after caching links and getting back to them later, etc.) and think there are two versions of every page. It only needs to hit one page on the site at first to cause this confusion; then it merrily indexes all of your pages a second time. Although G, and Y, and all of them should be smart enough to figure things out, it sometimes results in problems. It's best to use absolute internal linking and not have to worry, because any non-www page that gets crawled directs the bots via internal navigation to the www versions of the linked-to pages. Even better is using Apache and taking care of things with .htaccess (with a mod_rewrite), thereby making sure all requests for the wrong URLs are redirected to the right URL versions right off the bat.
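The .htaccess/mod_rewrite approach mentioned above can be sketched like this, assuming mod_rewrite is enabled and www.example.com is a placeholder for your preferred hostname:

```
# Hypothetical .htaccess sketch: 301 any request whose Host header is
# not exactly www.example.com to the same path on www.example.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The [R=301] flag is what makes it a permanent redirect; without it, mod_rewrite defaults to a 302, which is exactly what this thread warns against.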
| 5:30 am on May 29, 2005 (gmt 0)|
Stefan - I agree, as I have experienced (and have had quite a few friends with similar occurrences) the bot picking up the non-www version and indexing the relative links (I also had vanity domains picked up this way when Yahoo dropped Google and reinstated their old index, with some old vanity domains still in there) as dupe pages and eventually as dupe sites with split rankings.
I have just recovered from that - it took almost 5 months to get it all sorted out on one domain - but my rankings have done nothing but climb to new heights since I fixed all of the sites.
The question on Windows servers has been covered a few times here in another forum - [webmasterworld.com...] is one of the messages I found quickly.
| 8:55 am on May 29, 2005 (gmt 0)|
Thanks again! I think I will start with changing my links then. I'm not confident that I would be able to work the redirect scenario out properly. Also, the pages are all in html, not asp, and I have read it might not be a good idea to change from html to asp. Hopefully changing the links will be enough to stop Google from penalising my site...
| 8:43 pm on May 30, 2005 (gmt 0)|
You do need to set up the redirect. It must be a 301 redirect. I would redirect from non-www to www where possible. Use the Xenu LinkSleuth program to check all the links on your site too.
I haven't the time to post a long answer right now; but this question is asked several times per week, and there is much prior art in earlier threads...
| 9:49 pm on May 30, 2005 (gmt 0)|
>> this question is asked several times per week
I'd just like to add that it really doesn't matter for your ranking if you keep the "www." or if you remove it, using only "domain.com". For that matter you could use "zzz.domain.com" if you wanted to.
Use whatever you like best, just pick one and stay with that one. Consistency is key.
Oh, and let me repeat: It must be a 301.
| 10:23 am on May 31, 2005 (gmt 0)|
Well, I have now read a lot about this redirect and to be honest, I don't have the basic knowledge to understand half of it, let alone set it up.
Fact is that nobody seems to ever visit my site through "domain.com", so to come back to the original question of this thread: Can I somehow stop Google (and everyone else) from accessing the site through "domain.com"? I know there are sites that don't load if you don't use the "www".
Is it something the host would have to do?
| 5:48 pm on May 31, 2005 (gmt 0)|
joergnw10, your host can probably set that up for you. However, as www is a subdomain of your domain, it is not really a natural thing to do; plus it will be better to at least offer the lazy people out there an option to reach the right domain instead of indicating that you have an error in your setup.
If you use an Apache server, doing this the right way is less than a one-minute job to set up properly, and it costs you nothing.
Still, being the master of your own domain, you should decide about that, and if you've already made up your mind, I'm sure your host can help.
| 6:43 pm on May 31, 2005 (gmt 0)|
Google just doesn't seem to be good at cleaning up the www vs non-www issue. Even with a redirect in place, the page count does not get corrected.
Would it be possible to 404 anything that comes through on a non-www address? I think google is better at cleaning up pages that no longer exist.
Is this a good idea?
| 7:59 pm on May 31, 2005 (gmt 0)|
Install the 301 redirect. It takes a few months for Google to totally clean up the listings.
I helped a site clean up by making a "fake sitemap" which we put on another site. That sitemap only listed the URLs that we wanted removed (these URLs were the starting point of the redirect) and it took a few weeks for that to happen.
As an aside, two weeks ago Google suddenly relisted all four versions of the URL for every page of the site. The non-www with trailing / is fully indexed with title and description. The other three versions (non-www without trailing /, www with trailing /, and www without trailing /) are all shown as URL-only entries. Also included are some URL-only entries for pages that were removed 3 months ago, and a page that hasn't existed for 18 months.
This happened just as the latest update started. It looks like Google has broken the way they handle 301 redirects, at the same time that they haven't fixed the 302 redirect URL-hijacking problem. Maybe they just need to respider the whole web to collect the new redirect-elsewhere/unique-content/duplicate-content status of every URL in their index. If that is so, then get that 301 redirect installed right away. It cannot harm you, and can only help.
| 8:11 pm on May 31, 2005 (gmt 0)|
Just saw that you are on an NT server (i.e. not Apache). In this case my advice doesn't apply. Sorry about that.
What still applies, though, is that it must be a 301.
AFAIK, on MS servers (IIS, NT) either you can set up the redirect in the Control Panel, or you have to purchase/install an ISAPI filter.
Don't use the Response.Redirect method from ASP. That one returns a 302 server status code, not a 301. A 302 is the root of all evil; it is more or less guaranteed to bring you trouble.
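For completeness, classic ASP can send an explicit 301 by setting the status and Location header by hand instead of calling Response.Redirect. A minimal sketch, with www.example.com as a placeholder for the preferred hostname:

```
<%
' Hypothetical classic-ASP sketch: emit a 301 rather than the 302
' that Response.Redirect produces, then stop processing the page.
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/"
Response.End
%>
```

You would typically put logic like this at the top of a page (or in an include) that checks the requested host before deciding to redirect.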
| 8:35 am on Jun 2, 2005 (gmt 0)|
What Wizard said (and what claus just said).