Forum Moderators: Robert Charlton & goodroi
Should I be redirecting incorrectly cased URLs to a lowercase URL?
If you have proper redirection in place, then whether you're using PascalCasing in the domain itself and/or in your file naming conventions, everything should be just fine.
Let's drop the "file naming" aspect as that is where all of the technical issues come into play. Doing www.webmasterworld.com or www.WebmasterWorld.com doesn't matter. Domain names are not case sensitive. File naming is.
Not mentioning any names, but 'someone' in our office recently made some changes in our back end CMS and, as a result, around 30% of the URLs on our site have changed with regard to casing. For example, www.domain.com/page.htm has changed to www.domain.com/Page.htm.
Both of these URLs work but I'm wondering what the best way to handle the situation is as most of the URLs in the Google index are the old versions and Google could possibly see all of these pages now as being duplicates.
You are correct. Depending on your platform, and whether or not you have the capability to do so, you'll need to make sure that one case or the other is "forced". The recommendation would be lowercase for file names, since those are case sensitive. You can force PascalCasing for your file naming conventions, but I do believe you'll add a "challenging layer" to your indexing process.
Not mentioning any names, but 'someone' in our office recently made some changes in our back end CMS and, as a result, around 30% of the URLs on our site have changed with regard to casing.
If those are getting indexed right now, you'll need to make sure that a 301 is being returned once you force the case.
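On Apache, one common way to force lowercase with a 301 is mod_rewrite's internal `tolower` map. A minimal sketch (note that RewriteMap only works in the server or virtual host config, not in .htaccess):

```apache
# Define a lowercasing map (server/vhost config only;
# RewriteMap is not available in .htaccess)
RewriteMap lc int:tolower

RewriteEngine On
# If the requested path contains any uppercase letter...
RewriteCond %{REQUEST_URI} [A-Z]
# ...301-redirect to the all-lowercase equivalent
RewriteRule (.*) ${lc:$1} [R=301,L]
```

With this in place, a request for /Page.htm returns a 301 to /page.htm, so the old lowercase URLs already in Google's index consolidate onto one version.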
Due to my legacy webserver, I wasn't able to use a mod_rewrite rule to fix the problem, and the only way I could address it was via robots.txt Disallow rules. That is, I specifically disallow all of the lowercase URL variants on the affected sites.
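The robots.txt workaround described above would look something like this (the paths here are hypothetical; Disallow matching is case sensitive, which is what makes this possible when the canonical URLs are mixed case):

```
# robots.txt — block crawling of the lowercase duplicates;
# the PascalCase versions (/Page.htm, /About.htm) remain crawlable
User-agent: *
Disallow: /page.htm
Disallow: /about.htm
```

Keep in mind this only stops crawling of the duplicates; unlike a 301, it doesn't consolidate any link signals onto the canonical version.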
Failure to control it caused me duplicate content headaches, which I now have under control.
But, if I had to do it all over again - I would have stuck with an all lowercase URL naming convention.