|URL Rewrite without 301 Redirect, Alternative Solutions|
Looking for alternative solutions to a URL rewrite without 301 redirect
We've just done a URL rewrite from dynamic to static URLs. Our website is a typical e-commerce site which requires registration (Pardon my rookie language, I'm not actually a developer but more of a content management/optimization analyst for this website).
The links with the new URL structure now appear on the home page of our website. However, when a user registers or logs into the user interface, they can still see the old URL structures.
I'm still waiting for an explanation as to why the developers haven't implemented 301 redirects as instructed. But it's been quite a while already, and I need an alternative solution or I'll be facing duplicate content issues on several pages of the website.
I was thinking of blocking search engine spiders from crawling the old URLs via robots.txt. But I'm not sure this would be a good cure for the situation.
Can you please advise? Thank you.
[edited by: jdMorgan at 1:01 pm (utc) on Oct. 21, 2009]
[edit reason] example.com [/edit]
Requests for the old URLs should indeed be redirected to the new URLs. That redirect must be a 301 redirect.
However this bit... "when a user registers or logs into the user interface they can still see the old URL structures" ...points to a much bigger problem. You must also LINK to those new URLs from the pages of your site. It is links that define URLs, so if you want the user to see the new URLs, then those are what you must have in your links.
The redirect is also required to 'make good' any requests for old URLs coming from links on other sites, stored in user bookmarks, or already indexed and displayed by search engines.
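To make that concrete, here's a minimal sketch of such a 301 redirect in Apache .htaccess terms. The URL patterns are purely hypothetical (an old dynamic URL like /product.php?id=123 mapping to a new static URL like /product/123); your real patterns will differ:

```apache
# Hypothetical sketch only; substitute your site's actual URL patterns.
RewriteEngine On
# Match old dynamic requests such as /product.php?id=123 ...
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
# ... and 301-redirect them to the new static form /product/123.
# The trailing "?" drops the old query string from the new URL.
RewriteRule ^product\.php$ http://www.example.com/product/%1? [R=301,L]
```

If internal rewrites back to the old script are also in play, this rule needs extra care to avoid a redirect loop, as discussed further down in this thread.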
Exactly my point. I totally agree with your points @g1smd. I'm still waiting for our developers to explain why they can't implement the redirects.
So I guess my only solution is the 301 redirect and there's no workaround to this.
Thanks for your input, though...
If you cannot link to the new URLs from within your own pages, then the entire plan should be scrapped. It is 'really bad news' to link to a URL that always returns a 301 redirect; this requires the user's browser to issue two consecutive requests per object (a first for the old URL, followed by a second as a result of the server's redirect response to the first request). This makes a mess of your logs and stats (the number of requests per object doubles), slows down the user experience of your site, and speaks of 'low technical quality' to search engines.
Here's the plan:
1) Internally rewrite new URLs to the correct internal filepath (likely the pre-existing filepath, no longer seen as a URL)
2) Link exclusively to new URLs from within your own pages (likely requires script change or output filter)
3) Request inbound link updates from your most important linking partners
4) Only when the above three steps are complete, externally redirect direct client requests for old URLs to new URLs (to recover traffic and ranking factors from pre-existing inbound links, bookmarks, and type-ins from offline media). This function requires care: Note that only client requests should be redirected; internal requests occurring as a result of the internal rewrite in step 1 must not be redirected, or you get an 'infinite' loop.
Note that internal rewrites and external redirects are totally different functions. Don't allow anyone to work on this project who uses these terms interchangeably... Disaster will likely ensue. Since revenue is affected, this must be treated as "one chance to get it all completely right," so test, test, test.
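To illustrate steps 1 and 4, here is a rough, untested sketch in Apache mod_rewrite terms, again using the purely hypothetical pattern of an old dynamic URL /product.php?id=123 and a new static URL /product/123 (adapt to your own URL scheme, and test before deploying):

```apache
RewriteEngine On

# Step 4: Externally 301-redirect only *client* requests for the old URL.
# Testing THE_REQUEST (the literal request line the browser sent) ensures
# that internally-rewritten requests are NOT redirected again, which would
# otherwise cause the 'infinite' loop described above.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /product\.php\?id=([0-9]+)\ HTTP/
RewriteRule ^product\.php$ http://www.example.com/product/%1? [R=301,L]

# Step 1: Internally rewrite the new static URL to the pre-existing script.
# No [R] flag, so the browser never sees this; the address bar keeps
# showing /product/123.
RewriteRule ^product/([0-9]+)$ /product.php?id=$1 [L]
```

After deploying, a quick check with `curl -I` should show the old URL answering 301 with a Location header pointing at the new URL, and the new URL answering 200 directly with no redirect.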
[edited by: jdMorgan at 1:45 pm (utc) on Oct. 21, 2009]
Your answer is the best I've seen so far in all my asking around among developers. Although this is the first time I've learned the difference between internal rewrites and external redirects, this is really worth looking into and testing out.
Again thanks for sharing your expertise. Appreciate it!