Forum Moderators: Robert Charlton & goodroi
I have just signed up to this forum today after weeks of reading posts, and I find this site an invaluable tool, so thanks to all of those who post on here.
The retail company I work for has set up an ISAPI filter on one of our development servers to achieve friendly URLs.
The development website has been set up, and when browsing the dev site all looks well; old URLs such as
www.mysite.com/productpage.aspx?id=123456&itemcode=ABCDE are now showing as
www.mysite.com/MainDepartmentName/SubDeptName/ProductName/
So all looks great.
We want to load this onto our live servers, but before we do I need to ensure that we fully and thoroughly test the implementation.
My worries are mainly around Google and how it will view the website, as we don't want to negatively affect our current ranking and get penalised.
Can anyone shed any light on what I should be focusing on in terms of testing the implementation? So far I have written down:
1) Ensure that all links on the website have been converted to static URLs, to avoid having any duplicate pages.
- Add old dynamic URLs to the robots exclusion file to stop search engines from spidering the dynamic URLs, thus ensuring that they only pick up the friendly URLs?
2) Check which HTTP status code is returned on a redirect, to ensure the correct one comes back; ideally we want to be seeing 301 redirects?
3)What happens at each level of the URI? What server responses and content are being returned to the visitor? Can I interject something into the URI and have it resolve (return a 200 status)?
- Add old dynamic URLs to the robots exclusion file to stop search engines from spidering the dynamic URLs, thus ensuring that they only pick up the friendly URLs?
You should not add old dynamic URLs to robots.txt. Instead, you should issue a 301 permanent redirect to the friendly URL whenever an old dynamic URL is requested.
If you exclude the old dynamic URLs via robots.txt, then Google will never see the 301 redirect from the old URL to the new one, and you will not pass on the link juice from your old URL to the new one.
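To make that concrete, here's a minimal sketch of the redirect lookup in Python, for illustration only — on IIS this logic would live in the ISAPI filter itself. The mapping table and product ID below are made-up examples, not the real catalogue:

```python
# Minimal sketch of the old-URL -> friendly-URL redirect lookup.
# The mapping table and IDs are hypothetical examples.
from urllib.parse import urlparse, parse_qs

# Hypothetical mapping from old dynamic product IDs to friendly paths.
FRIENDLY_PATHS = {
    "123456": "/MainDepartmentName/SubDeptName/ProductName/",
}

def redirect_for(url):
    """Return (301, friendly_path) for a known dynamic URL, else None."""
    parsed = urlparse(url)
    if parsed.path.lower() == "/productpage.aspx":
        product_id = parse_qs(parsed.query).get("id", [None])[0]
        friendly = FRIENDLY_PATHS.get(product_id)
        if friendly:
            return 301, friendly
    return None  # unknown URLs fall through (and should 404 if they don't exist)
```

The point is simply that the old URL stays reachable and answers with a 301, so the spider follows it and transfers the old page's equity to the new location.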
Also, I am not sure I would personally go for a "big bang" change across the whole site. I would pick a subset of URLs, rewrite/redirect those, follow what happens to rankings, and then decide a strategy for moving forward.
Yep, use an online header checker to make sure your 301s are working properly.
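If you'd rather script it than use an online checker, here's a rough sketch using Python's standard library; the host and path are placeholders, not the real site:

```python
# Rough sketch of a redirect header check. Host and path are placeholders.
import http.client

def fetch_status(host, path):
    """Request `path` without following redirects; return (status, Location header)."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

def is_permanent_redirect(status):
    # Only a 301 signals a permanent move; a 302 won't pass equity the same way.
    return status == 301

# Example usage (replace with a real old dynamic URL on the dev server):
# status, location = fetch_status("www.example.com", "/productpage.aspx?id=123456")
# print(status, location)  # expect 301 and the friendly URL in Location
```

Run it against a batch of old dynamic URLs and flag anything that isn't a 301 with the expected friendly URL in the Location header.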
What happens at each level of the URI? What server responses and content are being returned to the visitor? Can I interject something into the URI and have it resolve (return a 200 status)?
I don't understand. What is "each level of the URI"?
You should be sending a 404 for pages that do not exist. You can set up folder security so people cannot navigate to a folder.
Cheers
Sometimes scripted rewrites end up allowing a spurious extra directory to be added to the full URL and still resolve; in other words, these two:
example.com/department/subdepartment/product/
...and
example.com/department/subdepartment/123/product/
would both resolve to the same content. It's more common to see rewrites keying off a number in the URL, where the "keyword" segment can be any old typo:
example.com/department/subdepartment/123/
...and
example.com/department/any-old-garbage/123/
SunnySeptember, you might also check how your scheme handles incorrectly doubled forward slashes, i.e.
example.com/department//subdepartment/123/product/
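A quick way to test for these failure modes is to generate the malformed variants for each friendly URL and confirm that each one returns a 404 rather than a 200 with duplicate content. A sketch (the canonical path here is just an example, and it assumes URLs of at least three segments):

```python
# Sketch of the malformed-URL probes described above: extra segment,
# garbage keyword, and doubled slash. Paths are illustrative examples.
def url_variants(canonical):
    """Build variants of a friendly URL that should return 404, not 200."""
    parts = canonical.strip("/").split("/")
    variants = []
    # 1) spurious extra segment inserted before the last segment
    variants.append("/" + "/".join(parts[:-1] + ["123"] + parts[-1:]) + "/")
    # 2) middle "keyword" segment replaced with garbage
    variants.append("/" + parts[0] + "/any-old-garbage/" + "/".join(parts[2:]) + "/")
    # 3) incorrectly doubled forward slash
    variants.append("/" + parts[0] + "//" + "/".join(parts[1:]) + "/")
    return variants
```

Each generated variant could then be requested with a header checker and asserted to come back as a 404, not a 200.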
In terms of not going for the big bang approach of changing all URLs on the website to friendly ones: wouldn't a search engine spider successfully index both HTTP 200 and 301 responses, and for the latter permanently record the new URL location for the web resource and migrate the previous resource's PageRank and history?
So as long as I can ensure that 301 redirects are occurring for pages which now have friendly URLs, my PageRank and link equity should be preserved, and thus there is no risk to how Google ranks our site?
On the other hand, you could do 10% first and make sure everything works well.
Tedster may have experience with this though. Maybe rankings/traffic will drop for a period and then resume?
Still, because stuff does happen, I often recommend a small "war chest" of funds, held in reserve for possible PPC to boost temporarily lost traffic on keywords that are critical for revenue.