Forum Moderators: Robert Charlton & goodroi


To Redirect or Not to Redirect?

Moved pages to exclude the ?Parameter


Kelcor

6:26 pm on Feb 22, 2007 (gmt 0)

10+ Year Member



We changed many URLs from a dynamic product.aspx with a ?parameter to a "static" page with no parameter. However, the old product.aspx?Parameter URLs still work; they are just no longer linked to from our website.

Do I worry about a few hundred “all at once” 301 redirects or is this not a big deal? Google has the “No Sneaky Redirects” in the webmaster guidelines. I am pretty sure we fell into a filter once before by doing something similar.

We currently have a robots meta tag (noindex, nofollow) on these pages… Google should drop them and pick up the new ones naturally… Right?
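For reference, a complete robots meta tag of the kind described would look like this (a generic example, not the poster's actual markup):

```html
<!-- Placed in the <head> of each old dynamic page -->
<meta name="robots" content="noindex, nofollow">
```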

sublime1

9:02 pm on Feb 22, 2007 (gmt 0)

10+ Year Member



Hi --

We did a complete site move of several hundred pages last summer. It went flawlessly, and quickly. I posted our experience up here after the fact, and several others reported their positive experiences.

"Sneaky redirects" refers to cases such as a page that returns an HTTP 200 (OK) response containing one kind of content, but then uses a method the bots will not follow, such as a JavaScript redirect, to send the user to a different page. What you're describing is the correct use of 301s.

Just make sure to test, test, test. 301 means "Moved Permanently" and will result in Google (and other search engines) effectively replacing your old URLs with the new ones in their index. If you mess up, you can end up with a big mess that's kind of hard to back out of.
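One way to "test, test, test" is to hit each old URL and check that the raw response is a 301 with the right Location header, without following the redirect. A minimal self-contained sketch (the URLs and the mapping here are hypothetical, stood up on a local server just for the check):

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping from old parameterized URLs to new "static" ones.
REDIRECTS = {"/product.aspx?id=42": "/products/widget-42"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)  # self.path includes the query string
        if target:
            self.send_response(301)        # Moved Permanently
            self.send_header("Location", target)
        else:
            self.send_response(404)        # unmapped URLs simply die
        self.end_headers()

    def log_message(self, *args):          # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Refuse to follow redirects: we want to inspect the raw 301 itself.
class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None

opener = urllib.request.build_opener(NoRedirect())
try:
    opener.open(f"http://127.0.0.1:{port}/product.aspx?id=42")
except urllib.error.HTTPError as e:
    status, location = e.code, e.headers["Location"]

server.shutdown()
print(status, location)   # → 301 /products/widget-42
```

The same check against a live server would just point the opener at the real old URLs and assert on the status and Location of each response.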

So by all means, do the project. Just be careful to do it right :-)

tedster

9:09 pm on Feb 22, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My preference is to redirect only key pages -- those with strong backlinks (so the backlink influence is not lost) or those that are strong site entry pages (so the traffic is not lost). I just let the other URLs return a 404 status and let Google sort through the changes by spidering the new anchor tags.

I've successfully used this approach for very big sites, where placing and maintaining hundreds of thousands of 301s seemed like a crazy idea.

sublime1

9:29 pm on Feb 22, 2007 (gmt 0)

10+ Year Member



Tedster -- I certainly agree that if there were specific efforts needed to redirect each page, your approach would be the right way.

But in our case, our sites are dynamic and fairly consistent, so we were able to either write Apache rewrite rules that mapped an old pattern to a new one, or do a quick database lookup as each request came in to generate the correct new "static" URL, which we returned in the 301 redirect. I am guessing this would be easy in the case noted here, since it looks like it's a dynamic site.
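As a sketch of the pattern-mapping variant (assuming Apache mod_rewrite, and a hypothetical numeric `id` parameter -- adjust to your own URL scheme):

```apache
RewriteEngine On

# Capture the numeric id from the old query string...
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
# ...and issue a 301 to the new "static" path. The trailing "?" in the
# substitution drops the old query string from the redirect target.
RewriteRule ^product\.aspx$ /products/%1? [R=301,L]
```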

There are a couple of downsides, for me, to letting pages just return 404s. It's not a very nice user experience, of course. But it's also hard to separate the real errors from the "intentional" ones. I check my analytics every day, and it's very easy to spot a problem when there are only 20 or 30 unique errors each day. In addition, the search engines seem to remember and re-spider 404s (and even 410s, which mean the page is GONE) for ... years in some cases. Leaving dead pages out there makes things really messy.
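Separating real errors from intentional dead URLs can be as simple as tallying unique 404 paths from the access log. A small sketch, assuming combined-log-format lines (the sample entries below are made up):

```python
import re
from collections import Counter

# Hypothetical access-log lines; in practice, read these from the log file.
LOG_LINES = [
    '1.2.3.4 - - [22/Feb/2007:09:00:00 +0000] "GET /product.aspx?id=7 HTTP/1.1" 404 512',
    '1.2.3.4 - - [22/Feb/2007:09:01:00 +0000] "GET /products/widget-7 HTTP/1.1" 200 4096',
    '5.6.7.8 - - [22/Feb/2007:09:02:00 +0000] "GET /product.aspx?id=7 HTTP/1.1" 404 512',
]

# Pull the request path and the status code out of each line.
pattern = re.compile(r'"(?:GET|HEAD) (\S+) [^"]*" (\d{3})')

not_found = Counter(
    m.group(1)
    for line in LOG_LINES
    if (m := pattern.search(line)) and m.group(2) == "404"
)
print(not_found.most_common())   # → [('/product.aspx?id=7', 2)]
```

Paths that match your old URL pattern are the "intentional" 404s; anything else in the tally is worth a closer look.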

So in my view, if it's easy to do, you may as well redirect everything.