Forum Moderators: phranque

Yet another conditional redirect question

b8caster

5:54 am on Apr 4, 2011 (gmt 0)

10+ Year Member



Hello,
I almost have it figured out, but need the final missing link to make it work right. Here's what I want:

I have a whole bunch of articles with URLs like this:

htt​p:/​/ww​w.domain.​com​/path/​article.p​l?n​um=​129​779​229​1

Where the number value is different for every article.

I want them ALL to redirect to [domain.com...]

So it's many URLs (hundreds of them) redirecting to one single URL.

Here's the code:
RewriteCond %{QUERY_STRING} ^num=[0-9]+$
RewriteRule ^path/article\.pl$ [domain.com...] [R=301,L]

The result? The above URL will redirect to
htt​p:/​/ww​w.domain.​com​/new-articles/?n​um=​129​779​229​1

So how do I knock off the query parameter and value?

g1smd

7:57 am on Apr 4, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



What is the &#8203; repeated multiple times in the URL?

Use example.com to stop the forum auto-linking the URLs.

A question mark after the target URL in the RewriteRule will clear the query string.

If you have done that, then it may be that you have your rules in the wrong order.
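To make the question-mark trick concrete, here is a minimal sketch (the path and hostname are placeholders, not necessarily the poster's real setup):

```apache
# Redirect /path/article.pl?num=NNNN... to a single landing page.
# The trailing "?" on the target URL discards the original query string.
RewriteCond %{QUERY_STRING} ^num=[0-9]+$
RewriteRule ^path/article\.pl$ http://www.example.com/new-article/? [R=301,L]
```

On Apache 2.4 and later, the [QSD] (query string discard) flag is an equivalent alternative to the trailing "?".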

b8caster

2:18 am on Apr 5, 2011 (gmt 0)

10+ Year Member



Weird how that happened. Here's the original post again:

I have a whole bunch of articles with URLs like this:

[domain.com...]

Where the number value is different for every article.

I want them ALL to redirect to [domain.com...]

So it's many URLs (hundreds of them) redirecting to one single URL.

Here's the code:
RewriteCond %{QUERY_STRING} ^num=[0-9]+$
RewriteRule ^path/article\.pl$ [example.com...] [R=301,L]

The result? The above URL will redirect to
http://www.example.com/new-articles/?num=1297792291

So how do I knock off the query parameter and value?

I have this rule first. If that's not the "right" spot in the "right order", then where should it be?

g1smd

6:41 am on Apr 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Use example.com for the example URL to stop the forum auto-linking it.

Put the real code in the RewriteRule, not something in square brackets.

b8caster

12:34 pm on Apr 5, 2011 (gmt 0)

10+ Year Member



RewriteCond %{QUERY_STRING} ^num=[0-9]+$
RewriteRule ^path/article\.pl$ http://www.example.com/new-article/ [R=301,L]

g1smd

8:31 pm on Apr 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Add a question mark to the RewriteRule target URL to clear the query string.

Note that your redirect will work only if the "num=" parameter is the only parameter.

You should improve the rule so that it also redirects when other parameters are present.

RewriteCond %{QUERY_STRING} (^|&)num=[0-9]+(&|$)
RewriteRule ^path/article\.pl$ http://www.example.com/new-article/ [R=301,L]
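Combined with the trailing question mark mentioned earlier, the full rule might look like this sketch (hostname and paths are placeholders):

```apache
# (^|&) and (&|$) anchor the parameter name, so num= matches at the start,
# middle, or end of the query string, but e.g. othernum=123 does not.
# The trailing "?" on the target discards the whole query string.
RewriteCond %{QUERY_STRING} (^|&)num=[0-9]+(&|$)
RewriteRule ^path/article\.pl$ http://www.example.com/new-article/? [R=301,L]
```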

b8caster

11:41 pm on Apr 5, 2011 (gmt 0)

10+ Year Member



Worked! Thank you!

g1smd

12:06 am on Apr 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The end of my example should have read...

new-article/? [R=301,L]

b8caster

1:46 am on Apr 27, 2011 (gmt 0)

10+ Year Member



Here's what I have, which works:

RewriteCond %{QUERY_STRING} ^num=[0-9]+$
RewriteRule ^path/article\.pl$ http://www.example.com/new-article/? [R=301,L]

But my question is, what if the "num" value contains a "/" in it? For example, the code works for something like this:
http://www.example.com/pat/article.pl?num=09380584
but not for
http://www.example.com/pat/article.pl?num=09380584/43
where the value after "/" can be 1-999

How do I fix the code above to accommodate that?

b8caster

1:47 am on Apr 27, 2011 (gmt 0)

10+ Year Member



I meant /path/ in the examples.

b8caster

3:47 pm on Apr 27, 2011 (gmt 0)

10+ Year Member



Have I stumped the band?

b8caster

4:07 am on Apr 29, 2011 (gmt 0)

10+ Year Member



Hello?

g1smd

7:41 am on Apr 29, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Since you posted at 1.47 am, 3.47 am and 4.07 am my time, you might imagine I wasn't online. :)

See the [0-9] bit of the pattern. That lists the characters that are valid for the query string value (the part after the "equals" sign).

lucy24

8:13 am on Apr 29, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



What is the &#8203; repeated multiple times in the URL?

I had to go investigate this because it would have kept me awake all night. It's the decimal character reference for &#8203;, a.k.a. the "zero-width space" (U+200B). They're invisible to the naked eye, but they allow line breaks. Useful in circumstances where a soft hyphen would be misleading (such as when writing out an enormously long URL), but potentially dangerous if you don't know they're there.

b8caster

1:04 pm on Apr 29, 2011 (gmt 0)

10+ Year Member



How should I change this part:
num=[0-9]+$

To accommodate URLs formatted like this:
num=09380584/43

where "43" could be any value from 1 to 999.

Can I use num=[0-9/]+$

Please let me know.

g1smd

4:44 pm on Apr 29, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



See the [0-9] bit of the pattern. That lists the characters that are valid for the query string value (the part after the "equals" sign).

Can I use [0-9/] here?

Yes, but you could also use a more complex pattern that rejects matches where the second part has more than three digits.

([0-9]{5,9}/[0-9]{1,3})


Adjust the {5,9} part to reflect how many digits the first part may have.
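Putting that together, one possible complete rule (going slightly beyond the snippet above by making the slash-and-suffix part optional, so that both the plain and the two-part values redirect):

```apache
# Accept num=NNNNNNNN or num=NNNNNNNN/NNN (first part 5-9 digits,
# second part 1-3 digits), redirect, and drop the query string.
RewriteCond %{QUERY_STRING} ^num=[0-9]{5,9}(/[0-9]{1,3})?$
RewriteRule ^path/article\.pl$ http://www.example.com/new-article/? [R=301,L]
```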

b8caster

5:26 pm on Apr 29, 2011 (gmt 0)

10+ Year Member



So simple. Can't believe I missed that.

Thanks for the response!

g1smd

6:12 pm on Apr 29, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It depends how much you want to "pre-filter" the requests and limit exactly which requests are rewritten.

Requests that don't match the pattern (and don't match any other pattern, and are not serviced by a physical file) will trigger the server's default 404 handler.

I make the pattern as restrictive as possible, rewriting only valid requests.
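As an illustration of the difference (placeholder names again), the two extremes might look like this:

```apache
# Permissive: redirects the URL no matter what the query string contains.
RewriteRule ^path/article\.pl$ http://www.example.com/new-article/? [R=301,L]

# Restrictive: redirects only query strings that look like valid article IDs;
# anything else falls through to later rules or the server's 404 handler.
RewriteCond %{QUERY_STRING} ^num=[0-9]{5,9}(/[0-9]{1,3})?$
RewriteRule ^path/article\.pl$ http://www.example.com/new-article/? [R=301,L]
```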