Forum Moderators: phranque


Redirect 301 with multiple pages

Can a range be used


JimmieT

5:23 pm on Oct 15, 2009 (gmt 0)

10+ Year Member



Hi all,
I have this redirect 301 rule:

Redirect 301 /directory/ipb.htm [mysite.com...]

I have been trying to use multiple ".htm" pages to work with this command with no success.

Is there a way to write in the place of "ipb.htm" a range that will include "a" through "z"? Sort of like: ip(a-z).htm (That does not work of course)

Thanks in advance.

Jim

Caterham

5:32 pm on Oct 15, 2009 (gmt 0)

10+ Year Member



See the RedirectMatch [httpd.apache.org] directive.

JimmieT

7:35 pm on Oct 15, 2009 (gmt 0)

10+ Year Member



@Caterham,
Thanks for the quick reply, but I am still too wet behind the ears to grasp the information in the link you provided.
Jim

g1smd

9:07 pm on Oct 15, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'd recommend looking at RewriteRule as that is much more powerful.
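For illustration, a minimal .htaccess sketch of the RewriteRule approach, assuming the same pattern discussed in this thread (ipa.htm through ipz.htm under /directory/) and a hypothetical target of example.com; note that in per-directory .htaccess context the matched path has no leading slash:

```apache
# Sketch only: 301-redirect /directory/ipa.htm .. /directory/ipz.htm
# to the site root. example.com is a placeholder target.
RewriteEngine On
RewriteRule ^directory/ip[a-z]\.htm$ http://example.com/ [R=301,L]
```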

Read the documentation several times, and check previous threads in this forum.

Caterham

12:21 pm on Oct 16, 2009 (gmt 0)

10+ Year Member



You need to find a regular expression matching your needs. If you're looking for ipa, ipb, ipc, etc., it may look like:

RedirectMatch 301 ^/directory/ip[a-z]\.htm$ http://example.com/

JimmieT

2:48 pm on Oct 16, 2009 (gmt 0)

10+ Year Member



Thanks to all that have responded.

I actually figured it out according to the RedirectMatch 301 as per Caterham's last post. Wrote the code and found out it was not what I wanted to do at all. Live and learn.

The problem arose because scrapers must know of a program that stores email addresses in "ipb.htm", which is also a page in a genealogy program I use that does not contain addresses. My site is constantly being scanned for just that page. And although the scrapers get nothing, it just annoyed me. (Guess they got something.)

Since then, I've taken other steps to annoy the scrapers. Just a little more work for me, but well worth it.

Thanks again for the advice, I've certainly learned more.

Jim

jdMorgan

4:52 pm on Oct 16, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If ipb.htm is an 'included' file, not required or intended to be directly-accessible by your Web users, then you could always return a 403-Forbidden response for *any* HTTP request for that file. This would not affect internal operations such as a script reading that file, because a filesystem read is not an HTTP request.
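One common way to do that (a sketch, not from the thread itself; the filename is taken from the discussion above) is a per-file block in .htaccess, using the Apache 2.2-era access-control directives:

```apache
# Sketch: return 403 Forbidden for any HTTP request for ipb.htm.
# Filesystem reads by local scripts are unaffected.
<Files "ipb.htm">
  Order allow,deny
  Deny from all
</Files>
```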

Jim

JimmieT

10:02 pm on Oct 16, 2009 (gmt 0)

10+ Year Member



@jdMorgan

My site is quite basic, static, no scripts, etc. All pages are HTTP requests. Strictly an informational type site, with internal links to its pages, some external links to other sites.

If I could keep that particular page from being indexed, that may be another solution.
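If keeping the page out of the index is the goal, one possible sketch (assuming Apache with mod_headers enabled, and the filename from earlier in the thread) is to send a noindex header for just that file:

```apache
# Sketch: ask search engines not to index this one file.
# Requires mod_headers; "ipb.htm" is the filename discussed above.
<Files "ipb.htm">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>
```

Unlike robots.txt, this still lets crawlers fetch the page while telling them not to index it.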

Right now I've changed the file name and links to it, along with the subdirectory name. So the scrapers get a 404, or a 403 if their IP range has been blocked. Works for me so far.

Thanks for the thought/advice.

Jim