Forum Moderators: phranque


SSL on 3 pages, non-ssl on the rest


andy7t

11:07 am on Jul 7, 2009 (gmt 0)

10+ Year Member



Hi,
I'm getting really stuck with htaccess and SSL.

What I'm trying to do is:

Make sure anybody visiting checkout.cgi, checkout2.cgi, or checkout3.cgi is using SSL.

I also want to make sure anybody visiting any other page is NOT using SSL.

I've tried this, but it gives a 500 Internal Server Error.

RewriteEngine On
RewriteCond %{HTTPS} != on
RewriteCond %{REQUEST_URI} ^checkout.cgi$
RewriteRule ^(.*)$ https://www.example.co.uk/$1 [L,R=301]

RewriteEngine On
RewriteCond %{HTTPS} != off
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [L,R=301]

Any help would be very much appreciated.
Thanks

[edited by: jdMorgan at 3:45 pm (utc) on July 7, 2009]
[edit reason] example.co.uk [/edit]

jdMorgan

3:44 pm on Jul 7, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What do you see in your server error log? (That info can be quite helpful.)

Jim

jdMorgan

4:00 pm on Jul 7, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The first line below may be required to enable mod_rewrite on your server and to disable content negotiation. Content negotiation can interfere with mod_rewrite, and should be disabled if your site does not depend on it.

The other changes are to use the server variable "SERVER_PORT" instead of the mod_ssl variable "HTTPS", and to prevent an "infinite" redirection loop: your original second rule did not test for "NOT checkout.cgi", and the RewriteCond pattern in your first rule could never match, because REQUEST_URI always begins with a slash.


Options +FollowSymLinks -MultiViews
RewriteEngine on
#
RewriteCond %{SERVER_PORT} !=443
RewriteCond $1 =checkout.cgi
RewriteRule ^(.*)$ https://www.example.co.uk/$1 [R=301,L]
#
RewriteCond %{SERVER_PORT} =443
RewriteCond $1 !=checkout.cgi
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]
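Note that the rules above test only for checkout.cgi; to cover checkout2.cgi and checkout3.cgi as well, you could change the lexical "=" comparisons to anchored regular-expression patterns, something like this (untested):

Options +FollowSymLinks -MultiViews
RewriteEngine on
#
# Redirect the three checkout scripts to HTTPS if requested over plain HTTP
RewriteCond %{SERVER_PORT} !=443
RewriteCond $1 ^checkout[23]?\.cgi$
RewriteRule ^(.*)$ https://www.example.co.uk/$1 [R=301,L]
#
# Redirect everything else back to HTTP if requested over HTTPS
RewriteCond %{SERVER_PORT} =443
RewriteCond $1 !^checkout[23]?\.cgi$
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]

The character class [23]? optionally matches a "2" or "3" after "checkout", so the one pattern covers all three scripts.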

Before deploying this code, make sure that all on-site links use the correct http/https protocol. Adding these redirects before correcting the on-site links will result in slow site performance and polluted log/stats files, because every client request that follows a 'bad' link will trigger an unnecessary redirect. Note that you'll get more than one redirect per click, since requests for objects included by the 'page' -- images, stylesheets, and external JavaScript files -- will also be redirected. These rules should be considered a 'cleanup expedient' for incorrect search engine listings, not a cure for existing incorrect links on the site.

Jim