Forum Moderators: phranque
I have the following .htaccess file:
RewriteEngine On
RewriteRule ^.*$ index.php [NC,L]
It works fine on localhost: the server rewrites all URLs to index.php.
BUT it does not work on another server; there it returns a 404 error
for every URL except index.php itself, and does not rewrite the other URLs to that address.
What could be the reason for such behaviour?
Thanks.
However, what it does do is rewrite all URL requests, including those for images, CSS files, robots.txt, etc., and feed those requests to the script. That is very dangerous, because the script cannot process most of those requests. It also allows infinite duplicate content.
If you want a redirect, the target should be the URL "www.example.com/" and not a named index file. Such a redirect should include the domain name and the R=301 flag. It should fire only for very specific URL requests (e.g. only those ending in .html), should not fire for images, CSS files, robots.txt, etc., and should apply only to direct client requests, never to requests resulting from an internal rewrite.
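A minimal sketch of such an external redirect, assuming the canonical hostname is www.example.com (substitute your own). Testing THE_REQUEST, which holds the original client request line and is untouched by internal rewriting, is one common way to limit the rule to direct client requests:

```apache
RewriteEngine On
# Redirect only direct client requests for .html URLs to the home page.
# THE_REQUEST is the raw request line, so internally rewritten
# requests never match this condition.
RewriteCond %{THE_REQUEST} \ /[^\ ]+\.html\ HTTP/
RewriteRule \.html$ http://www.example.com/ [R=301,L]
```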
You likely need to set the FollowSymLinks or SymLinksIfOwnerMatch option on the server.
Also, you need to exclude index.php itself from being rewritten, and you should very likely also exclude several 'standard' files and media filetypes from being rewritten to your script. The most efficient way is to list them explicitly:
Options +FollowSymLinks -MultiViews
RewriteEngine on
#
RewriteCond $1 !^(index\.php|robots\.txt|sitemap\.xml)$
RewriteCond $1 !\.(gif|jpe?g|png|ico|css|js)
RewriteRule ^(.*)$ index.php [NC,L]
[added] I started writing this reply before you posted the above. While you've taken care of the "index.php" looping problem, do think about whether you really want your script to handle requests for images, CSS, JS, robots.txt, sitemap.xml, etc. Usually these requests are not handled by scripts, because it is easier and more efficient to bypass the script for 'real file' requests, or, put another way, for non-HTML-page requests. [/added]
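An alternative to listing every excluded file and extension, at some cost in per-request efficiency, is to let mod_rewrite skip any request that resolves to an existing file or directory. A minimal sketch:

```apache
Options +FollowSymLinks -MultiViews
RewriteEngine on
# Pass requests for existing files and directories through untouched;
# rewrite everything else to the script.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [L]
```

Note that the -f and -d checks cost a filesystem lookup on every request, which is why an explicit list of exclusions is the more efficient approach.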
Jim