Forum Moderators: phranque
instead of parameter-based URLs like these:
http://www.example.com/index.php?x=about/me/favorites
http://www.example.com/index.php?page=about&category=me&id=favorites
http://www.example.com/index.php?page=favorites.php
I'm a beginner at PHP, and I don't know where to start. I can't find anything on Google to help me, because I don't know what to look for.
I already know how to remove index.php by using something like this:
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ (.*)index\.php(.*)\ HTTP/
RewriteRule ^(.*)index\.php(.*)$ $1$2 [R=301,NC,L]
Beyond that, I have no idea.
I tried to access
http://www.example.com/index.php?page=about&category=me&id=favorites
on my own site (offline, with xampp) by going to
http://www.example.com/index.php/about/me/favorites/
and my stylesheet didn't load. The breadcrumb navigation, the browser window title, the actual content of the "favorites" page, and the navigation bar all changed to reflect my location, but no CSS was applied.
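(A common cause of exactly this symptom, offered as a guess: when the page is served from a deeper-looking URL such as /index.php/about/me/favorites/, a relative stylesheet href resolves against that deeper path and 404s. A root-relative href avoids it; the /css/style.css path here is only an assumed example.)

```html
<!-- Relative: from /index.php/about/me/favorites/ this resolves to
     /index.php/about/me/favorites/css/style.css and fails to load -->
<link rel="stylesheet" type="text/css" href="css/style.css">

<!-- Root-relative: resolves to /css/style.css from any page depth -->
<link rel="stylesheet" type="text/css" href="/css/style.css">
```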
(I can post the code that I'm currently using, if necessary.)
In short, I need a tutorial or a link to a site that explains how to have dynamic includes with URLs like this:
http://www.example.com/about/me/favorites/
Thanks in advance for any assistance.
Solutions:
Remember: the solution of adding rewrite rules to fix this problem has a potentially-serious downside. If you simply remove the extra URL-path information to point these requests back up to the root directory, but do not explicitly check that the URL is canonical and valid, then the new rule creates duplicate content, because the included objects could be linked-to with *any* three-level-deep URL-path, e.g. "/nasty/competitor/landmine/logo.gif". Therefore, I suggest correcting the links on your pages instead.
Jim
[edit] Ficksed speling errers [/edit]
[edited by: jdMorgan at 6:07 pm (utc) on Sep. 29, 2008]
1. Redirect parameter-based URL requests to the folder-based URL format; force www into the URL at the same time for these.
2. Strip index filenames off URL requests with another 301 redirect; force www at the same time for these.
3. Redirect all remaining non-www requests to their www equivalents.
4. Finally, do your rewrite to connect the requested URL to the dynamic internal filepath.
5. Make sure that all of the links on the page now point to the correct version of the URL.
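Putting those steps together, a first sketch in .htaccess might look something like this (a sketch only: example.com, the www preference, and the page/category/id parameter names are assumptions taken from the example URLs earlier in the thread; test before deploying):

```apache
Options +FollowSymLinks
RewriteEngine On

# 1) 301-redirect old parameter-based URLs to the folder-based format,
#    forcing www at the same time. THE_REQUEST is tested (not QUERY_STRING)
#    so the internal rewrite in step 4 cannot cause a redirect loop.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php\?page=([^&]+)&category=([^&]+)&id=([^&\ ]+)\ HTTP/
RewriteRule ^index\.php$ http://www.example.com/%1/%2/%3/? [R=301,L]

# 2) Strip the index filename off direct requests, forcing www as well
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[?\ ]
RewriteRule ^index\.php$ http://www.example.com/ [R=301,L]

# 3) 301-redirect everything else that arrives without www
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# 4) Finally, internally rewrite the friendly URL to the real script
#    (no redirect), skipping requests for files/directories that exist
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([^/]+)/([^/]+)/([^/]+)/$ /index.php?page=$1&category=$2&id=$3 [L]
```

The !-f/!-d conditions in step 4 also limit the duplicate-content exposure Jim warns about, since real objects such as stylesheets and images are served directly instead of being rewritten.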
I posted an example for two variables yesterday. It's not a great leap to modify that for three.
(I'd check it right now, but I do my development on another laptop, which isn't with me at the moment.)
So if that fixes that problem, then it's a simple process of using .htaccess to remove the index.php, and redirect the old URLs to the new URLs, right?
Thanks for your help...I'm always overlooking the most obvious cause of the problem. :)
g1smd:
I actually don't like www (I just used it in my example URLs), so can I force no-www instead and have it still work right? I also read your example earlier today and realized that I'd need to do something so that the old URLs wouldn't be "dead", so thanks for yesterday's post. :)
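For what it's worth, if no-www is the final choice, the host-canonicalization step can simply be inverted; a minimal sketch, with example.com assumed:

```apache
# Force the non-www hostname: 301-redirect any www request to the bare domain
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

The redirect targets in any other rules would then need the same non-www form, so visitors aren't bounced back and forth between hostnames.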
A useful check is the Google search
site:domain.com -inurl:www
which lists indexed non-www pages. You can't run that check if the URLs you use already don't have the www in them.
Even if the site uses the www, there are no issues whatsoever in using "domain.com" in your branding and in adverts.
For example, we all refer to Google as google.com but look what happens if you type google.com into your web browser. Yes, it redirects to www.google.com.
Also, since many people insist on pronouncing URL as "Earl" in spoken conversations, it's probably good not to confuse the nobility (Earl) with the first Russian cosmonaut (Yuri)... :)
Your "remove index.php" code is rather awful -- probably copied from somewhere else, as most examples here at WebmasterWorld use much-more-specific (and efficient) regular-expression patterns. A good rule of thumb for detecting inefficient and potentially-problematic regex patterns is this: you should almost never use two ".*" subpatterns in the same pattern, and in general, you should only use the ".*" pattern when no other pattern will work.
The ".*" pattern is easy to understand, but it is greedy, promiscuous, and ambiguous; using more than one in a pattern forces the pattern-matching engine to try dozens, hundreds, thousands, tens of thousands, or even more "trial fits" while attempting to best-match the requested URL-path against the various "pieces" of the multiple-".*" pattern. On a high-traffic server with multiple ".*" subpatterns in several RewriteRule patterns and long URLs, this can have a very bad effect on server performance, and may in fact be the reason that many sites are forced to upgrade from shared hosting to dedicated or VPS hosting just to maintain acceptable site performance...
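To make the ambiguity concrete, here is a small illustrative sketch in Python (whose regex engine backtracks much like Apache's PCRE): with two greedy ".*" groups around "index.php", the first group silently swallows everything up to the *last* occurrence of the literal, so a path containing "index.php" twice matches in a way few people expect.

```python
import re

# The kind of pattern criticized above: two greedy .* groups around "index.php"
pattern = re.compile(r'^(.*)index\.php(.*)$')

# A URL-path in which "index.php" appears twice
m = pattern.match('/a/index.php/b/index.php')

# Greedy matching: group 1 backtracks only as far as the LAST "index.php",
# so the earlier occurrence survives inside group 1
print(m.group(1))  # '/a/index.php/b/'
print(m.group(2))  # ''
```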
Anyway, search this forum for 'Redirect index to "/" rewritecond rewriterule' to find examples of the same code with more-efficient (more-selective) patterns, and avoid the ".*" and ".+" patterns whenever any other pattern will do.
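For comparison, a more-selective version of the index-stripping redirect in the spirit Jim describes (a sketch, with www.example.com assumed) replaces each ".*" with character-class subpatterns that cannot overlap ambiguously:

```apache
# Externally 301-redirect any URL ending in "index.php" to the same URL
# without it; THE_REQUEST is tested so that internal rewrites TO index.php
# (from a later RewriteRule) cannot trigger a redirect loop.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^?\ ]*/)?index\.php[?\ ]
RewriteRule ^(([^/]+/)*)index\.php$ http://www.example.com/$1 [R=301,L]
```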
Jim
Everything is working perfectly now.
g1smd, I hadn't thought about that. I'll reconsider.
Thanks again to both of you. I still can't believe the solution was as simple as changing <link ....>.