|Rewrite URLs to an absolute path?|
Is there any way to rewrite a path to an absolute URL rather than a relative one? The reason I need this is that I am rewriting URLs on a subdomain, but my forum is located on the top-level domain and shares the URLs from the subdomain. Since I am rewriting URLs on the subdomain to a relative path (the only way I know how to accomplish the rewrites), those rewritten URLs, although correct on the subdomain, point to the top-level-domain path when I am on the top-level domain, which is incorrect.
An example of my current htaccess code is:
RewriteRule ^(.+)$ wiki/index.php?title=$1 [PT,L,QSA]
What I would like to accomplish is this:
RewriteRule ^www.absolute_url_path_example.com/(.+)$ wiki/index.php?title=$1 [PT,L,QSA]
Odd problem, but my wiki and forum are on two different subdomains, and I have a bridge between the two, so wiki activity is possible on the forum. Links must be absolute, though, so the forum can point to the correct rewritten URLs.
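One thing worth noting: the RewriteRule pattern only ever sees the path portion of the request, never the hostname, so a pattern beginning with www.absolute_url_path_example.com/ can never match. The host has to be tested separately with a RewriteCond against %{HTTP_HOST}. A minimal sketch, assuming wiki.example.com as a placeholder for your actual subdomain:

```apache
RewriteEngine On
# Only apply this rewrite when the request arrived on the wiki subdomain;
# the RewriteRule pattern itself cannot match the hostname.
RewriteCond %{HTTP_HOST} ^wiki\.example\.com$ [NC]
RewriteRule ^(.+)$ wiki/index.php?title=$1 [PT,L,QSA]
```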
For a rewrite, the left side pattern is for the path-part of the URL 'used on the web' and the right side target is the 'internal server folder and file path' of the physical location where the content is actually located.
A rewrite 'associates' an external URL with an internal file location.
The controlling .htaccess file needs to be located in the root of the folder that corresponds to the root of the subdomain as seen from the web. The target filepath needs to specify the full filepath (as seen inside the server) pointing to where the content is located (in terms of internal folder names, not in terms of URLs).
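To illustrate the mapping described above (hostnames and paths are placeholders): a request for http://wiki.example.com/Some_Page would be served internally by the wiki script, with no change visible in the browser's address bar.

```apache
# In the .htaccess at the subdomain's document root:
# external URL (left)  : /Some_Page  as requested on the web
# internal path (right): wiki/index.php?title=Some_Page  inside the server
RewriteRule ^(.+)$ wiki/index.php?title=$1 [PT,L,QSA]
```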
What's the problem?
The problem is this:
This code in an .htaccess file in the subdomain root, RewriteRule ^(.+)$ wiki/index.php?title=$1 [PT,L,QSA], produces URLs that look like this on the subdomain, when moused over:
When I go to the top-level domain (since this wiki is shared on the forum, which is located on the TLD), the URLs are rewritten correctly, but look like this when moused over:
The domain must be www.subdomain.example.com, not www.example.com. I am wondering whether .htaccess can fix something like this, or whether I would need to alter something in the PHP code.
Is that /CorrectlyWrittenURL the same page being accessed from two different URLs, or is that two completely different pages?
If it is the same page, then a 301 redirect would stop the duplicate being indexed, but you must also edit the PHP file to 'make' the correct URLs within the links in the page.
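A sketch of such a redirect, assuming the canonical host is wiki.example.com (a placeholder, as is the bare-domain host being redirected away from):

```apache
# Hypothetical: 301-redirect requests that arrive on the main domain
# to the canonical wiki subdomain, preserving the requested path.
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://wiki.example.com/$1 [R=301,L]
```

This stops the duplicate URL from being indexed, but it does not change the links printed into the pages themselves; that still requires editing the script.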
I am trying to avoid PHP editing. Perhaps there is another alternative. Basically I only want to rewrite wiki URLs, and the URLs are not in a folder, which makes it difficult to find a unique match. But then I realized that wiki URLs do not have any extension such as .html or .php. I was thinking perhaps I could negative-match an extension, then rewrite to the subdomain: for example, if the URL does NOT contain .html, perform the rewrite.
The current code that works without negative match is as follows:
RewriteRule ^(.*)$ [wiki.example.com...] [R=301,L]
Could you point me in the right direction for getting the above logic to work with a negative match on .html?
RewriteRule !^(.*)\.html$ [wiki.example.com...] [R=301,L]
But that doesn't seem to work.
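One way to express that negative match: a negated RewriteRule pattern is legal in Apache, but it yields no backreferences, so the usual idiom is to put the exclusion in a RewriteCond and keep the capture in the rule. A sketch, with the target host written out as a placeholder since the original was truncated:

```apache
# Hypothetical: redirect only requests whose path does NOT end in .html
RewriteCond %{REQUEST_URI} !\.html$
RewriteRule ^(.*)$ http://wiki.example.com/$1 [R=301,L]
```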
Here is where you've gone wrong:
|This code in an htaccess file in the subdomain root, RewriteRule ^(.+)$ wiki/index.php?title=$1 [PT,L,QSA], produces urls that look like this on the subdomain, when moused over: |
As g1smd pointed out above, mod_rewrite does not "produce URLs." It internally rewrites URLs requested from the Web, changing the internal filepath associated with that URL.
To "correct" your URLs, you must produce correct URLs on your pages. So unless you store canonical URLs in your database, you must edit your script.
Once a request arrives at your server (where mod_rewrite can do something with it), it is too late: the URL has already been moused over, clicked on, and used to send an HTTP request to your server. It is now too late to "hide" it.
Now, you can redirect a URL to another URL. But that ends the current HTTP transaction by sending a redirect response to the client, asking it to use a different URL and re-request what it has already asked for. Therefore, the client must make a second request to get the desired content. The result is two HTTP request/response transactions to your server for every page requested. This pollutes your logs, skews your stats, slows down the user experience, and complicates your 'trust' relationship with search engines. Therefore, it's of little help here. Edit your PHP script or your database.