|Help with URL re-writing - Mediawiki|
| 1:03 pm on Dec 8, 2008 (gmt 0)|
I am in the process of setting up a wiki on my site using MediaWiki. I am trying to set up short URLs using .htaccess. I have the following:
RewriteRule ^wiki/(.*)$ w/index.php?title=$1 [PT,L,QSA]
RewriteRule ^wiki/*$ wiki/ [L,QSA]
RewriteRule ^/*$ wiki/ [L,QSA]
MediaWiki is installed in the directory /w. The above works perfectly, apart from the fact that the URL www.mysite.com redirects users to the main page of the wiki. How do I correct this? Someone please help; I don't know much about URL rewriting. I basically want it to act like it does now, without this problem.
| 3:53 pm on Dec 8, 2008 (gmt 0)|
Your second rule will never execute: the RewriteRule pattern in your first rule matches anything that the pattern in the second rule could match, so every such request is handled by the first rule before the second is ever reached.
The third rule is also suspect. URL-paths 'seen' by RewriteRule in .htaccess never start with a slash, so the conventional pattern to match a request for example.com/ is simply "^$". Your pattern ^/*$ happens to match the root anyway -- "*" allows zero slashes -- and that is what sends requests for the bare domain to the wiki's main page.
Put your rules in order from most-specific to least-specific -- Special cases need to go first, or you must add exceptions to your 'early' rules by using RewriteConds.
Also, you don't need to specify [QSA] if the RewriteRule's substitution (new) URL or filepath does not contain a "?". The default behavior of mod_rewrite is to pass query strings through unchanged.
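To illustrate, putting those points together -- and assuming, as in your post, that MediaWiki is installed in /w, the wiki is served at /wiki/, and you do still want the bare domain to show the wiki -- the whole ruleset could be reduced to something like:

# Map /wiki/Title to the MediaWiki script; QSA is kept because the
# substitution contains a "?"
RewriteRule ^wiki/(.*)$ w/index.php?title=$1 [PT,L,QSA]
# Send a request for the bare domain (empty URL-path) to the wiki
RewriteRule ^$ wiki/ [L]

Your old second rule is gone entirely -- "(.*)" in the first rule already matches an empty title -- and the root rule uses "^$" instead of "^/*$". If you do not want the bare domain rewritten to the wiki, simply drop the second rule.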
| 4:27 pm on Dec 8, 2008 (gmt 0)|
Thanks for the reply. I didn't really understand much of what you said, as I don't really have any experience with URL rewriting. However, I have got it to work.
I just commented this line out:
##RewriteRule ^/*$ wiki/ [L,QSA]
All is working as it should now.
| 11:42 pm on Dec 8, 2008 (gmt 0)|
It would be a good idea to read it again, as there were several valid points which your "solution" does not address.
| 12:55 am on Dec 9, 2008 (gmt 0)|
Well, it's working how I need it to, so that will do me. It's taking me all my time getting to grips with MediaWiki.
| 4:49 pm on Dec 9, 2008 (gmt 0)|
Getting to know your server and how to configure it before layering another level of complexity on top of it would be a *really* good idea. Do not build your house on swampy ground, else it soon comes crashing down...
Being practical, you may not feel that you have time for this right now. But do come back to it -- and soon. Otherwise, you may run into serious server-operation and ranking problems caused by some "hole" in your server configuration. The Google forum is full of threads from Webmasters who did not build a proper foundation for their sites and are now suffering the consequences.
The idea that using a pre-built forum, blog, or wiki means that you don't have to bother with "all that technical stuff" is a dangerous myth. In fact, some of these packages require you to deal with even more server-level issues because their authors ignored those issues and left their users open to major problems.
Two examples are lack of URL canonicalization leading to duplicate content and the resulting ranking problems, and the widely-known problem of security exploits in un-patched popular software.
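As an example of the first problem, a common canonicalization fix is to 301-redirect every hostname variant to a single canonical host with mod_rewrite. A sketch, to be adapted rather than pasted -- www.example.com here stands in for whatever canonical hostname you choose:

# Redirect any request whose Host header is not the canonical hostname
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Placed before the other rules, this ensures that example.com/wiki/Foo and www.example.com/wiki/Foo are not indexed as two separate pages.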