This is my robots.txt file. Could you tell me if I am preventing MediaWiki from providing a "404 Not Found"?
User-agent: *
Disallow: /index.php
Disallow: /w/
Disallow: /Category:*
Disallow: /Category_talk:*
Disallow: /Extension:*
Disallow: /Extension_talk:*
Disallow: /File:*
Disallow: /File_talk:*
Disallow: /Game*/
Disallow: /Image:*
Disallow: /Image_talk:*
Disallow: /Help:*
Disallow: /Help_talk:*
Disallow: /Manual:*
Disallow: /Manual_talk:*
Disallow: /Media:*
Disallow: /MediaWiki:*
Disallow: /MediaWiki_talk:*
Disallow: /Project:*
Disallow: /Project_talk:*
Disallow: /Special
Disallow: /Special:*
Disallow: /Talk:*
Disallow: /Template:*
Disallow: /Template_talk:*
Disallow: /User:*
Disallow: /User_talk:*
User-agent: ia_archiver
Disallow: /
Allow: /Special:Contact
Sitemap: http://example.com/sitemap.xml
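One thing worth knowing about those rules: in the original robots.txt convention, Disallow values are plain path prefixes, so the trailing * is unnecessary (Googlebot does support * as a wildcard, but simple parsers ignore it). As a rough sketch, you can see the difference with Python's standard robotparser, which does prefix matching only. The URLs below are hypothetical examples against a trimmed copy of the rules above:

```python
from urllib import robotparser

# A trimmed copy of the robots.txt rules from the post above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /index.php
Disallow: /w/
Disallow: /Special:*
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Plain prefix rules match as expected:
print(rp.can_fetch("*", "http://example.com/index.php?title=Foo"))  # False
print(rp.can_fetch("*", "http://example.com/w/images/a.png"))       # False

# A page not covered by any rule is allowed:
print(rp.can_fetch("*", "http://example.com/Main_Page"))            # True

# But robotparser treats "/Special:*" literally (no wildcard support),
# so this is NOT blocked here, even though Googlebot would block it:
print(rp.can_fetch("*", "http://example.com/Special:Contact"))      # True
```

In other words, "Disallow: /Special:" (no asterisk) would block the same URLs for every crawler, wildcard-aware or not.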
This is my .htaccess file:
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)$ index.php?title=$1 [L,QSA]
Options +FollowSymlinks
RewriteEngine on
# Link for the Sitemap
RewriteRule ^sitemap(.*)\.xml$ sitemap.php?page=$1 [L,NC]
RewriteCond %{HTTP_REFERER} !^http://example.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://example.com$ [NC]
RewriteRule .*\.(jpg|jpeg|gif|png|bmp|mp3)$ http://example.com/Hotlink_Protection [R,NC]
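For what it's worth, here is a rough Python sketch of what those last three hotlink-protection lines do (this is just to illustrate the logic, not how Apache actually evaluates it; the example paths and referers are made up). Note one side effect of the two RewriteConds: a request with an empty Referer header fails both negated patterns, so it gets redirected too, which can block visitors whose browsers or proxies strip the Referer:

```python
import re

# Referer must be example.com itself or any path under it
# (combines the two negated RewriteCond patterns above).
SELF_REFERER = re.compile(r"^http://example\.com(/.*)?$", re.IGNORECASE)

# Same extension list as the RewriteRule, case-insensitive like [NC].
IMAGE_EXT = re.compile(r".*\.(jpg|jpeg|gif|png|bmp|mp3)$", re.IGNORECASE)

def is_hotlink(path: str, referer: str) -> bool:
    """True if the request would be redirected to /Hotlink_Protection."""
    if not IMAGE_EXT.match(path):
        return False  # rule only applies to the listed media extensions
    # Redirect unless the referer matches the site itself; an empty
    # referer also fails the match, so it is blocked as well.
    return not SELF_REFERER.match(referer)

print(is_hotlink("/images/logo.png", "http://evil.example/page"))     # True
print(is_hotlink("/images/logo.png", "http://example.com/Main_Page")) # False
print(is_hotlink("/index.php?title=Foo", "http://evil.example/"))     # False
print(is_hotlink("/images/logo.png", ""))                             # True (empty referer)
```

If blocking empty referers is not what you want, the usual fix is an extra condition like `RewriteCond %{HTTP_REFERER} !^$`.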
If I'm not preventing it, then MediaWiki mustn't be working properly, because if it were returning 404 Not Found for non-existent pages, I wouldn't be able to find them via Google, would I?
[edited by: tedster at 6:14 am (utc) on Mar 24, 2012]
[edit reason] switch to example.com [/edit]