So, is there any way I can use .htaccess to serve all robots a 403 when they try to visit a URL that contains certain words? If so, how would I do it? I'd like to be able to filter for a few keywords so those URLs don't get indexed anymore; after that I'll get going on the long and tedious task of removing all those URLs from the Google index...
Unless, of course, duplicate content won't be a problem in this case and the extra "pages" might help my rankings? ;)
Thanks,
Matthew
Take a look at these...
[webmasterworld.com...]
[dmoz.org...]
[webmasterworld.com...]
I don't know exactly what rule you would need, since I don't know what your URLs look like. In any case, you need a two-part rule using RewriteCond and RewriteRule. Here you would use RewriteCond to check the User-Agent and see whether it is a bot, and a second condition to match the keywords in the URL. Something along these lines might work (see the sketch below).
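This is an untested sketch, not a drop-in rule: "widget" and "gadget" are placeholder keywords, and the bot list is just a few common crawler names, so replace both with whatever matches your own URLs and the bots you want to block.

# Untested sketch for .htaccess: return 403 to common crawlers
# when the URL path contains certain words.
RewriteEngine On

# Match known search engine user agents (case-insensitive).
# Placeholder list - add or remove bots as needed.
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp|msnbot|bingbot) [NC]

# ...AND the requested path contains one of the keywords.
# "widget" and "gadget" are placeholders.
RewriteCond %{REQUEST_URI} (widget|gadget) [NC]

# Forbid the request (403) and stop processing further rules.
RewriteRule .* - [F,L]

Note that consecutive RewriteCond lines are ANDed by default, so the 403 is only served when both the user agent and the URL keyword match.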
There is one issue with this, however: serving different content to search engine bots (cloaking) is something you can get penalized for. I don't know whether that applies to serving a 403 as well, though.