Oh, lordy. I think you're trying to do too much at once and you will only make yourself miserable. Start by studying your existing htaccess. The one you had before you started tinkering with it. Be sure you understand everything that's already there. And then you can start fine-tuning it. Make sure the existing rules are exactly right before you start adding more.
RedirectMatch
This has to go away. Any rule that begins with Redirect or RedirectMatch MUST be recast as a RewriteRule, because mod_alias and mod_rewrite are separate modules that run independently of each other, so things will happen in the wrong order.
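To show what I mean, here's a sketch of the recast. The paths and domain are made up for illustration-- substitute your own:

```apache
# mod_alias version: runs on its own schedule, out of order with mod_rewrite
RedirectMatch 301 ^/oldpage\.html$ /newpage.html

# Recast as mod_rewrite, so it fires in sequence with your other RewriteRules
RewriteRule ^oldpage\.html$ https://www.example.com/newpage.html [R=301,L]
```

Note that in a RewriteRule the pattern has no leading slash, and the target should be the full protocol-plus-domain form for an external redirect.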
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} [Bb]aiduspider
RewriteRule .* - [R=403,L]
</IfModule>
<snip>
RewriteEngine On
<snip>
RewriteEngine On
:: wanders off sobbing brokenly ::
Yes, I think the RewriteEngine is now On. All those RewriteRules need to be gathered into one place and arranged in the right order. Then you can hammer them all into the right form. The first step is to ditch the <IfModule> envelope. Not its contents, just the envelope itself. You either have mod_rewrite or you don't-- and if you don't, you need to change hosts yesterday. Besides, your CMS wouldn't work without mod_rewrite.
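Boiled down, the quoted block ends up looking something like this-- one RewriteEngine On for the whole file, no envelope:

```apache
# Said once, near the top, for the entire htaccess
RewriteEngine On

# Lockout rules come first (the Baiduspider block is better done in
# mod_setenvif anyway-- see below)
RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC]
RewriteRule . - [F]

# ... then redirects, then rewrites, in that order ...
```

The ordering comment is the important part: access controls, then external redirects, then internal rewrites.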
Options +FollowSymlinks
<snip>
Options -Indexes
Collect all your Options into a single line. Put them near the top of your htaccess where you can keep an eye on them. Other things that go near the top are one-liners such as ErrorDocument statements and generic AddType or Expires lines. This has nothing to do with Apache execution; it's for your own sanity.
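Putting that together, the top of the file might look like this. The filenames and types here are placeholders-- use your own, and note the Expires lines need mod_expires to be available:

```apache
# Housekeeping: everything in one place where you can see it
Options +FollowSymlinks -Indexes

ErrorDocument 404 /error/404.html

AddType application/font-woff2 .woff2

ExpiresActive On
ExpiresByType image/png "access plus 1 month"
```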
Extra URL trailing slashes "/"
Slow down. You can't go wantonly chopping off slashes-- or adding them on-- and not expect consequences.
If the URL represents a real, physical directory, it will pick up the slash without you having to do anything about it. You just need to make sure it doesn't also display "index.php" or whatever the index filename is.
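The standard pattern for suppressing the index filename looks like this. I'm assuming "index.php" here-- swap in whatever your index file is really called, and your own domain:

```apache
# Only fire on the original client request, not on an internal rewrite
# (otherwise you get an infinite redirect loop)
RewriteCond %{THE_REQUEST} \ /+([^/]*/)*index\.php[?\ ]
# Send the request back to the bare directory URL
RewriteRule ^(([^/]+/)*)index\.php$ https://www.example.com/$1 [R=301,L]
```

The check against THE_REQUEST is what keeps this from colliding with your CMS's own internal rewrites to index.php.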
If your URL does not represent a real, physical directory, then you need two separate rules. First a redirect to grab your users by the scruff of the neck and force them to use the URL format you want. And then a rewrite to fetch the content from wherever it really lives. But not yet.
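In skeleton form, with made-up names-- suppose the URL you want people to use is /widgets/ and the content really lives in display.php:

```apache
# 1. External redirect: force the trailing-slash form on the user
RewriteRule ^widgets$ https://www.example.com/widgets/ [R=301,L]

# 2. Internal rewrite: quietly fetch the content from where it really lives
RewriteRule ^widgets/$ /display.php?page=widgets [L]
```

The redirect is visible (the address bar changes); the rewrite is invisible (the address bar keeps the pretty URL).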
<FilesMatch "(?i)((\.tpl|\.ini|\.log|(?<!robots)\.txt))">
Order deny,allow
# Deny from all
</FilesMatch>
This is way too complicated and probably not even necessary. I seriously doubt your logs are slopping around loose in the same directory as your site; far more likely they're aliased to a completely different part of the server where robots could never get to them. What you do need is a simple
<Files "robots.txt">
Order Allow,Deny
Allow from all
</Files>
to override any Deny directives you've got lying around.
RewriteCond %{HTTP_USER_AGENT} [Bb]aiduspider
RewriteRule .* - [R=403,L]
An admirable sentiment :) But you don't need mod_rewrite for this. The easiest way to do simple User-Agent blocks is to run up a list in mod_setenvif like this:
BrowserMatchNoCase BaiduSpider get_lost
BrowserMatch Slurp get_lost
BrowserMatch AppEngine get_lost
et cetera. And then at the top of your Deny directives, before you start listing unwanted IPs, say
Deny from env=get_lost
If you do need mod_rewrite to lock someone out, the rule ends in a simple [F] flag.
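That is, if the block genuinely depends on something only mod_rewrite can test:

```apache
# [F] answers 403 Forbidden and stops processing on its own
# (it implies L, so no other flags are needed)
RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC]
RewriteRule . - [F]
```

Note also the [NC] flag in place of the [Bb] dance-- same effect, easier to read.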