Forum Moderators: phranque
144.217.15.xxx [06/May/2018:19:46:11 GET /robots.txt HTTP/1.1 500 - - Mozilla/5.0 (compatible; SiteExplorer/1.1b; +http://siteexplorer.info/Backlink-Checker-Spider/)
144.217.15.xxx [06/May/2018:19:46:20 GET / HTTP/1.1 500 - - Mozilla/5.0 (compatible; SiteExplorer/1.1b; +http://siteexplorer.info/Backlink-Checker-Spider/)
SetEnvIf User-Agent SiteExplorer keep_out
158.69.252.xxx [06/May/2018:09:03:03 GET /robots.txt HTTP/1.1 500 - - Mozilla/5.0 (compatible; SiteExplorer/1.1b; +http://siteexplorer.info/Backlink-Checker-Spider/)
158.69.252.xxx [06/May/2018:09:03:14 GET / HTTP/1.1 500 - - Mozilla/5.0 (compatible; SiteExplorer/1.1b; +http://siteexplorer.info/Backlink-Checker-Spider/)
167.114.219.xx [09/May/2018:04:54:43 GET /robots.txt HTTP/1.1 500 - - Mozilla/5.0 (compatible; SiteExplorer/1.1b; +http://siteexplorer.info/Backlink-Checker-Spider/)
167.114.219.xx [09/May/2018:04:54:56 GET / HTTP/1.1 500 - - Mozilla/5.0 (compatible; SiteExplorer/1.1b; +http://siteexplorer.info/Backlink-Checker-Spider/)
I was expecting a 403 due to a UA rule (below) but received a 500 instead?

Do all intended 403s receive a 500 instead, or only some of them? If it's all of them, the next question becomes: is your host doing it on purpose (why, for ### sake?) or does it reveal some underlying ineptitude on their part? If it's only some of them, we'll need to take a closer look and try to figure out the variable.
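One quick way to answer the "all or only some" question is to tally the status codes the bot actually received. A small shell helper, as a sketch: the `$9` field position assumes the standard combined log format, and the function name and log path in the usage line are placeholders, so adjust both to your setup.

```shell
#!/bin/sh
# Tally the HTTP status codes served to a given bot, to see whether
# every blocked request got a 500 or only some of them.
# Assumes the standard combined log format, where the status code
# is the 9th whitespace-separated field.
tally_bot_statuses() {
    ua_pattern=$1    # substring of the bot's User-Agent string
    logfile=$2       # path to the access log
    grep "$ua_pattern" "$logfile" \
        | awk '{print $9}' \
        | sort | uniq -c | sort -rn
}
```

Usage, e.g.: `tally_bot_statuses SiteExplorer /var/log/apache2/access.log` — a count next to anything other than 403 tells you how many "blocked" hits went astray.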
[Thu May 10 18:07:07 2018] [error] [client 148.251.176.xx] client denied by server configuration: /home/example.com/public_html/subdir/2009
SetEnvIf User-Agent SiteExplorer keep_out
Remember that mod_setenvif itself does not issue 403s.
What does this then return as an error number?

Nothing. The point is that mod_setenvif in and of itself does not do anything about access. Its only function is to set environment variables, which can later be used by other mods, including mod_authdoodad and mod_rewrite, both of which can make access decisions.
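As an aside, mod_rewrite can also act on the User-Agent directly, without going through an environment variable at all. A minimal sketch, using the UA pattern from the logs above:

```apache
# mod_rewrite route: return 403 Forbidden to this bot directly.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} SiteExplorer
RewriteRule ^ - [F]
```

The `[F]` flag is what actually produces the 403 here; the SetEnvIf approach below keeps the matching and the access decision in separate modules instead.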
Order Allow,Deny
Allow from all
Deny from env=keep_out
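Note that on Apache 2.4 the Order/Allow/Deny directives are deprecated in favor of mod_authz_core's Require. A sketch of the equivalent rule, assuming the same keep_out variable set by the SetEnvIf line above:

```apache
# Apache 2.4+ equivalent using mod_authz_core.
<RequireAll>
    # Let everyone in by default...
    Require all granted
    # ...except requests where SetEnvIf flagged keep_out.
    Require not env keep_out
</RequireAll>
```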