
Forum Moderators: Ocean10000 & phranque

Require expr breaks letsencrypt certbot

letsencrypt reports apache plugin not working

     
3:38 pm on Sep 12, 2019 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member dstiles is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 14, 2008
posts:3286
votes: 19


A possible syntax problem?

Require expr %{REQUEST_URI} in { '/robots\.txt' }

This seems to work, in that requests for robots.txt are all allowed and sudo apachectl configtest returns no errors.

When letsencrypt's certbot tries to renew certs it reports...

"The error was: PluginError('There has been an error in parsing the file /etc/apache2/use-setenv.conf on line 270: Syntax error',)"

The above Require is on line 270 in that file. When I remove that line certbot performs correctly. So something is not right. Any ideas, folks?
12:42 am on Sept 15, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member penders is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2006
posts: 3153
votes: 7



Require expr %{REQUEST_URI} in { '/robots\.txt' }


This directive appears to be syntactically valid (as the config test confirms), so it's unclear what that "syntax error" is referring to.

The backslash-escaped "literal" dot would seem to be unnecessary, as this is a string argument, not a regex - though since the backslash escape is permitted here, it shouldn't make any difference. (Aside: why use the "in" operator rather than equality when matching a single argument?)

However, presumably there is more to this rule block than what you have posted? In isolation this directive doesn't make a whole lot of sense, as it appears to "only" allow requests for "/robots.txt" - which would certainly cause certbot to fail to renew the cert (and render your site pretty useless).
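For a single value, string equality would be the more direct test - something like this (an untested sketch, using the same URL as above):

```apache
# Hypothetical alternative: plain string equality instead of the "in" wordlist
Require expr %{REQUEST_URI} == '/robots.txt'
```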
1:52 am on Sept 15, 2019 (gmt 0)

Administrator

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 10, 2004
posts:11873
votes: 245


Require expr %{REQUEST_URI} in { '/robots\.txt' }


as penders noted:
The backslash-escaped "literal" dot would seem to be unnecessary, as this is a string argument, not a regex


in - string contained in wordlist

source: https://httpd.apache.org/docs/current/expr.html#other

"in" specifies a "string" rather than a "regular expression".
2:11 pm on Sept 15, 2019 (gmt 0)

dstiles (Senior Member from GB)


Thanks for the replies, guys.

> backslash escaped "literal" dot would seem to be unnecessary

I'm going on examples found online plus the fact that expr, as I understand it, implies regex?

> Why use the "in" operator and not equality

I found no example of a single argument that would work; this seemed to work. I did not notice the "string" part of "in". The example on the apache site from which I derived this (eventually!) gives...
Require expr %{HTTP_USER_AGENT} != 'BadBot'

I tried a variety of equality tests and (on a different requirement) partial negations from the "Comparison operators" table under "Binary operators" on the apache site, and got nowhere (page: docs/2.4/expr.html, to which the link above resolves). Obviously I'm not understanding what I'm reading. I tried
=~ String matches the regular expression

which results in an error, with or without {}.

The Other section is bereft of examples so I was reliant on examples from elsewhere but could find nothing suitable.
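(For what it's worth, the expr docs show the right-hand side of =~ written as a delimited regex - /regex/ or m#regex# - rather than a quoted string, which may be why the attempts above errored. An untested sketch:)

```apache
# Regex form: the pattern after =~ must be delimited, e.g. m#...# or /.../
Require expr %{REQUEST_URI} =~ m#^/robots\.txt$#
```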

> presumably there is more to this rule block

This is part of an opening block for a "Require none" set that allows good bots and blocks bad bots (see an earlier, recent posting of mine). Other than the letsencrypt problem this seems to work fine but I'm always open to correction. :)
5:22 pm on Sept 15, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15934
votes: 889


<tangent>
Option B is to continue doing what you presumably did in 2.2: make a <Files> envelope for robots.txt, containing the single line "Require all granted"
</tangent>
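That envelope would be just a few lines - a sketch of the <Files> approach:

```apache
# Grant robots.txt to everyone, regardless of stricter Require rules elsewhere
<Files "robots.txt">
    Require all granted
</Files>
```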
9:49 pm on Sept 15, 2019 (gmt 0)

dstiles (Senior Member from GB)


I never had 2.2. Straight to 2.4 at the start of the year. I'll keep that in mind if I can find no other solution, though. Thanks.