Forum Moderators: phranque
RewriteEngine on
# -FrontPage-
IndexIgnore .htaccess */.?* *~ *# */HEADER* */README* */_vti*
<Limit GET POST>
#The next line modified by DenyIP
order allow,deny
#The next line modified by DenyIP
#deny from all
allow from all
</Limit>
<Limit PUT DELETE>
order deny,allow
deny from all
</Limit>
AuthName example.com
AuthUserFile /home/elpmaxe/public_html/_vti_pvt/service.pwd
AuthGroupFile /home/elpmaxe/public_html/_vti_pvt/service.grp
<Files 403.shtml>
order allow,deny
allow from all
</Files>
order allow,deny
deny from 188.138.188.34
deny from 178.159.37.61
deny from 93.188.34.197
deny from 89.109.2.77
deny from 31.130.2.79
deny from 2.224.128.114
deny from 51.15.88.249
deny from 93.100.128.3
deny from 91.215.106.53
deny from 104.131.214.218
deny from 91.197.174.108
# php -- BEGIN cPanel-generated handler, do not edit
# Set the “ea-php70” package as the default “PHP” programming language.
<IfModule mime_module>
AddType application/x-httpd-ea-php70___lsphp .php .php7 .phtml
</IfModule>
# php -- END cPanel-generated handler, do not edit
RewriteCond %{HTTPS} off
RewriteCond %{HTTP:X-Forwarded-SSL} off
RewriteCond %{HTTP_HOST} ^example\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.example\.com$
RewriteCond %{REQUEST_URI} !^/\.well-known/acme-challenge/[0-9a-zA-Z_-]+$
RewriteCond %{REQUEST_URI} !^/\.well-known/cpanel-dcv/[0-9a-zA-Z_-]+$
RewriteCond %{REQUEST_URI} !^/\.well-known/pki-validation/(?:\ Ballot169)?
RewriteCond %{REQUEST_URI} !^/\.well-known/pki-validation/[A-F0-9]{32}\.txt(?:\ Comodo\ DCV)?$
RewriteRule ^/?$ "https\:\/\/example\.com\/" [R=301,L]
deny from a.b.c.d
deny from a.b.c.0/24
The second line will deny a.b.c.0 through a.b.c.255, or 256 IP addresses. The /24 is called CIDR [en.wikipedia.org...] format; very handy to learn.
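If CIDR arithmetic is new, Python's standard ipaddress module is a quick way to confirm what a prefix actually covers. A small sketch, using the 188.138.188.0/24 range from this thread as the example:

```python
import ipaddress

# A /24 leaves 32 - 24 = 8 host bits, so it spans 2**8 = 256 addresses.
net = ipaddress.ip_network("188.138.188.0/24")

print(net.num_addresses)     # 256
print(net[0], "-", net[-1])  # 188.138.188.0 - 188.138.188.255
```

The same call works for any prefix length, so you can sanity-check a deny-from line before you deploy it.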
Of course, since I am now actually paying attention, I have decided to undertake some maintenance
Wheee! That's always fun, and you gotta start somewhere.
I'll address the errant Moldovans. To find out who is visiting your site, download your raw access log, available in cPanel. This will tell you exactly who is visiting you and which bots (software) are attacking you. If you have a flat-file HTML web site, they cannot damage your site, but they are annoying. You will need to decide how much effort you want to put into killing them.
188.138.188.34 - - [03/Oct/2018:05:02:14 -0700] "GET /delineament-door-foo/ HTTP/1.0" 404 - "http://example.com/delineament-door-foo/" "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.181 Safari/537.36 Kinza/4.7.2"
188.138.188.34 - - [03/Oct/2018:05:02:15 -0700] "GET / HTTP/1.0" 404 - "http://example.com/delineament-door-foo/" "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.181 Safari/537.36 Kinza/4.7.2"
188.138.188.34 - - [03/Oct/2018:05:02:27 -0700] "GET /delineament-door-foo/ HTTP/1.0" 404 - "http://example.com/delineament-door-foo/" "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.112 Safari/537.36"
188.138.188.34 - - [03/Oct/2018:05:02:27 -0700] "GET / HTTP/1.0" 404 - "http://example.com/delineament-door-foo/" "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.112 Safari/537.36"
188.138.188.34 - - [03/Oct/2018:05:02:31 -0700] "GET /delineament-door-foo/ HTTP/1.0" 404 - "http://example.com/delineament-door-foo/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.87 Safari/537.36 OPR/54.0.2952.51"
188.138.188.34 - - [03/Oct/2018:05:02:31 -0700] "GET / HTTP/1.0" 404 - "http://example.com/delineament-door-foo/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.87 Safari/537.36 OPR/54.0.2952.51"
[edited by: phranque at 1:37 am (utc) on Oct 9, 2018]
[edit reason] some acronyms are worse than others [/edit]
But if you personally fix your htaccess, can you be certain it will remain the way you fixed it? Or will your host barge in and replace the whole thing at any random time without telling you?
The quoted list of RewriteConds doesn't seem to belong with the single RewriteRule given, since the body of the rule explicitly refers only to requests for the root. Which brings us to ...
RewriteRule ^/?$ "https\:\/\/example\.com\/" [R=301,L]
If this is, litteratim, what your hosts wrote, do not allow them to make any more RewriteRules for you. (I don't much care for the form of the domain-name-canonicalization conditions either, but one thing at a time.)
inetnum: 188.138.188.0 - 188.138.188.255
netname: STARNETMD
descr: SC STARNET SRL
descr: Chisinau, Moldova
descr: Region: Chisinau
country: MD
deny from 188.138.188.0/24
or
deny from 188.138.128.0/17
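If in doubt about which block a later offender falls into, the ipaddress module can check membership directly. A sketch, where 188.138.189.25 is a hypothetical future address from the wider allocation:

```python
import ipaddress

ip = ipaddress.ip_address("188.138.189.25")  # hypothetical future offender

# The narrow /24 misses it; the wider /17 supernet catches it.
print(ip in ipaddress.ip_network("188.138.188.0/24"))  # False
print(ip in ipaddress.ip_network("188.138.128.0/17"))  # True
```

This is exactly the trade-off in the two deny lines above: the /24 is precise, the /17 casts a wider net over the same provider.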
In any event, is a redirect the correct way to force https, or isn't it? If it is, what should the code actually look like? And should it be the first set of commands in the .htaccess file? If it isn't the right way, what is the correct way to do it?
A redirect is the only way to force https. How else would you tell the visitor “This request is unacceptable and you need to make a fresh request using the https protocol”? You can’t quietly rewrite to HTTPS the way you can prettify a URL.
RewriteCond %{REQUEST_URI} !^/robots\.txt
RewriteCond %{HTTPS} !on [OR]
RewriteCond %{HTTP_HOST} !^(example\.com)?$
RewriteRule (.*) https://example.com/$1 [R=301,L]
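To see how those three conditions interact, here is a hedged Python sketch of the decision logic. It is purely illustrative, not anything Apache executes, and the inputs are hypothetical; the key point is that the [OR] joins only the last two conditions, while the robots.txt exemption is always ANDed in:

```python
import re

def should_redirect(uri: str, https_on: bool, host: str) -> bool:
    # RewriteCond %{REQUEST_URI} !^/robots\.txt  -- robots.txt is exempt
    if re.match(r"/robots\.txt", uri):
        return False
    # RewriteCond %{HTTPS} !on [OR]
    # RewriteCond %{HTTP_HOST} !^(example\.com)?$
    return (not https_on) or (re.fullmatch(r"(example\.com)?", host) is None)

print(should_redirect("/robots.txt", False, "www.example.com"))  # False
print(should_redirect("/page.html", False, "example.com"))       # True
print(should_redirect("/page.html", True, "www.example.com"))    # True
print(should_redirect("/page.html", True, "example.com"))        # False
```

So everything except robots.txt gets funneled to https://example.com/, whether the problem is the protocol, the hostname, or both.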
The business about robots.txt is something I discovered the first time I moved a real site (as opposed to my test site, which tries everything first) to https. Some legitimate robots seem to get confused when they meet a redirect after requesting robots.txt, so I decided it's safer to serve robots.txt only at the originally requested hostname and protocol, even if it’s otherwise wrong for the site.
The guides I have read say that I can do it through cPanel,
Funny how everyone assumes that all hosts use cPanel. Mine doesn’t; it’s one of the bigger hosts, so they roll their own. I think you can do a few access-control things in the Control Panel, but in general you proceed directly to making your own htaccess.
But if you personally fix your htaccess, can you be certain it will remain the way you fixed it? Or will your host barge in and replace the whole thing at any random time without telling you?
Having a host that changed my htaccess would be a deal breaker and cause me to change hosts very quickly... and I *hate* changing hosts.
To start I would block 188.138.188.0/24, and then monitor the rest of the range. If, for example tomorrow they use 188.138.189.* you will then expand your block range for this company.
But all hosts have things that are irritating; it just depends on whether you can live with them.
17 RewriteEngine on
18 RewriteRule ^check_work/$ ./barb/pygmalion.php?checkwork
19 RewriteRule ^barb-(.*)/$ ./barb/pygmalion.php?$1
20 RewriteRule ^pygmalion-(.*)/$ ./barb/pygmalion.php?$1
21 RewriteRule ^cetera-(.*)/$ ./barb/pygmalion.php?$1
22 RewriteRule ^auntie-(.*)/$ ./barb/pygmalion.php?$1
...
416 RewriteRule ^valhalla-(.*)/$ ./barb/pygmalion.php?$1
417 RewriteRule ^amplified-(.*)/$ ./barb/pygmalion.php?$1
418 RewriteRule ^notecards-(.*)/$ ./barb/pygmalion.php?$1
Was pygmalion a virus or something circa 2011?
If so it was an exceedingly rare one, since site search turns up nothing. I think you said at the outset that you've had this site for ages, so it isn't just someone else's htaccess from the same domain name in 2011.
Now my email clients are suddenly having varying success pulling emails. I suspect that I have caused these issues by trying to change http to https and www.example.com to example.com. Before I blindly try to revert this, I decided to get actual counsel.
I always upload a new version of my htaccess and save the old htaccess as a different name, so if it does not work I can quickly reverse the changes.
How admirably thorough. I only keep copies if I've made massive, wholesale changes. Now, what I do consistently do is this: when I've changed htaccess, I leave the text file open while uploading and confirming that the site doesn’t crash. If I do hit a 500 error, meaning some kind of syntax goof, I switch on the text editor's Show Changes highlighter, making it easy to home in on possible problems.
the email problems should have nothing to do with apache unless you are also hosting your own webmail server (i.e. serving mail over the HTTP protocol rather than SMTP & IMAP/POP3).
when you switched web hosts did that affect your email hosting in any way?
assuming your email clients are trying to access SMTP and IMAP/POP3 servers, the problem is probably a matter of who and where your (new?) email service is hosted and whether or not your DNS is properly configured for that.
The security token is missing from your request
but I can log in, and then I can choose Horde, Roundcube or SquirrelMail.
A fatal error has occurred
Session cookies will not work without a FQDN and with a non-empty cookie domain. Either use a fully qualified domain name like "http://www.example.com" instead of "http://example" only, or set the cookie domain in the Horde configuration to an empty value, or enable non-cookie (url-based) sessions in the Horde configuration.
Details have been logged for the administrator.
I spent time this afternoon on the email.
Always have backed up versions of your htaccess. If you make a spelling mistake your site might go down
I copied the original into a txt file on my computer when I created the obfuscated one for here. And I made a copy of the corrected version too. I don't have anything major riding on the webpages working, but I do value my time. And fixing it blind would be a timesink.
You could check your htaccess at [htaccesscheck.com...]; it is not bullet-proof, but it will check basic syntax.
Syntax checks out ok!
You can also add comments to your htaccess by putting a "#" as the first character. Comments are always good because if you come back to your htaccess after a couple of months you may forget why you did something.
That is wonderful advice. I had thought to do that a little while after I made the changes. Right after I had gone back in and annotated what the block of commands did, I came here and read this comment. Spot on.
since your email- and webmail-related issues are irrelevant to this (Apache) forum, i would suggest that you start a new thread in a relevant forum to discuss those problems - perhaps the Website Technology Issues [webmasterworld.com] forum.
# Block spammy referrer
RewriteCond %{HTTP_REFERER} ^http://.*mydomain\.com/that-page/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*example\.foo\.foofoo\.pw [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*in\.foofroo\.foo\.foo\.in [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*foo\.confoos\.foo\.in [NC]
RewriteRule .* - [F]
RewriteCond %{HTTP_REFERER} \b(example|otherexample|thirdexample)\b
RewriteRule ^that-page - [F]
If the request isn't for that-page, the server does not need to backtrack and evaluate conditions at all.
RewriteCond %{HTTP_REFERER} \b(mydomain\.com/that-page/|example\.foo\.foofoo\.pw|in\.foofroo\.foo\.foo\.in|foo\.confoos\.foo\.in)\b
RewriteRule ^that-page - [F]
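A quick way to sanity-check a combined pattern like this before deploying it is to try it against sample Referer strings with Python's re module (the referer URLs here are made-up examples):

```python
import re

# The combined referer pattern from the condensed RewriteCond above.
blocked = re.compile(
    r"\b(mydomain\.com/that-page/"
    r"|example\.foo\.foofoo\.pw"
    r"|in\.foofroo\.foo\.foo\.in"
    r"|foo\.confoos\.foo\.in)\b"
)

# A referer containing one of the spam domains matches ...
print(bool(blocked.search("http://example.foo.foofoo.pw")))  # True
# ... while an unrelated referer does not.
print(bool(blocked.search("http://innocent.example.net/")))  # False
```

Python's regex dialect is close enough to PCRE for alternations and \b word boundaries that this kind of offline check catches most typos before they hit the live server.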
Don't use [NC] unless you have genuinely seen more than one casing of the offending domain.
I am not sure what you are saying. 40% of the hits to my domain were direct calls to, and referred calls from, the nonexistent that-page. And I don't actually know what [NC] does. And I'm unclear on the contextual meaning of the term casing in this instance, as well.
The thought occurred to me this morning, as I reviewed my visitor logs, that I have an inordinate number of hits looking for my aforementioned nonexistent page, allegedly coming from a referrer that claims to be that very nonexistent page. So I thought: "If I block the nonexistent referrer, then I won't have to block the individual random IPs that keep looking for the nonexistent page."
Why bother to block at all? So what if these hits come from some page that has a non-valid link to your site. They get 404s.
ErrorDocument 404 /custom-error-page.html
what is "that-page" supposed to be? Is that the nonexistent webpage that is being called for by the referrers? Is it "/that-page", assuming the directory on the server, or does it have to be a complete web address (mydomain.com/that-page)?
I now realize I read your post too fast, and my brain inserted %{REQUEST_URI} where you actually said Referer ... but I think we're still on the right track. I assumed that the “aforementioned nonexistent page” was some specific page. And if so, you should absolutely put it in the body of the rule, since it only ever applies to requests for that page. In fact, since the page doesn't exist in the first place, why bother with conditions at all? Block them, regardless.
RewriteRule ^nonexistent-page - [R=404]
This means: return a 404 response, but do it manually without putting the server to the work of looking for the page. To the visitor, it will look identical to the ordinary 404 that they would have ended up with anyway. Normally we think of the [R] flag as various kinds of redirects, like 301 or 302, but in fact you can append absolutely any numerical code.
lucy24, you need to remember that I am basically a moron.
By way of introduction, the only reason I am a webmaster is that I wanted a domain so I could have permanent email addresses and put some pictures up to be shared with friends and family.
Sorry, with all the topic drift in this thread, it wasn't clear whether you'd determined the hits are automated. If they are, you have some blocking choices.
Read here: Blocking Methods [webmasterworld.com]