Forum Moderators: phranque
Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:22.0) Gecko/20100101 Firefox/22.0
This worm arrives on a system as a file dropped by other malware or as a file downloaded unknowingly by users when visiting malicious sites.
It executes commands from a remote malicious user, effectively compromising the affected system. It connects to a website to send and receive information.
BrowserMatchNoCase x86_64 bad_bot
Order Deny,Allow
Deny from env=bad_bot
BrowserMatchNoCase x86_64; bad_bot
Order Deny,Allow
Deny from env=bad_bot
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} x86_64;
RewriteRule .* - [F]
[edited by: phranque at 2:15 pm (utc) on Mar 31, 2014]
[edit reason] disabled graphic smileys [/edit]
Non-alphanumeric characters should be escaped with a backslash (\).
Order Deny,Allow
Deny from env=bad_bot
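As an illustration of that escaping advice, here is a sketch that matches the Presto/2.12.388 token from the Opera UA quoted in the logs above, with the dots escaped so they match literal dots rather than acting as regex wildcards:

```apache
BrowserMatchNoCase Presto/2\.12\.388 bad_bot
Order Deny,Allow
Deny from env=bad_bot
```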
BrowserMatch GoogleBot keep_out
Order Deny,Allow
Order Allow,Deny
Agent: Opera/9.80 (Windows NT 6.2; Win64; x64) Presto/2.12.388 Version/12.15
The first version of the botnet was mainly involved in denial-of-service attacks and email spam, while version two of the botnet added the ability to steal Bitcoin wallets, as well as a program used to mine bitcoins itself.[2][20] Its spam capacity allows the botnet to spread itself by sending malware links to users in order to infect them with a Trojan horse, though later versions mostly propagate over social network sites, in particular through Facebook.
You don't want
Order Deny,Allow
(Others do, but I'm tolerably certain you don't.) That's the whitelisting format. It's for very large websites that can afford to lose some visitors, especially sites that are attractive enough that humans will try to make contact and ask for exemptions. Typically with this you say "Deny from all" and then add a short, specific list of people to allow.
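For reference, a minimal sketch of that whitelisting layout (the address range is just a placeholder, not a real recommendation):

```apache
Order Deny,Allow
Deny from all
Allow from 192.0.2.0/24
```

With Order Deny,Allow the Deny directives are evaluated first and Allow second, so the Allow line overrides "Deny from all" for the listed range, and everyone else is refused.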
Host: 79.143.191.206
/
Http Code: 200 Date: Mar 30 15:28:53 Http Version: HTTP/1.1 Size in Bytes: 37392
Referer: http://www.example.com/
Agent: Opera/9.80 (Windows NT 6.2; Win64; x64) Presto/2.12.388 Version/12.15
/page2.html
Http Code: 200 Date: Mar 30 15:28:54 Http Version: HTTP/1.1 Size in Bytes: 15453
Referer: http://www.example.com/
Agent: Opera/9.80 (Windows NT 6.2; Win64; x64) Presto/2.12.388 Version/12.15
Do you happen to know (or does anyone know) if it's practical to create a whitelist of allowed user-agent strings and block everything else?
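Mechanically it can be done with the same env-variable approach, flipped around. A sketch, where the browser tokens are purely illustrative (any request whose User-Agent doesn't match, including blank UAs, would get a 403, which is why this is usually considered risky):

```apache
SetEnvIf User-Agent "(Firefox|Chrome|Safari|Opera|MSIE)" allowed_ua
Order Allow,Deny
Allow from env=allowed_ua
```

With Order Allow,Deny the default is deny, so only requests that set allowed_ua get through. Note that user-agent strings are trivially forged, so a whitelist like this stops lazy bots, not determined ones.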
SetEnvIf User-Agent Opera keep_out
SetEnvIf User-Agent Ubuntu keep_out
RewriteCond %{HTTP_USER_AGENT} Linux [NC]
RewriteCond %{HTTP_USER_AGENT} !Linux;\ U;\ Android [NC]
RewriteRule .* - [F]
But in my case, being on a shared server
# Block User-Agent Strings
SetEnvIf User-Agent attach ban
SetEnvIf User-Agent "Advanced Email Extractor" ban
SetEnvIf User-Agent BlackWidow ban
SetEnvIf User-Agent "Sqworm/2.9.85-BETA" ban
SetEnvIf User-Agent Bot.mailto:craftbot@yahoo\.com ban
SetEnvIf User-Agent ChinaClaw ban
SetEnvIf User-Agent "OPR/20.0.1387.77" ban
SetEnvIf User-Agent 20100101 ban
Order Allow,Deny
Allow from all
Deny from env=ban
Mozilla/4.0 (compatible; MSIE 6.0; MSIE 5.5; Windows NT 5.0) Opera 7.02 Bork-edition [en]
Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; FunWebProducts; .NET CLR 1.1.4322; PeoplePal 6.2)
Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; MRA 5.8 (build 4157); .NET CLR 2.0.50727; AskTbPTV/5.11.3.15590)
Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727)
Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322)
Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; chromeframe/19.0.1084.52)
Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:22.0) Gecko/20100101 Firefox/22.0
Mozilla/5.0 (Windows NT 5.1; rv:13.0) Gecko/20100101 Firefox/13.0.1
Mozilla/5.0 (Windows NT 5.1; U; en) Opera 8.01
Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.02
Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.52 Safari/537.36
Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.63 Safari/537.36
Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.71 Safari/537.36
Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.72 Safari/537.36
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:21.0) Gecko/20100101 Firefox/21.0
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:22.0) Gecko/20100101 Firefox/22.0
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:5.0) Gecko/20100101 Firefox/5.0
Mozilla/5.0 (Windows NT 6.2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.116 Safari/537.36
Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.116 Safari/537.36
Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.71 Safari/537.36
Mozilla/5.0 (Windows NT 6.2; WOW64; rv:21.0) Gecko/20100101 Firefox/21.0
Mozilla/5.0 (Windows NT 6.2; WOW64; rv:22.0) Gecko/20100101 Firefox/22.0
Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/28.0.1500.52 Chrome/28.0.1500.52 Safari/537.36
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.22 (KHTML, like Gecko) Ubuntu Chromium/25.0.1364.160 Chrome/25.0.1364.160 Safari/537.22
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.31 (KHTML, like Gecko) Chrome/26.0.1410.63 Safari/537.31
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.110 Safari/537.36
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.52 Safari/537.36
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/28.0.1500.52 Chrome/28.0.1500.52 Safari/537.36
Mozilla/5.0 (X11; Linux x86_64; rv:21.0) Gecko/20100101 Firefox/21.0
Mozilla/5.0 (X11; Linux x86_64; rv:22.0) Gecko/20100101 Firefox/22.0
Mozilla/5.0 (X11; U; Linux x86_64; C) AppleWebKit/534.3 (KHTML, like Gecko) Qt/4.6.2 Safari/534.3
Mozilla/5.0 (X11; Ubuntu; Linux i686; rv:14.0; ips-agent) Gecko/20100101 Firefox/14.0.1
Mozilla/5.0 (X11; Ubuntu; Linux i686; rv:21.0) Gecko/20100101 Firefox/21.0
Mozilla/5.0 (X11; Ubuntu; Linux i686; rv:22.0) Gecko/20100101 Firefox/22.0
Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:20.0) Gecko/20100101 Firefox/20.0
Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:21.0) Gecko/20100101 Firefox/21.0
Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:22.0) Gecko/20100101 Firefox/22.0
Opera/9.80 (Windows NT 6.2; Win64; x64) Presto/2.12.388 Version/12.15
So I'm wondering if it would be possible to catch a lot of them by blocking old versions of Firefox and Chrome, but I don't have enough knowledge at this point to know where the version cutoffs should be, or the details of how to do it.
Firefox/[12]\b
Mozilla/[0-3]
MSIE [1-4]\.
Opera [3-9]
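Those cutoff patterns could be wired into the same env-variable scheme used earlier in the thread. A sketch (the cutoffs themselves are the suggestions above, not tested values; patterns containing spaces need quoting):

```apache
BrowserMatchNoCase Firefox/[12]\b ban
BrowserMatchNoCase Mozilla/[0-3] ban
BrowserMatchNoCase "MSIE [1-4]\." ban
BrowserMatchNoCase "Opera [3-9]" ban
Order Allow,Deny
Allow from all
Deny from env=ban
```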
RewriteCond %{HTTP_USER_AGENT} MSIE\ [56]\.\d [OR]
RewriteCond %{HTTP_USER_AGENT} Chrome/[1-8]\.\d [OR]
RewriteCond %{HTTP_USER_AGENT} Firefox/(3\.[0-5]|[567])\b
RewriteRule .* - [F]
Firefox/1?\d\b
Host: 66.249.85.24
/
Http Code: 200 Date: Apr 05 23:56:55 Http Version: HTTP/1.0 Size in Bytes: 9215
Referer: -
Agent: Mozilla/5.0 (Windows NT 6.1; rv:6.0) Gecko/20110814 Firefox/6.0 Google favicon
/favicon.ico
Http Code: 200 Date: Apr 05 23:56:55 Http Version: HTTP/1.0 Size in Bytes: 70
Referer: -
Agent: Mozilla/5.0 (Windows NT 6.1; rv:6.0) Gecko/20110814 Firefox/6.0 Google favicon
This looks like a Google favicon-fetching bot that includes Firefox/6.0 in the user-agent string. So is this one old version of Firefox that shouldn't be blocked? Does anyone know?
BrowserMatch ^-?$ keep_out
I also wonder why Google's favicon bot fetched the home page just before it got the favicon.
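If that Firefox/6.0 favicon fetcher does turn out to be legitimate Google traffic, one way to avoid catching it in an old-Firefox block is a negative condition. A sketch, reusing the Firefox/1?\d\b cutoff suggested above and exempting the exact build token seen in the log lines (whether that token is stable is an assumption):

```apache
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} Firefox/1?\d\b [NC]
RewriteCond %{HTTP_USER_AGENT} !Gecko/20110814\ Firefox/6\.0 [NC]
RewriteRule .* - [F]
```

Since user-agents are forgeable, verifying the requesting IP against Google's published ranges would be a sturdier exemption than the UA string alone.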