Forum Moderators: Robert Charlton & goodroi


Bot traffic designed to lower user experience metrics?

         

Sgt_Kickaxe

5:22 pm on Aug 9, 2021 (gmt 0)



- New site launched in April, fully indexed, 45 articles and just starting to receive Google traffic.

Someone started running a script against the site last week which loads every page exactly once every 6-9 hours, each visit bouncing in 0.4 seconds. Random user agents: 8 different versions of the Chrome browser (always Chrome), and always from a Windows machine. While the underlying operating system is always Windows 10, the user agents report themselves as Windows 2000, 7, 10, Android 10 and Linux. Of course the IP is different every cycle, and the referrer is 80% blank and 20% a non-https Google URL which doesn't actually exist.

Were the site not so new, this would be hard to detect. The fact that it reports itself as all different versions of Chrome suggests it's not just trying to hide but to report bad metrics back to Google. I'm letting it run against the site just to gather more data; the site owner is more interested in knowing who is doing it than how or why.

Regardless, have you seen this type of metrics-targeting bot activity recently, and how did you nuke it? What happened?

Sgt_Kickaxe

5:45 pm on Aug 15, 2021 (gmt 0)



.... and then there were two. Bot one is continuing its daily activities, and bot two is loading the WordPress wp-login page exactly 84 times per day, 6 days in a row.

It's not a WordPress site. Are new sites being targeted more heavily because they are weak? It just seems like a lot of effort against a site too new to threaten other sites, yet.

aristotle

12:52 am on Aug 16, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



What you describe has the characteristics of a botnet. The first time I saw one in the logs of one of my sites, years ago, I thought it might be preparation for a DDoS attack, but so far all of them have remained at far too low a level of activity to have any real effect. A serious DDoS attack would involve thousands of hits per second.

Most likely your site is only one of many being targeted by these botnets. They may be new botnets in the process of being expanded, where every time a new device is added, all the devices already in the net are rechecked as well. At any rate, that's the best explanation I've ever been able to come up with.

Edited for clarity

FranticFish

4:54 am on Aug 18, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If a bot or person is directly requesting login / admin pages then that at least is something that you can take action on.

For pages that only you have any business using, you could implement an IP address whitelist; this is a common feature on hosting control panels these days. I'm not at all familiar with WordPress, but I imagine there are plugins that let you define one or two IPs that are allowed to log in, and block the rest.

Just in case you're not aware: even if you have a fixed IP, your ISP can change it without warning. This has happened to me more than once over the years. I have both my own IP and my backend dev's whitelisted, so if one of us is accidentally locked out, the other can restore access easily.
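To make that concrete, here's a minimal .htaccess sketch of such a whitelist for an Apache host. The two IP addresses are documentation-range placeholders (substitute your own), and the Order/Deny/Allow directives assume Apache 2.2-style access control:

```apache
# Deny the login page to everyone except two whitelisted addresses
# (203.0.113.10 and 198.51.100.22 are placeholder IPs - substitute your own)
<FilesMatch "^wp-login\.php$">
Order deny,allow
Deny from all
Allow from 203.0.113.10
Allow from 198.51.100.22
</FilesMatch>
```

With two addresses allowed, losing one to an ISP reassignment still leaves the other able to restore access.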

Pjman

1:37 pm on Aug 18, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



If you use a CDN like Cloudflare, just monitor the bot's signature and put a rule set in place to block that traffic. I highly suggest blocking traffic from countries that would have zero interest in your site.
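For Cloudflare specifically, a country block like that can be sketched as a firewall rule with the action set to "Block" and an expression along these lines. The country codes are placeholders, and the exact field name varies between Cloudflare's older and newer rule engines (e.g. ip.geoip.country vs ip.src.country), so check your dashboard's expression builder:

```
(ip.geoip.country in {"XX" "YY"})
```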

I had a major problem with bot traffic on two of my sites. On my ecom site (constant brute-force logins to everything, DDoS, whatever would work for them; a competitor just looking for trouble), I blocked all Tor traffic and any country that had never bought a product from me in the previous 4 years. Presto chango! Not a single incident in 3 years now.

I have an ad-supported site with great rankings. I noticed a huge spike in single-page bot visits (under 3 seconds each) over a month. I analyzed the traffic and found the stream of countries they were using. Again I blocked all traffic from any country that had shown no interest in my site over the last 4 years. Bang! All traffic back to normal, and rankings have held for years now.

aristotle

9:12 pm on Aug 18, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



.... and then there were two. Bot one is continuing its daily activities, and bot two is loading the WordPress wp-login page exactly 84 times per day, 6 days in a row.

It's not a WordPress site.

For a non-WordPress site, it's fairly simple to block these nuisance probes. Here's some .htaccess code that blocks access to some of the most commonly requested WordPress files (with the dots escaped so they match literally):
# BLOCK FILES
<FilesMatch "^(wp-config\.php|update\.php|xmlrpc\.php|wp-login\.php|license\.txt)">
Order allow,deny
Deny from all
Satisfy All
</FilesMatch>
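If your host runs Apache 2.4+ with the old mod_access_compat directives disabled, the same block can be written with the newer Require syntax (a sketch, assuming the standard mod_authz_core module is loaded, which it is by default):

```apache
# BLOCK FILES - Apache 2.4+ equivalent of the Order/Deny/Satisfy block
<FilesMatch "^(wp-config\.php|update\.php|xmlrpc\.php|wp-login\.php|license\.txt)">
Require all denied
</FilesMatch>
```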

As for the other activity that the OP describes, I still believe that it's coming from a botnet. But it's harmless as long as the level of activity stays low. Usually you can block many of these requests, but in my opinion this involves more time and trouble than it's worth.

robzilla

12:02 pm on Aug 19, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I agree it's mostly harmless; certainly don't worry about this sort of thing "lower[ing] user experience metrics" (what made you think of that anyway?). Maybe it's a scraper, screenshotter, link checker, or something like that. Difficult to block, but at that hit rate I wouldn't even bother looking into it.

Sgt_Kickaxe

11:41 pm on Aug 19, 2021 (gmt 0)



"Lower[ing] user experience metrics" (what made you think of that anyway?)

What made me think of it is the newness of the site and its lack of backlinks. It's so new it's near impossible to find; you'd have to know where to look for new sites, like digging 50+ deep in the rankings or using some new-site reporting service.

I'm not worried about it but hadn't seen this type of thing in a while.

aristotle

1:20 am on Aug 20, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This kind of activity could affect user metrics if the hits were coming rapidly enough to slow the site down. But as a rough guess, that would probably require hundreds of hits per second, depending on the server's capabilities and the size of the requested files.

Sgt_Kickaxe

3:57 am on Aug 21, 2021 (gmt 0)



# BLOCK FILES
<FilesMatch "^(wp-config\.php|update\.php|xmlrpc\.php|wp-login\.php|license\.txt)">
Order allow,deny
Deny from all
Satisfy All
</FilesMatch>

Good advice, but always test this after adding it to your .htaccess file.

Some hosts don't simply block a user; they redirect them to a default parked page on your domain, which you don't want to be interlinking to.

aristotle

7:22 pm on Aug 21, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Actually, that "# BLOCK FILES" code I posted is out of date, because other WordPress files are currently requested more often than those particular ones. Hackers are continually changing which vulnerabilities they look for, and I haven't kept up with my sites nearly as well as I should. So if you want to use that code, check your error logs to determine which files to include.