Forum Moderators: phranque


How to block botnets Hits


skhaw

3:44 am on Feb 29, 2016 (gmt 0)

10+ Year Member



I have recently been monitoring Google Analytics for our ecommerce server. We are getting several hundred hits from botnets every day. They all come from unique IPs, there are no referers, and they are all direct visits. Is there a way I can block them at the server with zero chance of blocking real customers or crawlers like Googlebot and other friendly ones? Our SEO-friendly product URLs are very long, so no one is going to type them into a browser. Maybe it is safe to implement the block based on that assumption.

I am also puzzled by the purpose behind these hits. One possible reason might be to generate click fraud on our Google remarketing display ad campaigns, trying to siphon off a few dollars a day. I appreciate any help.

not2easy

4:33 am on Feb 29, 2016 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Hello skhaw and welcome to the forums.

The problem is that, generally, botnets run on the machines of actual human users, who may not be aware they have "visited" your site. You might be seeing zombie traffic - there are several ongoing discussions about that issue. The most recent is here: [webmasterworld.com...]

I'm not trying to dissuade you from blocking any unwanted traffic, but there is no plug-and-play way to do that, no miracle cure.

skhaw

5:07 am on Feb 29, 2016 (gmt 0)

10+ Year Member



I am no programmer or experienced webmaster. But in my case, all the links these botnets are hitting are long, SEO-friendly URLs. No human would type them into the browser address bar. So I am thinking that it should be possible to block all the botnet hits to these long URLs if we block every direct visit to them (maybe set this up based on the length of the URL).

lucy24

5:24 am on Feb 29, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



it should be possible to block all botnet hits to these long urls if we block every hit that is direct visit to these long urls

The server can't tell the difference between a type-in and a bookmark-- or between a genuinely referer-less request and a human who has chosen not to send a page referer. The infuriating thing about robots is that they are most easily identified by looking at what they do after the initial page request. And then it's too late; you can't retroactively take away a page they've already seen.

Study the headers-- including cookies-- for your botnet requests. You'll be surprised at what crops up.
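One way to start that kind of study is to pull the referer-less hits to long URLs out of the access log and see which IPs they cluster on. Here is a minimal Python sketch, assuming an Apache/nginx "combined" log format; the log path and the 60-character URL threshold are made-up examples, not anything from this thread:

```python
import re
from collections import Counter

# Regex for the common "combined" access log format:
# IP - - [date] "METHOD /path HTTP/1.x" status bytes "referer" "user-agent"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def suspicious_hits(log_path, min_url_len=60):
    """Count referer-less hits to long URLs, grouped by client IP."""
    hits = Counter()
    with open(log_path) as f:
        for line in f:
            m = LOG_LINE.match(line)
            if not m:
                continue  # skip lines that are not in combined format
            no_referer = m.group("referer") in ("", "-")
            long_url = len(m.group("path")) >= min_url_len
            if no_referer and long_url:
                hits[m.group("ip")] += 1
    return hits

# Example usage (path is hypothetical):
# for ip, n in suspicious_hits("/var/log/apache2/access.log").most_common(20):
#     print(ip, n)
```

This only surfaces candidates for inspection; as the posts above note, a referer-less request is not proof of a bot, so the output is a list of IPs to look at, not a blocklist.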

long SEO friendly urls. No human would type them
It is not every day you find all of these words in the same paragraph ;)

tangor

5:44 am on Feb 29, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There is no way to get the chance of blocking humans down to zero. You can make a best effort to avoid it.

Look for specific traffic patterns, user agents, headers, etc., to see if there is an actionable response. Unless you can identify something specific to a bot or botnet, you'll just have to take it as a cost of doing business.

graeme_p

5:57 am on Feb 29, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You could try Cloudflare.

If the bots are following links then a honeypot link may allow you to identify them quickly enough to block them (any IP that follows the link gets blocked for a few hours).
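The honeypot idea above could be sketched roughly like this in Python: a hidden trap URL that real users never see (and that good crawlers skip because it is disallowed in robots.txt), with any IP that requests it blocked for a limited time. The trap path and block duration here are invented for illustration:

```python
import time

class HoneypotBlocklist:
    """Track IPs that requested a hidden trap URL; block them temporarily."""

    def __init__(self, block_seconds=4 * 3600, trap_path="/do-not-follow-this-link"):
        self.block_seconds = block_seconds   # how long a trapped IP stays blocked
        self.trap_path = trap_path           # hidden link; disallow it in robots.txt
        self._blocked_until = {}             # ip -> unix time the block expires

    def record_request(self, ip, path, now=None):
        """Call for every request; returns True if the IP should be blocked."""
        now = time.time() if now is None else now
        if path == self.trap_path:
            self._blocked_until[ip] = now + self.block_seconds
        expiry = self._blocked_until.get(ip)
        if expiry is None:
            return False
        if now >= expiry:
            del self._blocked_until[ip]      # block has lapsed; forgive the IP
            return False
        return True
```

In practice you would hook this (or an equivalent) into whatever serves the requests; the time-limited block matters because botnet IPs are often home users on dynamic addresses, so a permanent block would eventually hit innocent visitors.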

Andy Langton

8:12 am on Feb 29, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



They all direct visits


Be cautious about regarding "direct" traffic as suspicious in and of itself. It can include, for instance, referrals from an HTTPS site to an HTTP one and users running privacy software, and I also see an increasing number of search hits labelled "direct".

skhaw

8:16 am on Feb 29, 2016 (gmt 0)

10+ Year Member



I can take it as a cost of doing business if it is just 50-100 hits. But what if it gets worse and starts to slow down the server? I am afraid a DoS attack or something more serious is in the works. These days any of your competitors can spend a few bitcoins to attack your server, or empty your AdWords budget to sideline your ads. I would like to have a plan in case that happens. If we had to counter that kind of attack, I would happily block 10% of real humans if I could keep the site up for the other 90%.

Andy Langton

8:23 am on Feb 29, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I would second the suggestion to try something like Cloudflare - other than fiddling with DNS, it's pretty easy to test it out and see whether they manage to detect the bots.

skhaw

4:56 pm on Feb 29, 2016 (gmt 0)

10+ Year Member



I will definitely explore Cloudflare. Thank you for the input.