
Forum Moderators: Ocean10000 & incrediBILL & keyplyr


Should blank user agents be blocked or not?

     
2:43 pm on Apr 7, 2014 (gmt 0)

Senior Member
aristotle
joined: Aug 4, 2008
posts: 3317
votes: 253


I'm currently trying to improve the .htaccess files for my websites, and am wondering if I should remove some code that I added some time ago for blocking blank user agents. I don't remember exactly where I got it, but the code is as follows:
# BLOCK BLANK USER AGENTS
# ^-?$ matches a user agent that is either empty or just a hyphen
RewriteCond %{HTTP_USER_AGENT} ^-?$
# Return 403 Forbidden for every URL
RewriteRule ^ - [F]

But in an article about blocking user agents at [perishablepress.com...], I found the following statement:
To accommodate Facebook (and others) traffic, the empty/blank user-agent is no longer blocked, which is unfortunate because of its effectiveness at blocking bad requests.

So the question is, should I remove this code from my .htaccess files or not?
4:02 pm on Apr 7, 2014 (gmt 0)

Senior Member
wilderness
joined: Nov 11, 2001
posts: 5463
votes: 3


Each webmaster must determine what is beneficial or detrimental to their own site(s).

I deny blank UAs, as do most others (it's a simple procedure to add exceptions to the same rule).

Personally, I deny the FB bots as well as FB referrers; however, that's my preference.
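
A rough sketch of that sort of Facebook block, assuming the commonly seen Facebook agent strings (facebookexternalhit, Facebot) and a facebook.com referrer check; verify against your own logs before adopting it:

# Deny Facebook crawlers by user agent (assumed agent strings)
RewriteCond %{HTTP_USER_AGENT} (facebookexternalhit|Facebot) [NC,OR]
# Deny requests arriving with a facebook.com referrer
RewriteCond %{HTTP_REFERER} facebook\.com [NC]
RewriteRule ^ - [F]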
7:28 pm on Apr 7, 2014 (gmt 0)

Senior Member from GB
dstiles
joined: May 14, 2008
posts: 3170
votes: 8


I block blank UAs. I only see FB hits that carry a UA; possibly the rest are blocked for coming from an already-blocked IP range such as Amazon.
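
For context, blocking an entire hosting range by IP prefix looks like this; the 203.0.113. prefix below is an RFC 5737 documentation address used purely as a placeholder, not a real cloud range:

# Block a whole hosting range by IP prefix (placeholder prefix)
RewriteCond %{REMOTE_ADDR} ^203\.0\.113\.
RewriteRule ^ - [F]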
7:33 pm on Apr 7, 2014 (gmt 0)

Moderator This Forum from US
keyplyr
joined: Sept 26, 2001
posts: 10434
votes: 602


I block blank UAs, but allow several ranges through that are beneficial (FB and a few others); YMMV.

# Matches a user agent that is empty or a lone hyphen
RewriteCond %{HTTP_USER_AGENT} ^-?$
# Exceptions: skip the block for these allowed address ranges
RewriteCond %{REMOTE_ADDR} !^xxx\.xxx\.xxx\.xxx
RewriteCond %{REMOTE_ADDR} !^xxx\.xxx\.xxx\.xxx
RewriteCond %{REMOTE_ADDR} !^xxx\.xxx\.xxx\.xxx
# Forbid everything except robots.txt
RewriteRule !^robots\.txt$ - [F]

Note: replace the "xxx" with the actual ranges you wish to allow through with a blank user agent. This is site specific, and different sites will see different agents as beneficial or not.
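
For illustration only, here is the same rule with two made-up ranges filled in; 192.0.2. and 198.51.100. are RFC 5737 documentation prefixes standing in for whatever ranges you decide to allow, and note the escaped dots in the patterns:

RewriteCond %{HTTP_USER_AGENT} ^-?$
# Placeholder prefixes, not real Facebook ranges
RewriteCond %{REMOTE_ADDR} !^192\.0\.2\.
RewriteCond %{REMOTE_ADDR} !^198\.51\.100\.
RewriteRule !^robots\.txt$ - [F]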

Also, I allow all agents access to robots.txt, even ones I block for various other reasons.
7:47 pm on Apr 7, 2014 (gmt 0)

Senior Member
aristotle
joined: Aug 4, 2008
posts: 3317
votes: 253


keyplyr wrote:
I block blank UAs, but allow several ranges through that are beneficial (FB and a few others)

This seems like a good approach to me. Would it be too much to ask which ranges should be allowed (or at least for a hint), and what would be the best way to do it?

Edit: keyplyr, I apologize. For some unknown reason, I initially only saw the first line of your post and didn't see the rest of it until now.
 
