
Block wp-login.php via IP, but what about subsequent 403 error pages?

     
6:12 pm on May 5, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 5, 2009
posts:1670
votes: 329


This has been quite a long journey for me. I've been dealing with WordPress sites and the constant hammering of wp-login.php. I'm quite clear about the solutions that use IP whitelisting via htaccess, and I can probably add the correct code, but I simply don't understand how it works on the server side.
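
For reference, something along these lines is what I have in mind (the IP is only a placeholder for my own; it's the old Apache 2.2 Order/Allow syntax, and 2.4 would use "Require ip" instead):

<Files wp-login.php>
Order Deny,Allow
Deny from all
# placeholder address, substitute the real whitelisted IP(s)
Allow from 203.0.113.45
</Files>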

From what I understand, once I've blocked out all IPs except my whitelisted ones, the bots and people hammering wp-login.php will see a 403 Forbidden error page. Do some bots give up at that point, or will most just continue to hammer away even when they get that 403 page?

So, am I correct that a bot can keep hammering away and the server will keep serving the error page? I'm wondering whether that still causes server load issues, or whether it's far less server-intensive than what I have now, which is unrestricted access to hammer wp-login.php. Is it going to be a substantial improvement? I just don't understand server load well enough to compare the two scenarios.

I'm also looking at tidying up my stats, but perhaps there is no clean way of dealing with people trying to gain access to WordPress installations. Once I've put the htaccess IP block in place, will I just see an explosion of 403s in my stats, and is that simply a way of life?
6:35 pm on May 5, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15381
votes: 727


Do some bots give up at that point, or will most just continue to hammer away even when they get that 403 page?

Most robots don't modify their behavior at all in the short term. Later on, the botrunner may take further action if a particular request results in a 200 instead of the usual 403/404. But in the short term the robot just works through its shopping list.

Anything is less work for the server than having to build an entire WordPress page. Unless, that is, your 403 page itself involves twenty separate database calls, eleven stylesheets and six images. Some people do pride themselves on serving a minimalist 403 response, such as no page at all and just the text "Get lost." I prefer to concentrate on misguided humans, who are the only ones who will actually look at an error document.
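
In htaccess terms that bare-bones response can be as little as a one-liner (the wording is obviously just an example):

ErrorDocument 403 "Get lost."

No page, no stylesheets, no follow-up requests.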

If you wanted to, you could divide your lockouts into things that could be honest mistakes (humans blundering into the wrong directory, or people who work at server farms surfing the web on their lunch break) and things that are inexcusable (anything involving wp-admin falls into this category). Serve a different response, such as 418, to the unwanted ones, and then you can use a different ErrorDocument or none at all.
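
A rough sketch of what that split might look like; the patterns here are made up, and the real ones would depend on your site:

# honest mistakes: plain 403, which picks up your friendly error document
RewriteRule ^old-directory/ - [F]
# inexcusable requests: a bare 418, with no ErrorDocument defined for it
RewriteRule wp-admin - [R=418]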

In general, the 403 page should not show up in stats at all, because it isn't a request. There are exceptions, but we can figure these out as-and-when they arise.
6:44 pm on May 5, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 5, 2009
posts:1670
votes: 329


Okay, great, thanks Lucy. At this point I'm kicking myself for being lax about this. I've seen some brute-force attacks on one of my sites. I'm guessing that whitelisting access will remove the risk of being hacked through the login, but it's not going to stop the visits. No such thing as set it and forget it. What a gaping security flaw. No wonder it's such a popular target.
6:23 am on May 7, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2001
posts:5496
votes: 3


I administer a site for a friend that does NOT use any PHP at all, thus I deny PHP requests.

If you don't use WP, then you may also write a deny for the basic 'wp' requests.

Both of these are far less time-consuming than dealing with IPs.

FWIW, you'll never stop them all, as the rats are endless due to the popularity of WP and the vulnerability of 3rd-party plugins.
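
The 'wp' deny mentioned above can be just as short. A minimal sketch, assuming the site serves nothing WordPress-related at all:

# refuse any request with a wp- element anywhere in the path
RewriteRule (^|/)wp- - [F]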
7:18 am on May 7, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15381
votes: 727


thus I deny PHP requests.

You can do this even if the site does use php, so long as the URLs themselves don't end in php. The minimalist form goes
# THE_REQUEST is the original request line, not the internally rewritten URL,
# so WordPress-style rewrites to index.php are not caught by this pair
RewriteCond %{THE_REQUEST} \.php
RewriteRule \.php - [F]
10:50 pm on May 7, 2015 (gmt 0)

Preferred Member from AU 

10+ Year Member Top Contributors Of The Month

joined:May 27, 2005
posts:442
votes: 7


You could use a new plugin available at WordPress... search for ASPS Check Referrer

In fact that plugin is also available for Drupal, Moodle and Joomla sites... available from the ArtistScope home site.
3:01 pm on May 8, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 17, 2002
posts:1187
votes: 6


Another related method is shields up / shields down.

Rename wp-login.php to something else and only rename it back when you want to log in. That way the login page is only reachable while you're actually using it, and the rest of the time anyone hitting it gets a 404.
6:13 pm on May 8, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15381
votes: 727


Speaking of which... You could even code this explicitly, like
RewriteRule (wp|admin) - [R=404]

replacing (wp|admin) with any URL element that a human visitor would never legitimately see. Add a RewriteCond looking at %{THE_REQUEST} if the files are needed for behind-the-scenes work. The flag R=404 looks funny; it just means "send back an immediate 404 response without even checking whether the file exists". This can save your server a lot of work, especially in WP or any CMS that has to go through massive database calls before deciding whether something is really a 404.
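
For that behind-the-scenes case, the extra condition might look something like this (again, the pattern is only an example):

# only refuse requests that named wp/admin in the original request line;
# internal rewrites to those files still go through
RewriteCond %{THE_REQUEST} (wp|admin)
RewriteRule (wp|admin) - [R=404]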

That's assuming you're on Apache 2.2 or later, which I certainly hope you are. Apparently the R=number-outside-the-300-range form didn't work in 2.0. Or, at least, wasn't documented.
 
