
Defence against the Dark Arts

     
10:46 pm on Dec 24, 2018 (gmt 0)

New User

joined:Dec 24, 2018
posts:12
votes: 10


With Google's announcement that it will no longer count bad links against a site, I hope this will help prevent further negative SEO attacks on your site. Please use it as a basis for discussion and collaborate on improving the code. The code was put together from snippets people shared online, so it only seems right to share it back.

----- Requirements ------

Cloudflare Account with Cloudflare Workers Enabled
WordPress
Apache Server
Plesk Control Panel

Part 1 ---- Protecting the origin server from attacks ----

Cloudflare does a pretty good job of standing in front of your server, but what if someone discovers the origin server IP? They can then bypass Cloudflare and attack the origin server directly, or carry out "man in the middle" attacks.

To remedy this we are going to use some IP checking and validate the user via a cookie.

1. Install mod_cloudflare on Plesk; this restores the real visitor IP addresses forwarded by Cloudflare.

2. Set up Cloudflare Workers. This will cost $5 per month, but it is the only expense in this system.

3. Add the following code to Cloudflare Workers (this is the full code; it not only protects the origin server but also image files, which I'll explain later).



addEventListener('fetch', event => {
  event.respondWith(fetchAndApply(event.request))
})

async function fetchAndApply(request) {

  /* Set some variables, some of which will be used later on */

  let url = request.url
  let referer = request.headers.get('Referer')
  let useragent = request.headers.get('User-Agent')
  /* This base64 value must match the PROXY_CHECK value set in .htaccess below */
  let javascript = "window.location.hostname"
  let verifyvalue = btoa(javascript)
  let response = await fetch(request)


  /* Whitelist some bots we want to allow; feel free to add to the list */

  if (useragent != null) {
    let ua = useragent.toLowerCase()
    if (ua.includes('googlebot') || ua.includes('applebot') || ua.includes('bingbot') || ua.includes('bing')) {
      return fetch(request)
    }
  }


  /* Let through all users who already have the "allow" cookie we set below */

  let cookies = request.headers.get('Cookie') || ''
  if (cookies.includes("allow=")) {
    return fetch(request)
  } else {

    /* Not a whitelisted bot and probably a new visitor: send them a cookie
       that allows access to our files. Requests for /wp-content/ (images etc.)
       fall through to the referer check below instead. */

    response = new Response(response.body, response)
    if (url.indexOf('/wp-content/') === -1) {
      response.headers.set("Set-Cookie", "allow=" + verifyvalue)
      return response
    }
  }


  /* Getting suspicious now: this user doesn't accept cookies.
     Look for a referer and, if there is none, block the request. */

  if (!referer) {
    let blocked = new Response('Please enable Cookies in your browser....', { status: 403 })
    blocked.headers.set("Cache-Control", "max-age=0")
    return blocked
  } else {
    return fetch(request)
  }
}





Now in .htaccess add the following to block direct access to the origin server



# START block ips not going through cloudflare

RewriteCond %{HTTP:CF-Connecting-IP} ^$
RewriteRule ^ - [F,L]

RewriteCond %{HTTP:X-FORWARDED-FOR} ^$
RewriteRule ^ - [F,L]

<If "%{HTTP:X-FORWARDED-FOR} != %{HTTP:CF-Connecting-IP}">
RewriteRule ^ - [F,L]
</If>


# Compare Cookie sent from Cloudflare Workers (this may need further testing)


RewriteCond %{HTTP_COOKIE} .*allow=(.*?)(;|$)
RewriteRule .* - [E=COOKIE_DOMAIN:%1]


# Set this to the value of your cookie, i.e. the base64 (btoa) of the
# "javascript" string used in the Worker above
SetEnv PROXY_CHECK "d2luZG93LmxvY2F0aW9uLmhvc3RuYW1lID09ICd3d3cuNHF1b3RlczRtZS5jby51ayc="


<If "%{ENV:COOKIE_DOMAIN} != %{ENV:PROXY_CHECK}">
RewriteCond %{REQUEST_URI} !^/not-authorised
RewriteRule ^(.*)$ /not-authorised/ [R=302,L]
</If>


# END block ips not going through cloudflare




Finally, set up a folder called /not-authorised/ with an index.php that could look something like this (feel free to add your own code). This is where we will send suspicious traffic.


<?php
header('Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0');
header('Expires: Sat, 26 Jul 1997 05:00:00 GMT');
?>
<!DOCTYPE html>
<html>
<head>
<meta name="robots" content="noindex" />
</head>
<body style="background:black;color:#fff;">
<center>
<iframe width="560" height="315" src="https://www.youtube.com/embed/J3UjJ4wKLkg" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<br>
<br>
<p>Please enable Javascript and Cookies in your Browser</p>
</center>
</body>
</html>



----------- End Part 1 -----------------

Next Post -> Detecting Blocking and Reporting Image Theft
7:02 pm on Dec 25, 2018 (gmt 0)

New User

joined:Dec 24, 2018
posts:12
votes: 10


---- Part 2 Detecting Blocking and Reporting Image Theft ---

In part 1 we added some code in Cloudflare Workers that prevents direct access to images without a Cookie

We will now take this a step further by seeing who is serving your images via hotlinking and then reporting them


1. Set up a blank file called imagehotlink.txt below (outside) your public HTML directory, so it is not web-accessible.

2. Download and install Adaptive Images for WordPress (not only will this optimise your images, but with a little modification we can track who is hotlinking).

3. Modify the following file in the plugin, adaptive-images-script.php, with the code below:



$mysite = true;
if (isset($_SERVER['HTTP_REFERER']) && !empty($_SERVER['HTTP_REFERER'])) {
    if (strpos($_SERVER['HTTP_REFERER'], $_SERVER['HTTP_HOST']) === false) {

        /* Referer is not from the same domain, so this is a hotlink */

        $refervalue = $_SERVER['HTTP_REFERER'];
        $requesturl = $_SERVER['REQUEST_URI'];
        $urldata    = parse_url($refervalue);
        $urldata    = isset($urldata['host']) ? $urldata['host'] : '';

        if (strpos($refervalue, 'mysite.com') !== false) {
            $mysite = true;
        } else {
            $mysite = false;
        }

        /* Append "referer,hotlinked-image-URL" to the log file */
        $my_file = '/path-to-your-file/imagehotlink.txt';
        $handle  = fopen($my_file, 'a') or die('Cannot open file: ' . $my_file);
        $data    = "\n" . $refervalue . "," . "https://www.mysite.com" . $requesturl;
        fwrite($handle, $data);
    }
}

/*
Optional: uncomment this block to also log requests where the referer is blocked or empty.

if (empty($_SERVER['HTTP_REFERER'])) {

    // Referer blocked or empty
    $refervalue = "blocked";
    $requesturl = $_SERVER['REQUEST_URI'];

    $my_file = '/path-to-your-file/imagehotlink.txt';
    $handle  = fopen($my_file, 'a') or die('Cannot open file: ' . $my_file);
    $data    = "\n" . $refervalue . "," . "https://www.mysite.com" . $requesturl;
    fwrite($handle, $data);
}
*/

if ($mysite !== true) {
    header("Cache-Control: no-cache, must-revalidate");
    header("Pragma: no-cache");
    header("Expires: Sat, 26 Jul 1997 05:00:00 GMT");
    die("Hotlinking not permitted");
}



You will now have a log of all sites attempting to hotlink your images, and although you may not be able to make a claim for the hotlinking itself, you can file a DMCA complaint for any copyrighted text describing the image, such as alt text and title text.

Filing a DMCA complaint with Google is relatively straightforward; just search for Google's DMCA Dashboard.
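
If the log starts to grow, a short script makes it easier to spot the repeat offenders before you file anything. This is only a minimal sketch that assumes the one-line "referer,image-URL" format written by the modified script above; the file path is the same placeholder used there.


<?php
// Minimal sketch: summarise imagehotlink.txt by referring host.
// Assumes one "referer,image-URL" line per entry, as written by the
// modified adaptive-images-script.php above; the path is a placeholder.
$my_file = '/path-to-your-file/imagehotlink.txt';
$counts  = array();

foreach (file($my_file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    list($referer) = explode(',', $line, 2);
    $host = parse_url(trim($referer), PHP_URL_HOST);
    if ($host) {
        $counts[$host] = isset($counts[$host]) ? $counts[$host] + 1 : 1;
    }
}

arsort($counts);
foreach ($counts as $host => $hits) {
    echo $host . ' : ' . $hits . " hotlinked requests\n";
}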

--- End Part 2 ---

Next Post -> Dynamic Honeypot Widget
12:47 am on Dec 26, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15381
votes: 727


Apache Server
Which is to say, Apache >= 2.4 Server
1:03 pm on Dec 26, 2018 (gmt 0)

New User

joined:Dec 24, 2018
posts:12
votes: 10


--- Part 3 Dynamic Honeypot Widget ---

This plugin uses Cloudflare and a fictitious folder on your server to create a dynamic honeypot widget that can be added to the footer or sidebar.

1. Set up a blank file called scraperlog.txt below (outside) your public HTML directory, so it is not web-accessible.

2. Create a folder in plugins called "simple-honeypot-using-cloudflare"

3. In the folder create a file called "simple-honeypot-using-cloudflare.php"

4. Add the following code


<?php
/*
Plugin Name: Simple HoneyPot Using Cloudflare
Version: 0.2
Author: V4Vendetta
Description: Simple honeypot script.
License: GPL3
*/

class jpen_Honeypot_Widget extends WP_Widget {
/* Set Up Widget Name and Description */

public function __construct() {
$widget_options = array( 'classname' => '', 'description' => 'This is a Honeypot Widget' );
parent::__construct( 'honeypot_widget', 'Honeypot Widget', $widget_options );
}

public function widget( $args, $instance ) {

$root = $_SERVER["DOCUMENT_ROOT"];

/* Create the widget output. */

$title = apply_filters( 'widget_title', $instance[ 'title' ] );

/* We are going to make a dynamic honeypot from categories in wordpress */
$categories = get_categories( array(
'orderby' => 'name'
) );
shuffle($categories);
$cat_count = 0;
$cat_col_one = [];
$cat_col_two = [];
foreach( $categories as $category ) {

$cat_count ++;
$category_link = sprintf(
'<li><a href="%1$s" alt="%2$s">%3$s</a></li>',
esc_url( get_category_link( $category->term_id ) ),
esc_attr( sprintf( __( 'View all posts in %s', 'textdomain' ), $category->name ) ),
esc_html( $category->name )
);

if ($cat_count % 2 != 0 ) {
$cat_col_one[] = $category_link;
}
}

foreach( $cat_col_one as $cat_one ) {

echo $args['before_widget'];

/* I was going to allow an array of folders but decided against */

$titlevalue = $title . "#";
$titlevalue = str_replace(" ","",$titlevalue);
$titlevalue = str_replace(",#","",$titlevalue);
$titlevalue = str_replace("#","",$titlevalue);

$titlevalue = explode(',', $titlevalue);
shuffle($titlevalue);
$titlevalue = reset($titlevalue);


$cat_one = str_replace("category",$titlevalue,$cat_one);
echo "<div class=\"$titlevalue\"><ul>";
echo $cat_one;
echo "</ul></div>";
echo $args['after_widget'];
break;

}
}



public function form( $instance ) {
$title = ! empty( $instance['title'] ) ? $instance['title'] : ''; ?>
<p>
<label for="<?php echo $this->get_field_id( 'title' ); ?>">Title:</label>
<input type="text" id="<?php echo $this->get_field_id( 'title' ); ?>" name="<?php echo $this->get_field_name( 'title' ); ?>" value="<?php echo esc_attr( $title ); ?>" />
</p><?php
}


/* Apply settings to the widget instance. */
public function update( $new_instance, $old_instance ) {
$instance = $old_instance;
$instance[ 'title' ] = strip_tags( $new_instance[ 'title' ] );
return $instance;
}

}
// Register the widget.
function jpen_register_honeypot_widget() {
register_widget( 'jpen_Honeypot_Widget' );
}
add_action( 'widgets_init', 'jpen_register_honeypot_widget' );


/* hide honeypot with css */

function hook_css() {
global $wpdb;
$results = get_option('widget_honeypot_widget');
$value = array_column($results, 'title');
$cssvalue = reset($value);
$cssvalue = $cssvalue . "#";
$cssvalue = str_replace(" ","",$cssvalue);
$cssvalue = str_replace(",#","",$cssvalue);
$cssvalue = str_replace("#","",$cssvalue);
$cssvalue = str_replace(",",", .",$cssvalue);

?>
<style>
.<?php echo $cssvalue; ?> { display:none;}
</style>
<?php
}
add_action('wp_head', 'hook_css');

/* Block the honeypot in robots.txt for whitelisted bots */
function disallow_honeypots_index( $output, $public ) {

if ( '1' === $public ) {

global $wpdb;
$results = get_option('widget_honeypot_widget');
$value = array_column($results, 'title');
$revalue = reset($value);
$revalue = $revalue . "#";
$revalue = str_replace(" ","",$revalue);
$revalue = str_replace(",#","",$revalue);
$revalue = str_replace("#","",$revalue);
$revalue = explode(',', $revalue);
shuffle($revalue);
foreach( $revalue as $link){
$subfolder = $link;
$searchbot = "false";
$user_agent = strtolower($_SERVER['HTTP_USER_AGENT']);
/* Please add to list of whitelisted bots */
$bot_identifiers = array('googlebot','applebot','bingbot','bing');

foreach ($bot_identifiers as $identifier) {
if (strpos($user_agent, $identifier) !== FALSE) {
$searchbot = "true";
}
}



if ( $searchbot == "true"){

$output .= "Disallow: /$link/\n";
}
}
}
$output .= "Disallow: /your-normal-disallow-rules/\n";
return $output;
}
add_filter( 'robots_txt', 'disallow_honeypots_index', 0, 2 );



/* log the ip of the user and serve a redirect */

add_action( 'init', 'global_redirect_new_rule');
function global_redirect_new_rule() {
global $wpdb;
$results = get_option('widget_honeypot_widget');
$value = array_column($results, 'title');
$revalue = reset($value);
$revalue = $revalue . "#";
$revalue = str_replace(" ","",$revalue);
$revalue = str_replace(",#","",$revalue);
$revalue = str_replace("#","",$revalue);
$revalue = explode(',', $revalue);
shuffle($revalue);
foreach( $revalue as $link){
$subfolder = $link;



$request = $_SERVER['REQUEST_URI'];
$redirect = str_replace("$subfolder","category",$request);
$path = $_SERVER['SERVER_NAME'];
if (strpos($request, "$subfolder") !== false) {


$ip = $_SERVER["HTTP_CF_CONNECTING_IP"];
$my_file = '/path-to-your-file/scraperlog.txt';
$handle = fopen($my_file, 'a') or die('Cannot open file');
$data = "\n"."$ip";
fwrite($handle, $data);

header ("location: https://$path$redirect");
die();
}
}
}
?>



5. Enable the plugin in WordPress, then under Widgets drag the Honeypot Widget into the footer or sidebar. Important: click on the widget and give it a title that is a fictional folder on your server; this title is used to set up the CSS, the robots.txt entry and the honeypot link.

6. Go to your Cloudflare account and under Page Rules set up a rule as follows, where "widget-title" is the folder name you saved in the widget. If the URL matches: www.yoursite.com/widget-title/* then the settings are: Web Application Firewall: On, and Security Level: I'm Under Attack.

7. Go to Cloudflare Firewall and under Rate Limiting set up the following rule. Rule name: Honeypot. If traffic matching the URL (http and https) www.yoursite.com/widget-title/* from the same IP address exceeds 1 request per 1 minute, then challenge matching traffic from that visitor.
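
Once the honeypot has been live for a while, scraperlog.txt is just a growing list of raw IPs. A short script along these lines turns it into a de-duplicated hit count you can feed into Cloudflare IP Access Rules or your .htaccess; this is only a minimal sketch that assumes the one-IP-per-line format written by the plugin above, and the file path is the same placeholder used there.


<?php
// Minimal sketch: de-duplicate scraperlog.txt and count hits per IP.
// Assumes one IP per line, as written by the honeypot plugin above;
// the path is a placeholder, adjust it to your setup.
$my_file = '/path-to-your-file/scraperlog.txt';
$ips     = file($my_file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$counts  = array_count_values(array_map('trim', $ips));

arsort($counts);
foreach ($counts as $ip => $hits) {
    if (filter_var($ip, FILTER_VALIDATE_IP)) {
        echo $ip . ' : ' . $hits . " honeypot hits\n";
    }
}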

--- END Part 3 ----

Next Post -> Protecting Comments from spambots without Captcha and Protect Login from brute force attacks
3:45 pm on Dec 26, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3511
votes: 320


Well I don't use wordpress or cloudflare or similar things, so can't follow most of this. But I do have a simple-minded question:

By my understanding, a honeypot is designed to lure and attract cyber-attackers. So why would you want to attract cyber-attackers to your site?

Also:
If Traffic Matching the URL http & https www.yoursite.com/widget-title/* , from the same IP address exceeds 1 requests per 1 minute Then Challenge matching traffic from that visitor.

Another simple-minded question: "1 requests per minute" is nothing -- What if it gets 10,000 requests per second? Can your widget keep up with that?
7:27 pm on Dec 26, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15381
votes: 727


So why would you want to attract cyber-attackers to your site?
Honeypots are often intended to attract visitors to specific pages, not to the site as a whole. For example the front page of my test site, which is comprehensively denied in robots.txt, has a bit that looks something like this:
a.honey {display: none;}
...
<a class = "honey" href = "/honeypage.html">blahblah</a>
Humans won't see the link--and neither will law-abiding robots, since they haven't seen the main page in the first place. Knowing that nobody but a malign robot will ever request the page, you know up-front that any entity making the request is up to no good. What you do with this knowledge is, as ever, up to you.
8:45 pm on Dec 26, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3511
votes: 320


Well Lucy -- I still don't understand it. Your example looks like a way to get a bad bot to reveal itself as such. But didn't the bot show up on its own, uninvited? It wasn't lured to the site. I thought the purpose of a honeypot is to lure in bad bots so that they can be studied.
8:55 pm on Dec 26, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Oct 5, 2012
posts:903
votes: 166


User-agent: Pooh Bears
Disallow: /
9:13 pm on Dec 26, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15381
votes: 727


Aristotle, you really don't need to lure bad bots to your site. If You Build It, They Will Come--at least if it's a dot com or similar major ARIN tld. The purpose of a honeypot is then to flag the ones whose intention is something above (I should probably say “below”) and beyond the usual act of zipping past and picking up your front page, which robots do all the time.

:: detour to archived logs ::

Wow. Within the present calendar year, blocked requests for / (root) outnumber non-blocked requests by 8 to 1. (This is not a front-driven site, so nobody ever visits the root except search engines and nosy humans.) Even on a site with completely different traffic patterns, it's almost 2:1 in favor of blocked requests. I honestly had not realized the disparity would be that enormous.

But I digress.
10:32 pm on Dec 26, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3511
votes: 320


Apparently the word has more than one usage. Here is what I had in mind:
A honeypot is a computer or computer system intended to mimic likely targets of cyberattacks. It can be used to detect attacks or deflect them from a legitimate target. It can also be used to gain information about how cybercriminals operate.

You may not have heard of them before, but honeypots have been around for decades. The principle behind them is simple: Don’t go looking for attackers. Prepare something that would attract their interest — the honeypot — and then wait for the attackers to show up.

quoted from [us.norton.com ]
11:14 pm on Dec 26, 2018 (gmt 0)

New User

joined:Dec 24, 2018
posts:12
votes: 10


--- PART 4 Protecting Comments from spambots without Captcha and Protect Login from brute force attacks ---

1. Create a folder in plugins called "javascript-form-protection"

2. In the folder create a file called "javascript-form-protection.php"

3. Add the following code


<?php
/*
Plugin Name: Javascript Form Protection
Version: 0.9
Author: V4Vendetta
Description: Simple script to stop automated bots filling out comments, now with password protection and encryption.
License: GPL3
*/

/* The login protection below relies on $_SESSION, so start a session early */
add_action( 'init', 'jfp_start_session' );
function jfp_start_session() {
if ( ! session_id() ) {
session_start();
}
}

/* Simple obfuscation helper used by both the comment form and the login form */
function encrypt($string, $key)
{
$result = '';
for($i=0; $i<strlen($string); $i++) {
$char = substr($string, $i, 1);
$keychar = substr($key, ($i % strlen($key))-1, 1);
$char = chr(ord($char)+ord($keychar));
$result.=$char;
}
return base64_encode($result);
}

function after_comment_form_submit_button( $submit_button, $args ) {

$password = get_option('formpassword');
if (empty($password)) {
$password = $_SERVER['HTTP_HOST'];
}
$data = get_comment_id_fields();
$whatIWant = substr($data, strpos($data, "value") + strlen("value"), 12);
$whatIWant = (preg_replace('/[^0-9]/', '', $whatIWant));
$stopbots = encrypt($whatIWant, "$password");
$submit = sprintf(
$args['submit_button'],
esc_attr( $args['name_submit'] ),
esc_attr( $args['id_submit'] ),
esc_attr( $args['class_submit'] ),
esc_attr( $args['label_submit'] )
);

$comment_id = get_comment_ID();
$submit_button = "<input id=\"verify\" name=\"verify\" type=\"hidden\" value=\"\" />$submit";
$after_submit = "
<script>
function protectForm() {
oFormObject = document.forms['commentform'];
oFormObject.elements[\"verify\"].value = '$stopbots';
}
window.onload = protectForm;
</script>
<noscript>THIS FORM REQUIRES JAVASCRIPT ENABLED</noscript>";
return $submit_button . $after_submit;
};


add_filter( 'comment_form_submit_button', 'after_comment_form_submit_button', 10, 2 );








function custom_validate_verify() {

$password = get_option('formpassword');
if (empty($password)) {
$password = $_SERVER['HTTP_HOST'];
}
$formid = $_POST['verify'];
$verifyid = encrypt($_POST['comment_post_ID'], "$password");

if ( $formid != $verifyid ) // do your own extra validation here if you want (I am not a regex expert)
wp_die( __( "Error: Your comment could not be verified. Are you sure you're not a robot?" ) );
}
add_action('pre_comment_on_post', 'custom_validate_verify');



add_action('admin_menu', 'add_global_custom_options');

function add_global_custom_options()
{
add_options_page('Javascript Comment Form Protection', 'Comment Form Protection Password', 'manage_options', 'functions','global_custom_options');
}

function global_custom_options()
{
?>
<div class="wrap">
<h2>Protection Password</h2>
<form method="post" action="options.php">
<?php wp_nonce_field('update-options') ?>
<p><strong>Enter A Password To Protect Your Form</strong><br />
<input type="text" name="formpassword" size="45" value="<?php echo get_option('formpassword'); ?>" />
</p>
<p><input type="submit" name="Submit" value="Store Password" style=" background-color: #4CAF50; border: none; color: white; padding: 15px 32px; text-align: center; text-decoration: none; display: inline-block; font-size: 16px;"/></p>
<input type="hidden" name="action" value="update" />
<input type="hidden" name="page_options" value="formpassword" />
</form>
</div>
<?php
}




add_action('login_form','verify_login_field');

function verify_login_field(){
$number = isset($_SESSION['number']) ? $_SESSION['number'] : "007";
$number++;
$_SESSION['number'] = $number;

$password = get_option('formpassword');
if (empty($password)) {
$password = $_SERVER['HTTP_HOST'];
}
$stopbots = encrypt($number, "$password");

?>
<input id="verify" name="verify" type="hidden" value="" />
<script>
function protectForm() {
oFormObject = document.forms['loginform'];
oFormObject.elements["verify"].value = "<?php echo $stopbots; ?>";
}
window.onload = protectForm;
</script>
<noscript>THIS FORM REQUIRES JAVASCRIPT ENABLED</noscript>
<?php
}



add_filter( 'authenticate', 'verify_authenticate', 10, 3 );
function verify_authenticate( $user, $username, $password ){

$number = isset($_SESSION['number']) ? $_SESSION['number'] : '';

$password = get_option('formpassword');
if (empty($password)) {
$password = $_SERVER['HTTP_HOST'];
}
$stopbots = encrypt($number, "$password");

$loginid = $_POST['verify'];
if( $loginid != $stopbots ){


//User note found, or no value entered or doesn't match stored value - don't proceed.
remove_action('authenticate', 'wp_authenticate_username_password', 20);
remove_action('authenticate', 'wp_authenticate_email_password', 20);

//Create an error to return to user
return new WP_Error( 'denied', __("<strong>ERROR</strong>: Sorry you can't come in.") );
}

//Make sure you return null
return null;
}





5. Enable the plugin, then find it under Settings in WordPress and set a password.

6. Additionally, under Cloudflare you can rate limit your login URL.
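
To see what the hidden-field check is actually doing, here is a minimal standalone sketch of the round trip, assuming the encrypt() helper copied from the plugin above; the post ID and password are made-up example values, and the "bot" case stands for a client that never ran the JavaScript.


<?php
// Minimal sketch of the round trip performed by the plugin above.
// encrypt() is copied from javascript-form-protection.php; the post ID
// and password are made-up example values.
function encrypt($string, $key)
{
    $result = '';
    for ($i = 0; $i < strlen($string); $i++) {
        $char    = substr($string, $i, 1);
        $keychar = substr($key, ($i % strlen($key)) - 1, 1);
        $char    = chr(ord($char) + ord($keychar));
        $result .= $char;
    }
    return base64_encode($result);
}

$password = 'example-password'; // what you save on the settings page
$post_id  = '123';              // the comment_post_ID printed into the form

// Value the page-load JavaScript writes into the hidden "verify" field
$hidden_value = encrypt($post_id, $password);

// A real browser posts the hidden value back; a bot that never ran the JS posts nothing
$human_submission = $hidden_value;
$bot_submission   = '';

echo (encrypt($post_id, $password) === $human_submission) ? "human: comment accepted\n" : "human: comment rejected\n";
echo (encrypt($post_id, $password) === $bot_submission)   ? "bot: comment accepted\n"   : "bot: comment rejected\n";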

--- END PART 4 ---

Next Post -> Prevent Proxy Servers accessing your site and other sites framing content
5:33 pm on Dec 27, 2018 (gmt 0)

New User

joined:Dec 24, 2018
posts:12
votes: 10


--- PART 5 Prevent Proxy Servers accessing your site and other sites framing content ---

1. To prevent framing of your website, go to your Plesk Control Panel.

2. Add an additional header as follows:


X-Frame-Options: SAMEORIGIN
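
If you prefer not to use the Plesk UI, the same header can also be sent from WordPress itself. This is just a sketch using the standard send_headers hook; dropping it into your theme's functions.php is an assumption, use wherever you keep such snippets.


<?php
// Sketch: send the same X-Frame-Options header from WordPress instead of
// the Plesk control panel, e.g. from your theme's functions.php.
add_action( 'send_headers', function () {
    header( 'X-Frame-Options: SAMEORIGIN' );
} );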


3. To prevent proxy servers we will use the cookie we set in Cloudflare Workers in Part 1; here's a reminder:


let javascript = "window.location.hostname"
let verifyvalue = btoa(javascript)


4. We will then decode the cookie and use it to determine whether the window the user is viewing is your site or a proxy.

5. If it is a proxy we will send them to our /not-authorised/ page.

6. In your WordPress theme add a new file called "session-start.php" with the following code:


<?php
session_start();
$var = (isset($_GET["status"]) && $_GET["status"] === 'no') ? 'no' : 'yes';
$_SESSION['proxy'] = $var;
?>


7. In your theme's header.php file we will wrap some code around the existing markup:



<?php
/* Place this at the very top of header.php */
session_start();
ob_start(); /* buffer output so the wp_redirect() further down can still send a redirect header */
?>
<!doctype html>
<head>
<?php /* Create a whitelist of allowed bots, feel free to add to it */ ?>
<?php if (isset($_SERVER['HTTP_USER_AGENT']) && preg_match('/applebot|bingbot|facebookexternalhit|googlebot/i', $_SERVER['HTTP_USER_AGENT'])) { ?>

<!-- paste your normal header.php code here -->

<?php } else { ?>
<script>
/* Read the "allow" cookie set by the Worker in Part 1. It holds
   btoa("window.location.hostname"); evaluating the decoded string gives
   the hostname the visitor is actually viewing, which will differ on a proxy. */
var match = document.cookie.match(/(?:^|;\s*)allow=([^;]+)/);
var test = match ? eval(atob(match[1])) : '';
function setSession(){
if (test == 'www.yoursite.com') {
xhttp = new XMLHttpRequest();
xhttp.open('GET', '/wp-content/themes/YOURTHEME/session-start.php?status=no', true);
xhttp.send();
} else {
xhttp = new XMLHttpRequest();
xhttp.open('GET', '/wp-content/themes/YOURTHEME/session-start.php?status=yes', true);
xhttp.send();
}
}
window.onload = setSession;
</script>
<?php
$proxysession = isset($_SESSION['proxy']) ? $_SESSION['proxy'] : '';
if ($proxysession == 'no') { ?>

<!-- repeat: paste your normal header.php code here -->

<?php } else {
$url = 'https://www.yoursite.com/not-authorised/?cache=0';
wp_redirect($url);
exit;
}
} ?>


8. Additionally, in .htaccess you can set the PHP session cookie lifetime so the session expires when the browser closes:

<IfModule mod_php5.c>
#Session timeout
php_value session.cookie_lifetime 0
php_value session.gc_maxlifetime 0
</IfModule>


--- END ---

If this code is of help to you please consider making a donation to Charity
If this code disrupts your negative activity please remember "It's just Business" :)
7:52 pm on Dec 27, 2018 (gmt 0)

New User

joined:Dec 24, 2018
posts:12
votes: 10


@aristotle
Another simple-minded question: "1 requests per minute" is nothing -- What if it gets 10,000 requests per second? Can your widget keep up with that?


Please note this is a firewall rule so yep it will block 10,000 requests and they will never actually hit your site at all. Your other question was comprehensively answered by Lucy.
10:36 pm on Dec 27, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3511
votes: 320


Please note this is a firewall rule so yep it will block 10,000 requests and they will never actually hit your site at all. Your other question was comprehensively answered by Lucy.

Well you called it a "honeypot", but didn't explain that your usage differs from the original meaning of that word. That's what led to the confusion.

Also is that 10,000 requests from the same IP? If so, it would provide almost no protection against a real DDOS attack since those requests come from a botnet distributed over a multitude of IPs. Actually it doesn't appear to be of enough use to be worth the trouble of setting it up.
10:58 pm on Dec 27, 2018 (gmt 0)

New User

joined:Dec 24, 2018
posts:12
votes: 10


@aristotle haters gonna hate.... You don't seem to understand "Firewall" as well as "Honeypot" - I am done with u
12:48 am on Dec 28, 2018 (gmt 0)

Junior Member

5+ Year Member

joined:Oct 28, 2011
posts: 48
votes: 2


Hey @V4Vendetta,

Thank you for sharing this.

I haven't had a chance to implement this yet, or dig into it fully - but, I am always looking for any way I can to protect my site from DDOS attacks, slow-speed attacks, bounce-rate attacks, and attackers of any kind.

I've been relying on fail2ban and some WordPress hacks so far, but have held out on implementing Cloudflare for too long, and I think it is time to put it in place.

Regardless of what Google announces, I still believe that negative-SEO is possible.

I don't have any proof of this, or data to back my assertion - so this just my intuition and gut feeling. From my understanding of how machine-learning and artificial intelligence works, AI algorithms are very much a 'black-box'. Google may have designed the algo to not allow negative-SEO, but at some point the AI takes over and goes in its own direction, regardless of how it was designed.

Even if bad links or novel new attack methods don't bring down your website and affect your rankings --- a good-old fashioned DDOS still can do both...

cr1t1calh1t
1:05 am on Dec 28, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15381
votes: 727


a real DDOS attack since those requests come from a botnet distributed over a multitude of IPs
Matter of fact, a DDOS attack may not come from anywhere. That is, if DOS is their sole motive, and you’re the target*, they'll probably be sending an array of fake IPs, and you'll never know where they “really” originated. But unless they are extra intelligent--mercifully, most robots aren't--all those requests will have identical headers, so you'll be able to identify something.

The good news is that DDOS attacks really aren't very common unless you've got a big high-profile site--or you live on a big, high-profile server--because there has to be something in it for them. As robotic motives go, shutting down your server is pretty far down the list.


* If someone else is the target, the distribution would go in the other direction: bogus requests sent out to a variety of sites, all with the same fake originating IP.
1:01 pm on Dec 28, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3511
votes: 320


I am done with u

LOL
5:19 pm on Dec 28, 2018 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:4122
votes: 261


How are whitelisted robots being verified? It may be shown somewhere here that is not obvious to non-php people so I'm just curious. I know I see fake Googlebots frequently as well as other bots claiming to be what they aren't.
6:50 pm on Dec 28, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15381
votes: 727


I know I see fake Googlebots frequently as well as other bots claiming to be what they aren't.
That's worthy of a thread on its own: robots that are habitually spoofed. Mercifully the most popular is Googlebot, which has a narrowly delimited crawl range* so you can easily slam the door on the fakers. I also get a lot of fake BaiduSpiders (don't remember the exact capitalization)--which is pretty funny, since I block the real thing so they don't gain anything by using a fake ID. And then there's the robot using the name of a bot operated by an established WebmasterWorld member, only the one visiting my site isn't his, and I hope the real one never comes calling because I've found it simpler to deny it by name.

I'm surprised more malign robots don't use the names of the best-known distributed robots, since those can't be denied on wrong-IP alone. (Wrong headers, maybe.)


* Remember a few years ago when G terrified everyone by saying they would start crawling from ranges in other countries? afaik nothing ever came of it, unless they limited these activities to sites hosted in selected non-English-speaking countries.
7:26 pm on Dec 28, 2018 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:4122
votes: 261


A thread on its own would be in the Search Engine Spider and User Agent Identification [webmasterworld.com] forum. ;)

But yes, I wondered whether the headers were being checked. Prompted by a new batch of fake Googlebots from Choopa/Vultr servers. They were blocked for the 45.63.0.0/17 IPs - also used HTTP/1.0 for requests.
9:39 pm on Dec 28, 2018 (gmt 0)

New User

joined:Dec 24, 2018
posts:12
votes: 10


How are whitelisted robots being verified?


Good question. Cloudflare routinely blocks spoofed user agents of search engines, so there is no need to do the reverse DNS check yourself. I don't know whether there is a full list anywhere of the user agents Cloudflare checks.
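
For anyone who still wants to verify a claimed Googlebot or Bingbot on the origin server, the usual reverse-then-forward DNS check can be sketched in PHP roughly like this; the function name and the list of accepted domains are my own illustration, not part of the plugins above.


<?php
// Illustrative sketch of the reverse-then-forward DNS check for a
// claimed search engine bot. The function name and accepted domain
// list are examples only, not part of the plugins above.
function is_real_search_bot($ip) {
    $accepted = array('googlebot.com', 'google.com', 'search.msn.com', 'applebot.apple.com');

    $host = gethostbyaddr($ip);              // reverse lookup
    if ($host === false || $host === $ip) {
        return false;                        // no usable PTR record
    }

    foreach ($accepted as $domain) {
        if (substr($host, -strlen($domain)) === $domain) {
            // forward lookup must resolve back to the same IP
            return gethostbyname($host) === $ip;
        }
    }
    return false;
}

// Example usage with the real client IP passed through by Cloudflare
$ip = isset($_SERVER['HTTP_CF_CONNECTING_IP']) ? $_SERVER['HTTP_CF_CONNECTING_IP'] : $_SERVER['REMOTE_ADDR'];
var_dump(is_real_search_bot($ip));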
 
