Forum Moderators: goodroi
Can anyone shed any light on what this visitor's intentions were in requesting an /admin.php file at the site root?
Thank you.
===BEGIN===
- - [08/Mar/2008:15:38:23 +0000] "GET //admin.php?include_path=p4n93r4nk0d0k/yhe.txt? HTTP/1.1" 302 419 "-" "libwww-perl/5.808"
- - [08/Mar/2008:15:48:46 +0000] "GET /html/some-dir/index.php?entry=60//admin.php?include_path=/id.txt? HTTP/1.1" 301 505 "-" "libwww-perl/5.808"
- - [08/Mar/2008:15:48:47 +0000] "GET /index.php?entry=60//admin.php?include_path=/id.txt? HTTP/1.1" 200 12693 "-" "libwww-perl/5.808"
===END===
[edited by: goodroi at 2:17 am (utc) on May 10, 2008]
[edit reason] Please no specific URLs [/edit]
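For context, requests like those logged above are typical of automated remote-file-inclusion (RFI) probes: the client is hoping the script will blindly include the attacker-supplied .txt file named in the query string, and the libwww-perl User-Agent is a common signature of such scanners. A minimal sketch of how one might flag these lines when scanning a combined-format access log (the function name and regex patterns here are my own assumptions, not a standard tool):

```python
import re

# Signatures of common RFI probes: a query string trying to set an
# include path, or a User-Agent string used by many scanning scripts.
RFI_QUERY = re.compile(r'include_path=', re.I)
BAD_AGENTS = re.compile(r'libwww-perl|Indy Library', re.I)

def flag_rfi(line: str) -> bool:
    """Return True if a combined-log line looks like an RFI probe."""
    m = re.search(
        r'"(?:GET|POST) ([^ ]+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "([^"]*)"',
        line,
    )
    if not m:
        return False
    url, agent = m.groups()
    return bool(RFI_QUERY.search(url) or BAD_AGENTS.search(agent))

sample = ('- - [08/Mar/2008:15:38:23 +0000] '
          '"GET //admin.php?include_path=p4n93r4nk0d0k/yhe.txt? HTTP/1.1" '
          '302 419 "-" "libwww-perl/5.808"')
print(flag_rfi(sample))  # True for the logged probe above
```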
SetEnvIfNoCase User-Agent libwww-perl getout
SetEnvIfNoCase User-Agent "Indy Library" getout
SetEnvIf Request_URI (robots\.txt|custom-403-error-document\.html)$ permit
#
# (Note: only one unconditional Order directive per .htaccess file)
Order Deny,Allow
#
Allow from env=permit
Deny from env=getout
#
# mod_rewrite directives have no effect unless the engine is enabled
RewriteEngine On
# Rule to allow serving robots.txt, the custom 403 error page, and the bad-bot script to bad bots
RewriteRule (robots\.txt|custom-403-error-document\.html|bad-bot\.pl)$ - [L]
#
RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC]
RewriteRule .* - [F]
robots.txt must be accessible even if you don't use a bad-bots script. Some robots (even good but undesired ones) will interpret an inaccessible robots.txt file as carte blanche to spider the entire site, and you want to prevent a flood of denied requests from them by asking them to go away nicely using robots.txt.
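For example, a minimal robots.txt that stays reachable under the rules above and politely asks an unwanted crawler to leave (the bot name here is a hypothetical placeholder, not a real crawler):

```
# Ask a specific undesired (but well-behaved) crawler to stay out
User-agent: ExampleUndesiredBot
Disallow: /

# Allow everyone else
User-agent: *
Disallow:
```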
Note: posting on this forum can mangle solid pipe characters into broken pipes ("¦"). If any broken pipes appear in the code above, replace them with solid pipes ("|") before use.
Jim