Forum Moderators: open
Session Log:
A visitor from cosmo.cs.dartmouth.edu (129.170.213.140) arrived from informant.dartmouth.edu/ at 3:24:37 AM on Sunday, January 28, 2001. This visitor used The Informant.
2001-01-17 06:10:59 cosmo.cs.dartmouth.edu - GET /index.html 200 0 13242 172 300 HTTP/1.0 The+Informant - [informant.dartmouth.edu...]
2001-01-20 04:42:48 cosmo.cs.dartmouth.edu - GET /index.html 200 0 13242 172 371 HTTP/1.0 The+Informant - [informant.dartmouth.edu...]
It annoys me and I am considering banning the UA and IP of the Informant server.
Anyone else doing this?
<script language="JavaScript">
// IP Redirect
var ip = '<!--#echo var="REMOTE_ADDR"-->' // ** Do Not Modify This Line **
if (ip == '00.00.00.00') {
  alert("ACCESS DENIED!");
  if (confirm("You are not authorized to view this web site!")) {
    location.href = "http://www.anywhere.com/security.html";
  } else {
    alert("ACCESS DENIED!");
    location.href = "http://www.anywhere.com/security.html";
  }
}
</script>
Replace 00.00.00.00 with their IP address.
Replace anywhere.com with your domain, and create a short security.html page to redirect them to. I like to place this same code at the bottom of the page they are redirected to. That traps their browser in a redirect loop, and they have no choice but to Ctrl+Alt+Del to close it. Works with IE and NS. Haven't tried it with other browsers.
Haven't had a surfer savvy enough to bypass it yet, although it would not be difficult for someone to do.
:)
Not every webmaster has a server that allows SSI or PHP -- some don't even allow a CGI-BIN. Like I said, it is a nice way of banning a surfer from one area of a website while allowing them to visit other areas. I know a couple of webmasters who run freebie boards that use this script. It's better security than none at all.
The script could be rendered useless by disabling JavaScript, but most everyday internet surfers don't even know what JavaScript is, much less how to disable it.
[httpd.apache.org...]
Example:
SetEnvIf User-Agent ^Webzip Webzip
<Directory /www3/icehousedesigns/>
Order Allow,Deny
Allow from all
Deny from env=Webzip
</Directory>
I don't think the <Directory> directive works in .htaccess files... you should use the <Files> directive instead.
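For what it's worth, a rough .htaccess equivalent using <Files> might look like this (sketch only -- the env-variable name is a placeholder, and it assumes mod_setenvif is available and SetEnvIf is permitted in .htaccess on your host):

```apache
# .htaccess in the directory you want to protect
SetEnvIf User-Agent ^Webzip Webzip
<Files *>
Order Allow,Deny
Allow from all
Deny from env=Webzip
</Files>
```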
See [webmasterworld.com...]
1) no support for client-side script such as JS
2) ignore, or have ways to ignore, robots.txt
3) change user-agent identity
For spider/bot developers, getting the desired information is normally all that matters. Many don't bother with robots.txt and similar things.
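A quick illustration of point 1: a spider downloads the page source as plain text and never runs any script, so a JavaScript "block" is invisible to it. This is just a sketch using a hard-coded page string in place of a real fetch:

```javascript
// What a spider sees: raw page source. The JavaScript IP check is just
// text to it -- the script is never executed.
var page =
  '<script>if (ip == "00.00.00.00") location.href = "security.html";</script>' +
  '<p>Protected content</p>';

// The bot simply strips the markup it does not care about and keeps the rest:
var text = page.replace(/<script>[\s\S]*?<\/script>/, "");
console.log(text); // → <p>Protected content</p>
```

The same goes for point 3: since the server only ever sees whatever User-Agent string the client chooses to send, a user-agent ban stops honest robots and nobody else.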