| 12:48 am on Mar 20, 2008 (gmt 0)|
You're going to be very unhappy if you block that user-agent string, since it represents Internet Explorer on Windows XP with the .NET 1.1 and 2.0 runtimes installed -- a very common profile.
You'll need to look for some additional criteria to block these guestbook postings, or you risk losing a fairly large number of legitimate visitors.
Some additional criteria to investigate (see the sketch below):
- Remote IP address, IP address range, or ranges
- GeoIP lookup (country or countries)
- HTTP_REFERER (if non-blank)
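For example, here's a minimal .htaccess sketch combining two of those criteria, assuming Apache with mod_rewrite enabled. The IP range and the referer pattern are placeholders -- substitute whatever actually shows up in your raw logs:

RewriteEngine On
# Deny a specific remote IP range (192.0.2.x is a documentation range, used here as a placeholder)
RewriteCond %{REMOTE_ADDR} ^192\.0\.2\. [OR]
# ...or deny requests carrying a known-bad, non-blank referer (placeholder pattern)
RewriteCond %{HTTP_REFERER} bad-example\.com [NC]
RewriteRule .* - [F]

The [F] flag returns 403-Forbidden. GeoIP blocking by country requires an extra module (mod_geoip, for example), so it's left out of this sketch.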
As for the code, try searching this forum for recent posts on the same or similar subjects. We'll be happy to help you get your own code working, but unfortunately, we don't have enough contributors to support a "free coding service" in this forum.
| 1:03 am on Mar 20, 2008 (gmt 0)|
I don't get many visitors anyway, and I just want these lowlifes to know that they aren't welcome on my site.
| 2:10 am on Mar 21, 2008 (gmt 0)|
I have ended up password-protecting my home folder and making a custom 401 message.
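For anyone wanting to do the same, here is a minimal sketch of the .htaccess side of it, assuming Apache basic authentication; the realm name and file path are placeholders:

AuthType Basic
AuthName "Members Only"
# Placeholder path -- point this at your real password file, kept outside the web root
AuthUserFile /home/example/.htpasswd
Require valid-user
# Serve a custom message to anyone who fails or cancels the login prompt
ErrorDocument 401 /401.html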
| 6:15 am on Mar 21, 2008 (gmt 0)|
Does anyone know why some of these brainless morons who own these search robots keep on trying to access a site even though they are permanently banned? I would have thought that they would eventually notice they are banned and take that URL out of the bot program. I have a moron from cuil.com who keeps on trying. I even sent the guy an email telling him that he was banned. Some people are too brainless, I guess.
| 4:43 pm on Mar 21, 2008 (gmt 0)|
You're thinking too much in terms of "people" -- you are dealing with automated programs, and should code accordingly.
Make sure your robots.txt is correctly coded to Disallow Cuil's Twiceler robot, and make sure that robots.txt is accessible to all robots, good or bad, without a password or any other restrictions. If, despite the robots.txt exclusion, you get a request from a Disallowed robot, make sure you return a 403-Forbidden response.
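For reference, the robots.txt entry is only two lines (Twiceler is the user-agent name Cuil documents for its robot):

User-agent: Twiceler
Disallow: /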
Further, if you use a custom 403 error page, make sure that page is accessible to all requestors, good or bad, to avoid putting your server into an 'infinite' loop.
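Putting those two points together, here's a minimal .htaccess sketch, assuming Apache with mod_rewrite and a custom error page at /403.html (a placeholder name):

RewriteEngine On
ErrorDocument 403 /403.html
# Forbid the banned robot by User-Agent (Twiceler, per the robots.txt example above)
RewriteCond %{HTTP_USER_AGENT} Twiceler [NC]
# Exempt robots.txt and the error page itself, or the 403 response
# will trigger another 403 and loop
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{REQUEST_URI} !^/403\.html$
RewriteRule .* - [F]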
Also, a comment on the previous part of this thread: when you block a "bad guy", do not give him any information whatsoever other than the 403-Forbidden response. If you tell him why or how he's blocked, he may fix his script to get around your block. If you taunt him, he may subject your site to a DoS attack. Information is power; do not empower your enemies.
"Revenge is a dish best served up cold."
| 12:27 am on Mar 22, 2008 (gmt 0)|
But people run these automated programs, and they should be keeping an eye on what they do. As I said before, though, I don't get many visitors to my web page, and I prefer to give the login details to the ones that do want to have a look.
| 5:29 am on Mar 22, 2008 (gmt 0)|
> but people run these automated programs and they should be keeping an eye on what they do.
Doubtful: they play the numbers game, and far more sites are totally unprotected than protected. They simply don't care if a thousand pages out of 12 trillion forbid their access. (12 trillion pages are indexed by Google as containing the word "the" as of today.)
| 8:11 pm on Mar 22, 2008 (gmt 0)|
"far more sites are totally unprotected than protected" you are right there.There is some info about local dog owners on one of the local government sites that google has found.Now if i rang that government agency and asked for that info i would get a stern no.
"They simply don't care if a thousand pages out of 12 trillion forbids their access." Well that is obvious.!
| 11:23 pm on Mar 22, 2008 (gmt 0)|
These search robots could almost be called a virus
| 3:06 am on Mar 24, 2008 (gmt 0)|
> These search robots could almost be called a virus
they are generally better known as botnets ;)
| 7:42 am on Mar 24, 2008 (gmt 0)|
Some of them behave like a virus!
| 9:55 am on Mar 24, 2008 (gmt 0)|
The owner of Twiceler is the latest brainless MORON who is TRYING to access my site. No one else is.