I'm wondering whether it would be wise to block HTTP/1.0 requests from non-SE user agents. I know how to do this with mod_rewrite in .htaccess, though I'm concerned I may block legit users and not just some script kiddie's spoofing, badly behaved robot.
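Something along these lines is what I had in mind, just as a sketch, assuming Apache with mod_rewrite enabled in .htaccess; the bot names in the condition are only placeholders for whichever search engines you want to exempt:

RewriteEngine On
# Match only requests made over HTTP/1.0
RewriteCond %{THE_REQUEST} HTTP/1\.0$
# ...and only when the user agent is not one of the allowed crawlers
RewriteCond %{HTTP_USER_AGENT} !(Googlebot|Slurp|msnbot) [NC]
# Everything else gets a 403 Forbidden
RewriteRule .* - [F]

My worry is exactly about that second condition: any legit user behind an old proxy that still speaks HTTP/1.0 would get caught by it too.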
Speaking of DOS attacks, someone using MSIECrawler - which I have blocked - has been trying that on my site for almost 7 hours now. I had to remove the custom 403 page (only 300 bytes) from .htaccess because the idiot kept GETting it over and over and has used almost half a gig of bandwidth. Some people just don't know the meaning of "Access Denied"...
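For reference, the block itself is just a user-agent match; what I pulled out was the ErrorDocument line, so Apache now falls back to its short built-in 403 response instead of serving my page every time. Roughly like this (a sketch only - the /403.html file name is just what my own setup used):

RewriteEngine On
# Deny anything identifying itself as MSIECrawler
RewriteCond %{HTTP_USER_AGENT} MSIECrawler [NC]
RewriteRule .* - [F]
# Commented out so the bot no longer pulls the custom page on every 403:
# ErrorDocument 403 /403.html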
Have you ever thought about the possibility that this is an automatic update of favourites that have been made available offline? I suppose IE will then try and try again, since the site is still available if the user visits it "manually".
Have you ever thought about the possibility that this is an automatic update of favourites that have been made available offline?
Yes. I know what it does but have always had it blocked. So it couldn't be an "update" of anything.
I suppose IE will then try and try again, since the site is still available if the user visits it "manually".
If by "manually" you mean the pages the person has downloaded to their desktop using MSIECrawler, it still doesn't make sense, since they could never have downloaded anything with it in the first place except the 403 page.
So someone who doesn't grasp that the application they are using is not allowed on a particular site, yet keeps trying to use it there for hours anyway, is in fact creating a DOS-type situation, since it slows the server down for everyone else. I'm just glad my web host has mod_throttle installed, or it could have been worse.