Forum Moderators: DixonJones


HTTP/1.0 requests

not SE bots, Windows UAs


coyote

7:48 am on Mar 5, 2004 (gmt 0)

10+ Year Member



For the past few days I've noticed a very large number of HTTP/1.0 requests coming from various IPs, all using Windows/IE user agent strings.
I've never noticed a normal browser use HTTP/1.0 before, only various bots/crawlers. The requests are strange: they first GET random pages (some root, some deep, generally not directly linked to one another), then come back later for the images. They also never send any referrer info, and they never load the external JS, CSS, or Flash files.

I'm wondering if it would be wise to block HTTP/1.0 requests from non-SE user agents? I know how to do this with mod_rewrite in .htaccess, though I'm concerned I may block legitimate users and not just some script kiddie's UA-spoofing, badly behaved robot.
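
Roughly what I had in mind (just a sketch - the search-engine user agents listed are only examples, not a complete list):

    RewriteEngine On
    # request line ends in HTTP/1.0
    RewriteCond %{THE_REQUEST} HTTP/1\.0$
    # and the user agent is not a known search-engine crawler (partial, illustrative list)
    RewriteCond %{HTTP_USER_AGENT} !(Googlebot|msnbot|Slurp) [NC]
    # refuse with a 403
    RewriteRule .* - [F]

(%{SERVER_PROTOCOL} would work for the protocol check as well.) The worry is exactly what I said above: legitimate clients that still speak HTTP/1.0 would get caught too.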

Speaking of DoS attacks, someone using MSIECrawler - which I have blocked - has been hammering my site with it for almost 7 hours now. I had to remove the custom 403 page (only 300 bytes) from .htaccess because the idiot kept GETting it over and over and has used almost half a gig of bandwidth. Some people just don't know the meaning of "Access Denied"...
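
For reference, the MSIECrawler block itself is nothing fancy - roughly this in .htaccess (the ErrorDocument line is the part I ended up commenting out; /403.html just stands in for whatever the real custom page is called):

    RewriteEngine On
    # refuse anything identifying itself as MSIECrawler
    RewriteCond %{HTTP_USER_AGENT} MSIECrawler [NC]
    RewriteRule .* - [F]
    # ErrorDocument 403 /403.html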

WebJoe

9:44 am on Mar 7, 2004 (gmt 0)

10+ Year Member



Speaking of DoS attacks, someone using MSIECrawler - which I have blocked - has been hammering my site with it for almost 7 hours now. I had to remove the custom 403 page (only 300 bytes) from .htaccess because the idiot kept GETting it over and over and has used almost half a gig of bandwidth. Some people just don't know the meaning of "Access Denied"...

Have you ever thought about whether this might be an automatic update of favourites that are available offline? I suppose IE will then try and try again, since the site is available when the user visits it "manually".

coyote

3:01 am on Mar 8, 2004 (gmt 0)

10+ Year Member



Have you ever thought about whether this might be an automatic update of favourites that are available offline?

Yes. I know what it does, but I have always had it blocked, so it couldn't be an "update" of anything.

I suppose IE will then try and try again, since the site is available when the user visits it "manually".

If by "manually" you mean the pages the person has downloaded to the desktop using MSIECrawler, it still doesn't make sense as they could've never downloaded anything with it to begin with except 403 page.

So someone who doesn't realize that the application they are using is not allowed on a particular site, yet keeps trying to use it there for hours, is in effect creating a DoS-type situation, since it slows the server down for other users. I'm just glad my web host has mod_throttle installed, or it could have been worse.

py9jmas

12:35 pm on Mar 8, 2004 (gmt 0)

10+ Year Member



I'm wondering if it would be wise to block HTTP/1.0 requests from non-SE user agents?

The Squid proxy server is HTTP/1.0 only, so all requests coming through a Squid proxy will show up as HTTP/1.0.

This doesn't sound like what you're seeing, but it is something to consider if you block these requests.
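
If you do go ahead with a block, one way to soften it is to let through HTTP/1.0 requests that look like they came via a proxy - Squid adds Via and X-Forwarded-For headers by default (unless the admin has switched them off). A rough mod_rewrite sketch, with the same caveat as above that the user-agent list is only illustrative:

    RewriteEngine On
    RewriteCond %{THE_REQUEST} HTTP/1\.0$
    # only block if no proxy headers are present
    RewriteCond %{HTTP:Via} ^$
    RewriteCond %{HTTP:X-Forwarded-For} ^$
    # and the user agent is not a known search-engine crawler
    RewriteCond %{HTTP_USER_AGENT} !(Googlebot|msnbot|Slurp) [NC]
    RewriteRule .* - [F]

Of course a misbehaving robot could send those headers too, so this only reduces the false positives; it doesn't catch anything extra.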

Jon.