
Gecko user agent

Language property missing

GaryK

6:35 pm on Aug 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Mozilla/5.0 (Windows; U; Win98; rv:1.7.2) Gecko/20040803

Lately I've been seeing a lot of Gecko-based user agents like this where the language property is missing.

Based on my logs it appears to be a regular browser.

Question: Is this a valid UA and should I include this variation in my browscap.ini file?
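
If it is valid, I assume the entry I'd add would look something like this - a guess on my part, so the mask and the parent group may be off:

[Mozilla/5.0 (Windows; ?; Win98; rv:1.7*) Gecko/*]
Parent=Mozilla 1.7
Platform=Win98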

wilderness

2:40 am on Aug 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Gecko is the layout engine used by Netscape and Mozilla.
http://developer.netscape.com/software/communicator/ngl/

jdMorgan

3:19 am on Aug 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That's the latest Mozilla 1.7.2 browser. See [mozilla.org...]

The language preference is sent in an HTTP request header, which isn't visible in your logs unless you log it. Here's an example of the Accept-Language header, as it shows up in the HTTP_ACCEPT_LANGUAGE variable:

HTTP_ACCEPT_LANGUAGE = en-us,en;q=0.5

This shows that I prefer U.S. English, followed by any other English.
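
If you're on Apache and want to see it, you can log that header yourself. A minimal sketch for httpd.conf (the format nickname "agentlang" is arbitrary):

LogFormat "%h %l %u %t \"%r\" %>s %b \"%{User-Agent}i\" \"%{Accept-Language}i\"" agentlang
CustomLog logs/access_log agentlang

Requests from browsers that omit the header will just show a "-" in that field.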

Jim

GaryK

4:29 am on Aug 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thank you.

Maintaining an accurate browscap.ini file is getting more and more difficult, and I'm beginning to wonder if user agents still serve any useful purpose, especially since they're so easily spoofed or modified.

In this week's analysis, out of 100+ unique user agents, all but 12 were spoofed or modified by the user. My favorite was "Bill Gates in person" from 172.186.24.103, an AOL IP address.

To keep this somewhat on topic: I had a spoofed YahooSeeker user agent, and something called "Googelbot/Beta (+http://www.googlebot.com/bot.html)". Note that "Googlebot" is spelled incorrectly.

It makes me wonder if the 2-3 hours I spend each Sunday morning creating an updated browscap.ini file for free download is worth it.

Is this an appropriate forum to discuss this issue?

DanA

9:05 am on Aug 23, 2004 (gmt 0)

10+ Year Member



While I don't think it's worth it (user-agent switchers, MultiZilla, firewalls, anonymizers, ...), I do find your list an interesting resource, since the UAs there are sorted.

GaryK

5:10 pm on Aug 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks for your reply. I'm still left wondering whether user agents are rapidly becoming obsolete when you can't even trust the ones from what appear to be legitimate search engines - at least not until you check the IP address, and by then it's too late; the damage has been done. I've had phony search-engine user agents do what amounts to a DDoS attack on my servers. I try to block them by IP address, but often the address is dynamic and changes from visit to visit.

jdMorgan

5:42 pm on Aug 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Blocking by user-agent is an important part of a minimal three-part defense; it is much more efficient than blocking by IP address, though less effective. However, IP blocking can take care of some spoofed user-agents, though not those attacks that use open proxies. Finally, blocking by behaviour is the third method, but that can't be done until *some* accesses have been allowed in order to analyze behaviour.
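
To illustrate, the first two parts can be as simple as a few lines in an Apache .htaccess. This is only a sketch - the user-agent patterns and the address range are placeholders (the UA string is the misspelled bot mentioned above; 192.0.2. is a documentation range, not a real offender):

# Part one: deny by user-agent (empty or known-bad strings)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^$ [OR]
RewriteCond %{HTTP_USER_AGENT} Googelbot [NC]
RewriteRule .* - [F]

# Part two: deny by IP address
Order Allow,Deny
Allow from all
Deny from 192.0.2.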

While some of these methods -- indeed, all of these methods combined -- are not bulletproof, they work pretty well in combination because of the simple fact that there are more unsophisticated than sophisticated attacks. Therefore, even unsophisticated access controls block the majority of problems, and help to keep your bandwidth down.

I've stated it before, but it bears repeating occasionally: If you block even the most obvious attacks, then the total number of attempts seems to decrease over time. If your site is wide-open to abuse, then the number of attempts goes up over time. So, even less-than-perfect methods pay off.

While big corporate sites can afford to install 'smart' firewalls that update themselves with fee-based, dynamically-maintained databases of intruder IP addresses, that option is typically not available to the small independent Webmaster. So even the simple three-part method described here is useful.

In short, keep up the good work, GaryK. Maybe put a few AdSense ads up on that page, and get some reward for your efforts!

Jim

DanA

5:48 pm on Aug 23, 2004 (gmt 0)

10+ Year Member



Phony robots and phony users.
It's really difficult to trust any user agent. I see a whole lot of fakes (funny ones, log spam, stupid messages...) and Linux UAs (its browsers offer completely configurable UAs). And since most of the ill-intentioned have dynamic IP addresses or come through proxies such as Google's, now I only clean the URL, removing anything that shouldn't be there (even from legitimate but buggy search-engine robots), and occasionally limit the number of pages served to the same IP per second (not reliable either: even legitimate webbots change hosts and don't follow robots.txt rules).
Analysing host names might help...
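
For the per-IP page limit, if you can install a third-party Apache module such as mod_evasive, the configuration is roughly like this (the values are illustrative, not recommendations):

# Block an IP temporarily if it requests the same page too fast
DOSPageCount 10
DOSPageInterval 1
# ...or too many pages site-wide in one interval
DOSSiteCount 100
DOSSiteInterval 1
DOSBlockingPeriod 60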

wilderness

6:15 pm on Aug 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



funny ones, log spam, stupid messages...

and even forum links.

I've been dealing with those with a RewriteCond on %{HTTP_REFERER}, something like this (the pattern is only a placeholder - I match the actual offending domains):

RewriteCond %{HTTP_REFERER} unwanted-domain\.example [NC]
RewriteRule .* - [F]

I also use these for Usenet archive referrals.

Although it doesn't keep the entries out of my logs, in most instances the result puts a damper on the traffic.

GaryK

7:32 pm on Aug 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks for the words of encouragement, Jim.

I guess there is still a place for user agents, at least for now.

It's funny you should mention AdSense, because I applied on Saturday. I was going to start soliciting donations, but AdSense seems more professional and less greedy on my part. I don't want to make any money off of this project, but I do want to cover my out-of-pocket expenses like bandwidth.

Now you've got me wondering if I should consider distributing a free list of evil IP addresses. I have thousands of them on file but never thought they were worth compiling into a file for distribution. If it would help the small webmaster who cannot afford the professional service you mentioned, it would be one more thing I could do to help all of us.
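
I'm picturing a plain file of Apache deny rules that people could pull straight into their config with an Include. A rough sketch (these addresses are documentation examples, not real offenders):

# evil-ips.conf - hypothetical downloadable blocklist
Deny from 192.0.2.13
Deny from 198.51.100.0/24

A webmaster would then add Include conf/evil-ips.conf inside a section that already has Order Allow,Deny and Allow from all.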