Forum Moderators: open
Here is what I have; let me know if you think it’s ineffective or know of a better way.
Also, if you know how to convert my ElseIf statements to a Select Case, please tell me — I tried and failed.
I’m using ASP/VBScript.
If InStr(1, Request.ServerVariables("HTTP_USER_AGENT"), "Opera") > 0 Then
    Response.Write "IS HUMAN"
ElseIf InStr(1, Request.ServerVariables("HTTP_USER_AGENT"), "Mozilla") > 0 Then
    Response.Write "IS HUMAN"
ElseIf InStr(1, Request.ServerVariables("HTTP_USER_AGENT"), "MSIE") > 0 Then
    Response.Write "IS HUMAN"
Else
    Response.Write "IS NOT HUMAN"
End If
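On the Select Case question: VBScript's Select Case compares every Case against a single expression, so you can't put a different InStr test in each Case directly. A common workaround is to select on True and make each Case a boolean test — a sketch of the same logic as above:

```vbscript
Dim sUA
sUA = Request.ServerVariables("HTTP_USER_AGENT")

' Select on True: the first Case whose expression evaluates to True wins
Select Case True
    Case InStr(1, sUA, "Opera") > 0
        Response.Write "IS HUMAN"
    Case InStr(1, sUA, "Mozilla") > 0
        Response.Write "IS HUMAN"
    Case InStr(1, sUA, "MSIE") > 0
        Response.Write "IS HUMAN"
    Case Else
        Response.Write "IS NOT HUMAN"
End Select
```

Note that MSIE user agents also contain "Mozilla", so the "Mozilla" Case will usually match first; the separate "MSIE" test is mostly redundant here.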
Cheers
mossimo
A more effective approach might be to look for the platform (i.e. "Linux", "Mac", "Win", etc.) to indicate a 'Human'.
I think for Windows there is a public "BROWSCAP.INI" file that people are using to filter traffic on their sites - try searching for reference to it here or on Google.
Others might want to double-check this but I think you could get 95% or even 99% accuracy using such a system.
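For what it's worth, classic ASP can read BROWSCAP.INI through the Browser Capabilities component (MSWC.BrowserType). A minimal sketch — this assumes your browscap.ini defines a "crawler" property for spider entries, which not every copy of the file does:

```vbscript
Dim oBC
Set oBC = Server.CreateObject("MSWC.BrowserType")

' "crawler" is only meaningful if browscap.ini defines it; undefined
' properties come back as the string "unknown"
If LCase(oBC.crawler) = "true" Then
    Response.Write "IS NOT HUMAN"
Else
    Response.Write "IS HUMAN"
End If

Set oBC = Nothing
```

The component matches the HTTP_USER_AGENT header against the INI entries for you, which is simpler than hand-rolled InStr tests, but accuracy depends entirely on how current the browscap.ini file is.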
I also got "BROWSCAP.INI" working; it looks like an easy solution for detection, but it is a VERY large file, so I have concerns about its speed.
I’ll keep working on it and hopefully end up with a quality function.
Just a little background info:
The site being built is part of an industry that is very dishonest and there’s a lot of copycat behavior.
So this function will load a “Real” set of MetaTags for search engine bots, while non-bot visitors will see a generic set of tags.
The only critical design goal is that 100% of all major indexing spiders must be detected and fed the “Real” MetaTags.
I would consider the major spiders to be:
Lycos
Altavista
Webcrawler
Northern Light
Excite
All the Web
Direct Hit
Google
Hotbot
Go
Slurp
My list of major spiders was off the top of my head and definitely needs to be refined.
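Here is a sketch of the kind of function this could become: match the user agent against a hand-maintained list of spider signatures and branch the MetaTags on the result. The signature strings below are illustrative only and need the same refinement as the list above:

```vbscript
Function IsSpider(sUA)
    Dim aBots, i
    ' Illustrative signatures only - verify against real spider user agents
    aBots = Array("Googlebot", "Slurp", "Lycos", "Scooter", "ArchitextSpider")
    IsSpider = False
    For i = 0 To UBound(aBots)
        ' vbTextCompare makes the match case-insensitive
        If InStr(1, sUA, aBots(i), vbTextCompare) > 0 Then
            IsSpider = True
            Exit For
        End If
    Next
End Function

If IsSpider(Request.ServerVariables("HTTP_USER_AGENT")) Then
    ' write the "Real" set of MetaTags here
Else
    ' write the generic set of MetaTags here
End If
```

An array like this is far faster than parsing BROWSCAP.INI on every request, but it only meets the "100% of major spiders" goal if the signature list is kept up to date.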
I’ve never cared much about search page rankings or indexing bots and I’m finding
it to be a very complicated and controversial subject to learn.
As always, input is most welcome.
mossimo
A human needs to read the page and digest its contents before moving to a new page.
When robots visit my website they stay less than 5 seconds per page.
Humans average 5 to 90 seconds per page.
Robots are, well, "robotic"; they can "visit" the same page several times in one second.
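That timing heuristic could be sketched in VBScript using the Session object — with the big caveat that most robots don't accept session cookies, so each of their hits starts a fresh session and this check never fires for them; it mainly confirms human-speed browsing:

```vbscript
<%
' Sketch: flag inter-page times that are robotically fast.
' Thresholds from the post: humans average roughly 5 to 90 seconds per page.
' Timer returns seconds since midnight, so this wraps at midnight.
Dim tNow
tNow = Timer
If Session("LastHit") <> "" Then
    If tNow - CDbl(Session("LastHit")) < 5 Then
        ' Less than 5 seconds since the previous page - possibly not human
    End If
End If
Session("LastHit") = tNow
%>
```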
Also a human will not click on invisible links.
I typically pepper my page with invisible links.
Many of them are counters and other tools.
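An invisible link can double as a trap: a human never sees it, so anything that requests the target page followed a link only a robot would find. A sketch, where "trap.asp" is a hypothetical page name:

```vbscript
<!-- In your page: no link text, so nothing visible for a human to click -->
<a href="trap.asp"></a>

<%
' trap.asp - anything requesting this page followed the invisible link.
' Record the user agent as a probable robot (Application storage is
' just a sketch; a database or log file would be more useful).
Application.Lock
Application("LastBotUA") = Request.ServerVariables("HTTP_USER_AGENT")
Application.Unlock
%>
```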