Here is what I have; let me know if you think it's ineffective or know of a better way.
Also, if you know how to convert my ElseIf statements to a Select Case, please tell me; I tried and failed.
I'm using ASP/VBScript.
ua = Request.ServerVariables("HTTP_USER_AGENT")
If InStr(1, ua, "Opera") > 0 Then
    Response.Write "IS HUMAN"
ElseIf InStr(1, ua, "Mozilla") > 0 Then
    Response.Write "IS HUMAN"
ElseIf InStr(1, ua, "MSIE") > 0 Then
    Response.Write "IS HUMAN"
Else
    Response.Write "IS NOT HUMAN"
End If
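One way to turn an ElseIf chain like that into a Select Case in VBScript is the Select Case True idiom: each Case is a Boolean expression, and the first Case that evaluates to True runs. A sketch of the same check:

```vbscript
<%
Dim ua
ua = Request.ServerVariables("HTTP_USER_AGENT")

' Select Case True: each Case is a Boolean test; the first True one wins.
Select Case True
    Case InStr(1, ua, "Opera") > 0
        Response.Write "IS HUMAN"
    Case InStr(1, ua, "Mozilla") > 0
        Response.Write "IS HUMAN"
    Case InStr(1, ua, "MSIE") > 0
        Response.Write "IS HUMAN"
    Case Else
        Response.Write "IS NOT HUMAN"
End Select
%>
```

Be aware that many spiders send "Mozilla" in their user-agent string (e.g. "Mozilla/5.0 (compatible; Googlebot/2.1; ...)"), so a bare "Mozilla" test will classify many robots as human.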
A more effective approach might be to look for the platform (i.e., "Linux", "Mac", "Win", etc.) to indicate a 'Human'.
I think for Windows there is a public "BROWSCAP.INI" file that people are using to filter traffic on their sites - try searching for reference to it here or on Google.
Others might want to double-check this but I think you could get 95% or even 99% accuracy using such a system.
I also got "BROWSCAP.INI" working; it looks like an easy solution for detection, but it is a VERY large file, so I have concerns about its speed.
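On the speed worry: classic ASP ships with the Browser Capabilities component (MSWC.BrowserType), which does the BROWSCAP.INI lookup for you rather than your code parsing the file per request. Whether a Crawler property is available depends on the version of BROWSCAP.INI installed, so treat this as a sketch:

```vbscript
<%
' Sketch: let the Browser Capabilities component consult BROWSCAP.INI.
' The "Crawler" property only exists in newer BROWSCAP.INI files;
' properties not defined for a user-agent come back as "Unknown".
Dim bc
Set bc = Server.CreateObject("MSWC.BrowserType")

If LCase(CStr(bc.Crawler)) = "true" Then
    Response.Write "IS NOT HUMAN"
Else
    Response.Write "IS HUMAN"
End If
%>
```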
I'll keep working on it and hopefully end up with a quality function.
Just a little background info:
The site being built is part of an industry that is very dishonest, and there's a lot of copycat behavior.
So this function will load a "Real" set of MetaTags for search engine bots, while other non-bot visitors will see a generic set of tags.
The only critical design goal is that 100% of all major indexing spiders must be detected and fed the "Real" MetaTags.
I would consider the major spiders to be:
All the Web
My list of major spiders was off the top of my head and definitely needs to be refined.
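For the "detect 100% of major spiders" goal, one refinement is to key on spider user-agent tokens rather than browser ones. The token list below is only illustrative; check it against your own server logs before relying on it:

```vbscript
<%
' Sketch: detect major indexing spiders by user-agent token.
' This token list is illustrative, not authoritative.
Dim ua, tokens, t, isSpider
ua = Request.ServerVariables("HTTP_USER_AGENT")
tokens = Array("Googlebot", "Slurp", "msnbot", "FAST", "Teoma")
isSpider = False

For Each t In tokens
    If InStr(1, ua, t, vbTextCompare) > 0 Then
        isSpider = True
        Exit For
    End If
Next

If isSpider Then
    ' write the "Real" MetaTags here
Else
    ' write the generic MetaTags here
End If
%>
```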
I've never cared much about search page rankings or indexing bots, and I'm finding it to be a very complicated and controversial subject to learn.
As always, input is most welcome.
A human needs to read the page and digest its contents before moving to a new page.
When robots visit my website they stay less than 5 seconds per page.
Humans average 5 to 90 seconds per page.
Robots are, well, "robotic": they can "visit" the same page several times in one second.
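That timing heuristic can be sketched with Session state. The 5-second threshold is just the guess from this thread, and note that robots that refuse session cookies get a fresh Session on every hit, which is itself a bot signal:

```vbscript
<%
' Sketch: flag a visitor who requests a new page implausibly fast.
' The 5-second threshold is a guess, not a tested value.
Dim lastHit
lastHit = Session("LastHit")

If IsDate(lastHit) Then
    If DateDiff("s", CDate(lastHit), Now()) < 5 Then
        Session("SuspectRobot") = True   ' likely not a human reader
    End If
End If

Session("LastHit") = Now()
%>
```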
Also a human will not click on invisible links.
I typically pepper my page with invisible links.
Many of them are counters and other tools.
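An invisible-link trap along those lines has two pieces: a link humans never see, and a page that logs whoever follows it. The page name and log file here are made up for illustration:

```vbscript
<%
' trap.asp -- sketch of the target of an invisible link such as
'   <a href="trap.asp" style="display:none"></a>
' Only a robot should ever request this page.
Const ForAppending = 8
Dim fso, logFile
Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set logFile = fso.OpenTextFile(Server.MapPath("bot-trap.log"), ForAppending, True)
logFile.WriteLine Now() & vbTab & _
    Request.ServerVariables("REMOTE_ADDR") & vbTab & _
    Request.ServerVariables("HTTP_USER_AGENT")
logFile.Close
%>
```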