Beware that user-agent spoofing is simple, and "others" could find the pages too.
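To see just how simple that spoofing is, here is a minimal Python sketch; the URL is a placeholder, and any HTTP client could do the same:

import urllib.request

# Any visitor can claim to be a spider simply by setting the User-Agent header.
req = urllib.request.Request(
    "http://example.com/cloaked-page.html",  # placeholder URL
    headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.read()[:200])  # whatever version the server hands to "Googlebot"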
Sounds like too iffy a proposition without good lists.
I get a kick out of all the non-cloaking "doomsayers" who think it's certain death to cloak a page. Nonsense. The people who fear cloaking the most (and are usually the most vocal against it) are the ones who don't understand how it works. Granted, without the script I use, I wouldn't cloak. But understanding how it works and what it does...I can cloak with assurance that if in fact I do get busted, somebody went to great lengths to do so.
Without such a service, one would have to create the list manually. There are a few resources available to seed an initial list; however, as toolman points out, only regular review of the log files and use of reverse lookups will keep the data current.
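For the reverse-lookup step, a rough Python sketch follows; it performs live DNS queries, and the sample IP and hostname suffixes are illustrative assumptions, not an authoritative list:

import socket

def looks_like_real_spider(ip, allowed_suffixes=(".googlebot.com", ".google.com")):
    """Reverse-resolve the IP, check the hostname, then forward-confirm it."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)      # reverse (PTR) lookup
    except socket.herror:
        return False                               # no PTR record: treat as unknown
    if not host.endswith(allowed_suffixes):
        return False                               # hostname not in a trusted domain
    try:
        # Forward-confirm: the hostname must resolve back to the same IP,
        # otherwise the PTR record itself could be spoofed.
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False

print(looks_like_real_spider("66.249.66.1"))  # sample address for illustration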
An intelligent script might be able to add new agents and addresses on the fly. If carefully written, it could cross-reference existing data, comparing items such as DNS names, user agents, and IP addresses. For example, if a spider arrived with a new UA but its IP address was already recorded, the UA would be inserted as a new spider UA. If a new IP address was detected but the UA matched, then a simple class C comparison against existing addresses could be used to determine its authenticity. A sketch of that logic follows below.
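Here is a rough Python sketch of that cross-referencing idea. The table layout, field names, and seed addresses are assumptions for illustration; a real script would persist this data and log its decisions:

known_spiders = {
    # user-agent -> set of known IP addresses
    "Googlebot/2.1": {"172.16.31.10", "192.168.3.11"},
}

def class_c(ip):
    """Return the /24 ('class C') prefix of a dotted-quad address."""
    return ".".join(ip.split(".")[:3])

def classify_hit(ua, ip):
    all_ips = set().union(*known_spiders.values())
    if ua in known_spiders:
        if ip in known_spiders[ua]:
            return "known spider"
        # New IP but known UA: accept it if it shares a class C
        # with an address we already trust for that UA.
        if any(class_c(ip) == class_c(known) for known in known_spiders[ua]):
            known_spiders[ua].add(ip)
            return "new IP learned for known UA"
        return "suspicious: known UA from an unfamiliar network"
    if ip in all_ips:
        # Known IP but new UA: record it as a new spider UA.
        known_spiders.setdefault(ua, set()).add(ip)
        return "new UA learned for known IP"
    return "unknown visitor"

print(classify_hit("Googlebot/2.1", "172.16.31.100"))  # shares a class C, so it is learned

The class C match is a crude heuristic, of course; combining it with the reverse-lookup check above before learning a new address would make the list harder to poison.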