So, back to the question. I have only heard of BrowserHawk as a tool for detecting bots. Do you think it is the best choice? What do the market leaders use, and is there a way I can test whether they are using it?
I would like to do the same as they do, since we are in the same business and I need at least to compete at the same level.
If not BrowserHawk, can you suggest how to do it? Any open-source alternatives? Any custom code?
Any help would be much appreciated.
Anyway, other sources I have read suggest SpiderSpy by fantomas.
I would have to pay $258 per year, but at least they seem to provide a complete, regularly updated list of IPs against which I can adapt the page (with good intent, of course).
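For context, here is roughly how I would expect to use such a list. This is just a minimal sketch: the one-entry-per-line file format (IPs or CIDR ranges) is my own assumption, since I don't know what fantomas actually delivers.

import ipaddress

def load_bot_networks(path):
    """Load crawler IPs/CIDR ranges from a vendor file, one entry per line.
    (The file format here is assumed; the real SpiderSpy format may differ.)"""
    with open(path) as f:
        return [ipaddress.ip_network(line.strip(), strict=False)
                for line in f
                if line.strip() and not line.startswith("#")]

def is_listed_bot(ip, networks):
    """True if the requesting IP falls inside any listed crawler network."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks)

The page-serving code would then call is_listed_bot() on each request IP and pick the version to serve accordingly.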
Would you trust this list?
Do you think Google or the other big search engines could automatically get around this list of IPs if they wanted to test for cloaking?
I am not too worried about manual checks: if the techniques are used with good intent and the reviewer has common sense, the site shouldn't be banned (though admittedly it's guesswork to know how tolerant they are). I'm only talking about automated methods Google might implement to detect cloaking.
Using the user-agent or reverse-DNS lookup method to force a particular language or currency setting for a search engine robot is not maliciously deceptive.
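For reference, here is a minimal sketch of that reverse-DNS verification technique. The googlebot.com / google.com hostname suffixes follow Google's published guidance for verifying Googlebot; the function name and structure are illustrative, not any tool's actual API.

import socket

def is_verified_googlebot(ip):
    """Verify a claimed Googlebot IP via reverse DNS plus forward confirmation."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # PTR (reverse) lookup
    except socket.herror:
        return False
    # Genuine Google crawlers reverse-resolve under these domains.
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        _, _, addrs = socket.gethostbyname_ex(host)  # forward lookup
    except socket.gaierror:
        return False
    # Forward confirmation: the hostname must resolve back to the same IP,
    # so a spoofed PTR record alone is not enough to pass.
    return ip in addrs

A request that passes this check could be served the locale-forced version, while ordinary visitors get the normal language/currency negotiation.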