I would just like to confirm whether such a method is considered cloaking: visitors alone are put through an HTML form with the POST method, while known spiders can index pages normally. Once the visitor clicks the first button he can browse the site normally. Spiders, on the other hand, skip that first step and browse the site without scrutiny.
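To be clear about the mechanics, here is a minimal sketch of what I mean, in Python/Flask (the spider names and routes are placeholders I made up, and a real setup would verify crawlers by reverse DNS rather than trusting the User-Agent header):

    from flask import Flask, request, session

    app = Flask(__name__)
    app.secret_key = "change-me"          # needed for the session cookie

    KNOWN_SPIDERS = ("googlebot", "bingbot", "slurp")   # placeholder list

    GATE = '<form method="post"><button type="submit">Enter site</button></form>'

    def is_known_spider():
        ua = request.headers.get("User-Agent", "").lower()
        return any(bot in ua for bot in KNOWN_SPIDERS)

    @app.route("/page/<name>", methods=["GET", "POST"])
    def page(name):
        if request.method == "POST":
            session["entered"] = True     # visitor clicked the first button
        if is_known_spider() or session.get("entered"):
            return f"<h1>Content of {name}</h1>"   # spiders and confirmed humans
        return GATE                        # everyone else sees the gate first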
I'm sure this will eliminate rogue bots and automated scripts. Although a bot can submit forms without a problem, it is unlikely to start parsing CSS styles, so the form can be crafted to ensure a human is present while staying simple at the same time. But of course, if it results in the site being removed from the spiders' indexes, it is of no use.
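By a "crafted" form I mean something like a honeypot: a field hidden through the stylesheet that a person never sees and leaves empty, while a script that blindly fills every input gives itself away. A rough sketch, with made-up field and class names:

    from flask import Flask, request, session, abort

    app = Flask(__name__)
    app.secret_key = "change-me"

    HONEYPOT_FORM = """
    <form method="post" action="/enter">
      <!-- .hp is display:none in the site stylesheet, so a person never sees it -->
      <input type="text" name="website" class="hp" autocomplete="off">
      <button type="submit">Enter site</button>
    </form>
    """

    @app.route("/enter", methods=["GET", "POST"])
    def enter():
        if request.method == "GET":
            return HONEYPOT_FORM
        # A person leaves the hidden field empty; a script that fills
        # every input gives itself away here.
        if request.form.get("website"):
            abort(403)
        session["entered"] = True
        return "Thanks, carry on browsing."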
Has anyone ever tried it, and if so, what's their experience? TIA.
If you do that, however, you will tick off people coming to your site from the SERPs and landing directly on deep pages.
Yes, sessions are already used to post forms, create accounts, etc.; without one you can only view pages.
But the thing is, spiders will go through without scrutiny while humans will have to click the button the first time. I can set it up to use an IP or a cookie as a signature for some time, so this won't be repeated on subsequent visits.
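Roughly what I have in mind for the signature (the cookie name, the 30-day lifetime, and the in-memory IP table are arbitrary choices for the sketch; the IP fallback would obviously misfire behind shared proxies):

    from time import time
    from flask import Flask, request, make_response

    app = Flask(__name__)

    SEEN_IPS = {}                      # ip -> timestamp of last confirmed human
    SIGNATURE_TTL = 30 * 24 * 3600     # remember the visitor for ~30 days

    def already_confirmed():
        if request.cookies.get("human") == "1":
            return True
        ts = SEEN_IPS.get(request.remote_addr)
        return ts is not None and time() - ts < SIGNATURE_TTL

    @app.route("/confirm", methods=["POST"])
    def confirm():
        # Called once, when the visitor clicks the first button; after this
        # the gate is skipped on subsequent visits.
        SEEN_IPS[request.remote_addr] = time()
        resp = make_response("You won't be asked again for a while.")
        resp.set_cookie("human", "1", max_age=SIGNATURE_TTL)
        return resp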
What I don't know is whether this approach will cause spiders to devalue the site or even remove it from their index.
However, as you're describing it, you'll be violating Google's first-page-free rule: a visitor should see the first indexed content page before being hit with a login or anything else.
After they view one free page you can block them with a login, but not before.
Another approach against scraping, perhaps, would be to allow humans to view a content page only after it has first been indexed by the popular spiders, and otherwise give them a 302. It should at least give the site admin the lead on the content.
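A rough sketch of that idea, where "indexed" is approximated by "a known spider has fetched the URL at least once", which of course is not the same as actually being in the index (the spider list is again a placeholder):

    from flask import Flask, request, redirect

    app = Flask(__name__)

    KNOWN_SPIDERS = ("googlebot", "bingbot", "slurp")   # placeholder list
    CRAWLED = set()                                      # URLs a spider has fetched

    def is_known_spider():
        ua = request.headers.get("User-Agent", "").lower()
        return any(bot in ua for bot in KNOWN_SPIDERS)

    @app.route("/article/<name>")
    def article(name):
        if is_known_spider():
            CRAWLED.add(name)                 # the spider gets the content first
            return f"<h1>{name}</h1>"
        if name in CRAWLED:
            return f"<h1>{name}</h1>"         # humans see it once a spider has
        return redirect("/", code=302)        # otherwise bounce them for now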