"Over the next couple of months we're going to be looking at putting some new policies in place and looking at some new technical measures that will basically try to spot cloaking automatically," Cutts said. "We're trying to sound the alarm to people and let them know that they should more or less stop cloaking now."
The only reason cloaking works in the first place is that spiders use different IP addresses and user agents than human visitors do.
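To make that concrete, here's a rough sketch of the kind of server-side branch cloakers rely on. The bot names, the IP prefix, and the page strings are all made up for illustration; the point is only that the server decides what to serve from request metadata:

```python
# Illustrative only: a cloaking server just branches on UA / IP.
SPIDER_AGENTS = ("googlebot", "slurp", "msnbot")      # hypothetical UA substrings
SPIDER_IP_PREFIXES = ("66.249.",)                      # example crawler IP range

def select_page(user_agent: str, ip: str) -> str:
    """Serve one page to anything that looks like a spider, another to everyone else."""
    ua = user_agent.lower()
    if any(bot in ua for bot in SPIDER_AGENTS) or ip.startswith(SPIDER_IP_PREFIXES):
        return "keyword-stuffed page for the spider"
    return "normal page for the visitor"
```

Which is exactly why a spider that arrives with a browser UA from a consumer IP would fall straight through to the "normal" branch and see what the humans see.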
So I guess the question is not whether a bot could look like a human, but whether the SEs would do what it takes to truly emulate a surfer.
If I were doing a search engine, I'd be working on this right now. To begin with, I'd have human review of the to-be-banned sites. But after perfecting the cloaking-spider, I'd just let it run.
"...I will say that the Google way is to try to do everything automatically... Any time you have a person in the loop it slows things down..."
So it looks like they will indeed be setting up an automatic "seek and destroy" machine. At this point I wouldn't put it past them to make a 'tricky' 'smart' 'human-like' spider.
Sounds like they hired Dr. Evil! He's going to steal my mojo!
I expect them to randomly select 1% of the pages in their database and check them for cloaking each month. No referrers. No multiple hits on the same site. No walking links. Just a one-page access from a random AOL IP address, within a minute of an indexing-spider hitting the page. The AOL hit would come before the indexing-spider hit.
Rapidly changing dynamic content would get penalized, but maybe that isn't the best thing for a spider to index, anyway. If it can't find the same words that it indexed when it comes in from AOL, then that page probably shouldn't be in the index anyway. They might allow some kind of markup, say a <div class="dynamic"> tag, around the dynamic parts of the page, which wouldn't get indexed at all.
Sorry if this makes you mad. But a SE is not in the business of making it easy to get good results for a specific site. They are in the business of providing relevant results to someone using their engine. If they have to penalize some legit sites to provide relevant results, so be it. There is an easy way around the issue: don't cloak to that search engine.