I'm thinking about trying IP-based cloaking. Is there any other benefit besides hiding the spider content from SEOs?
Using my current method, I realise that smart SEOs could see my spider content if they wanted to, but I don't really mind since it's 100% readable.
The benefit of UA/nocache cloaking is that it is fairly simple to set up and maintain. However, like the previous poster mentioned, most SEOs will just surf with a spoofed spider User Agent and find out what you are up to.
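For anyone unfamiliar with the mechanics being discussed, UA-based cloaking boils down to a string match on the request's User-Agent header. A minimal sketch, assuming hypothetical page names and an illustrative (incomplete) list of crawler signatures:

```python
# Sketch of User-Agent based cloaking. The signatures and page
# filenames below are illustrative assumptions, not a real config.
SPIDER_SIGNATURES = ("googlebot", "slurp", "bingbot")

def choose_page(user_agent):
    """Serve the spider page when the UA string looks like a crawler."""
    ua = (user_agent or "").lower()
    if any(sig in ua for sig in SPIDER_SIGNATURES):
        return "optimized-for-spiders.html"
    return "visitor.html"
```

This is exactly why it's so easy to defeat: anyone who sets their browser's User Agent to a crawler string passes the check.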
If you don't mind smart SEOs reading your optimized code, then why bother cloaking at all?
Because I want certain visitors to see one page and other types of visitors to see another.
I don't have much to hide from SEOs anyway. I don't think there are too many secrets for on-page stuff.
Thanks, you guys confirmed what I thought.
Is there any other benefit besides hiding the spider content from SEOs?
It is my opinion that the search engines routinely change their User Agent to one that we humans use just to check for cloaking (especially if they're doing a manual review of the page). So if you're using UA cloaking, eventually you're going to "get caught" with your hands in the cookie jar. It's just a matter of time.
That may very well be possible. But can't a search engine just as easily change the IP of its bot without SEOs knowing? Also, my guess is that a person doing a manual review from inside the plex has the ability to see what the bot sees (IP based).
Also, for my situation, the cloak is only temporary in most cases.
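The IP-based variant being debated works the same way, except the check is on the connecting address instead of the UA string. A rough sketch, assuming a placeholder crawler range (real engines publish their ranges or can be verified via reverse DNS, so treat this CIDR and the page names as stand-ins):

```python
import ipaddress

# Placeholder crawler IP range; a real deployment would maintain a
# verified, up-to-date list of spider networks.
SPIDER_NETS = [ipaddress.ip_network("66.249.64.0/19")]

def choose_page(remote_addr):
    """Serve the spider page when the connecting IP is a known crawler."""
    addr = ipaddress.ip_address(remote_addr)
    if any(addr in net for net in SPIDER_NETS):
        return "optimized-for-spiders.html"
    return "visitor.html"
```

The upkeep cost mentioned in this thread comes from that list: if the engine quietly crawls from a new range, the new bot IPs fall through to the visitor page.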
Connecting through a proxy means you would see a totally different IP and would have NO IDEA who is actually viewing your content.
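To make the proxy point concrete: an IP-based check only ever sees the address that opened the TCP connection, so a proxied request shows the proxy's IP, not the visitor's. A tiny illustration using WSGI-style request variables (the addresses are documentation examples, not real hosts):

```python
def apparent_ip(environ):
    """What an IP-cloaking check actually sees for a request."""
    # REMOTE_ADDR is the connecting peer. Any forwarded-for header
    # is optional and trivially spoofable, so checks can't trust it.
    return environ["REMOTE_ADDR"]

proxied_request = {
    "REMOTE_ADDR": "203.0.113.10",           # the proxy's address
    "HTTP_X_FORWARDED_FOR": "198.51.100.7",  # the real client, if sent at all
}
```

So an SEO behind a proxy looks like an ordinary visitor from the proxy's address, which is exactly why you'd have no idea who is viewing your content.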
...just do IP cloaking and you'll be fine, but don't be too surprised if you eventually get caught.