
User Agent Cloaking

Why is it harmful?

10:29 pm on Dec 9, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 10, 2001
votes: 0

I am hearing that user agent based cloaking is very bad, but I am not sure of the reason. One thing I can guess is that competitors can easily decode the cloaked page by presenting themselves as the SE bot.

Let's assume the industry concerned is not that competitive, so there will not be people trying to analyse high-ranking sites. What is the risk if user agent based cloaking is used in this case? Can the SEs easily find out that cloaking is being used?
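For readers unfamiliar with the technique being asked about: user agent cloaking simply means branching on the User-Agent request header. A minimal sketch (the bot names and page bodies here are made up for illustration, not a recommendation):

```python
# Minimal illustration of user agent cloaking: serve one page to a spider
# that identifies itself in the User-Agent header, another page to everyone
# else. The crawler tokens and page contents are example values only.

SPIDER_TOKENS = ("Googlebot", "Slurp", "Scooter")  # example crawler names

def page_for(user_agent: str) -> str:
    """Return the HTML to serve for a given User-Agent string."""
    if any(token in user_agent for token in SPIDER_TOKENS):
        # The optimised page that only the spider is meant to see.
        return "<html><body>keyword-rich page for the crawler</body></html>"
    # The page an ordinary visitor sees.
    return "<html><body>regular page for human visitors</body></html>"
```

The weakness described above follows directly from this sketch: anyone can send `User-Agent: Googlebot` from an ordinary HTTP client and retrieve the cloaked page.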

11:12 pm on Dec 9, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 21, 1999
votes: 0

I wouldn't describe user agent cloaking as bad or harmful. It's simply not as secure as IP cloaking, for the reasons you mentioned.

Any SE can penetrate any cloak by simply surfing to a site from an "unknown IP" - in other words, masquerading as a typical surfer - and comparing the site's code to the code its spider retrieved. That's a resource-hungry task, and I haven't heard of any SE doing it routinely for a long time. I believe blatant "bad cloaking" (where a cloaked page ranks high for Disney.com but delivers the kiddies to a site designed for those over 21 years of age) still earns a ban pretty quickly at most SEs, due to user complaints.
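The comparison described above can be sketched as follows. The similarity measure and the 0.9 threshold are illustrative assumptions, not a known search engine implementation:

```python
import difflib

def looks_cloaked(spider_html: str, surfer_html: str,
                  threshold: float = 0.9) -> bool:
    """Flag a page as possibly cloaked when the copy retrieved while
    identifying as a spider differs substantially from the copy retrieved
    as an ordinary surfer. The 0.9 threshold is an arbitrary example value;
    a real check would also have to tolerate ads, dates, and other benign
    dynamic content."""
    ratio = difflib.SequenceMatcher(None, spider_html, surfer_html).ratio()
    return ratio < threshold
```

This also shows why the task is resource-hungry: every suspect page has to be fetched and stored twice, then compared.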

IMHO, if you're going to the effort of cloaking (it is an effort!), use IP cloaking and keep curious competitors out of your code. There is at least one free script available, and probably more.
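IP cloaking replaces the spoofable header check with a lookup of the connecting address against a maintained list of known spider networks, which a competitor cannot fake from their own connection. A sketch, with a single made-up range standing in for the large, regularly updated spider list a real script ships with:

```python
import ipaddress

# A real cloaking script maintains a large, frequently updated list of
# spider networks; this single RFC 5737 documentation range is only a
# placeholder for the example.
SPIDER_NETWORKS = [ipaddress.ip_network("192.0.2.0/24")]

def is_spider(remote_addr: str) -> bool:
    """True if the connecting IP address falls inside a known spider network."""
    addr = ipaddress.ip_address(remote_addr)
    return any(addr in net for net in SPIDER_NETWORKS)
```

The catch, implicit in the thread, is list maintenance: a stale spider list means some crawler visits are served the human page, and the two copies diverge.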

For the past year or so, I've had better results using plain old on-page optimization, and I've discontinued cloaking. The SEs have evolved and become better at identifying sites relevant to a search.

More personal opinion, in a non-competitive category, cloaking just isn't needed unless your site is SE hostile. If it is, and you're in it for the long term, your efforts would be better invested in rebuilding the site.

12:36 am on Dec 10, 2002 (gmt 0)

New User

10+ Year Member

joined:Nov 25, 2002
votes: 0

Dave is quite right: search engines can really find cloaked material in any form. However, if you are determined to get your site found by cloaking instead of basic optimisation, then a common practice is to buy another domain name that you aren't really bothered about. Copy your existing site, place it on the new domain, and cloak that. If that site is banned, it has no effect on your current business or domain.


10:59 pm on Jan 5, 2003 (gmt 0)

New User

10+ Year Member

joined:May 20, 2004
votes: 0

Then couldn't you get both banned for duplicate content?

1:07 pm on Jan 15, 2003 (gmt 0)

Administrator from US 

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 21, 1999
votes: 28

He meant to cloak off one of the sites so that bots couldn't get to it - thus no dupe content.

4:49 am on Jan 29, 2003 (gmt 0)

New User

10+ Year Member

joined:Jan 24, 2003
votes: 0

How about turning meta tags on and off for search engine bots?

I have meta tags dynamically generated from the posts on my forums (the keywords are taken from the words in each post), and to keep pages loading quickly, I only enable it for non-Mozilla browsers, so most people won't get it...
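What the poster describes can be sketched like this. The crude "starts with Mozilla" test and the tag contents come from the post itself; the page title and function name are invented for the example:

```python
def head_for(user_agent: str, keywords: list[str]) -> str:
    """Emit keyword meta tags only for clients whose User-Agent does not
    begin with 'Mozilla', as the poster describes doing on his forums.
    Most human visitors' browsers identify as Mozilla-something, so they
    skip the extra markup; most spiders of the era did not."""
    if user_agent.startswith("Mozilla"):
        return "<head><title>Forum thread</title></head>"
    meta = '<meta name="keywords" content="%s">' % ", ".join(keywords)
    return "<head><title>Forum thread</title>%s</head>" % meta
```

Note this is still user agent cloaking, just limited to the meta tags, so the same detectability caveats from earlier in the thread apply.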

3:29 pm on Jan 31, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 22, 2001
votes: 0

Because meta tags aren't normally viewed by the surfer, I doubt that Google, for example, would ban you if they caught you cloaking them...

However, what good does cloaking your meta tags do anyway? Meta tags don't really do all that much for your rankings...

