Forum Moderators: open

how do you detect (competitive) cloaking?

12:31 am on Jul 8, 2000 (gmt 0)

Senior Member

WebmasterWorld Senior Member rcjordan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 22, 2000
votes: 0

I've seen threads where someone will ask if such-and-such site is cloaked. Someone will come back and answer that it is. How are these guys breaking into the code (if they are)? Are there cloak-sniffing scripts?


4:54 pm on July 8, 2000 (gmt 0)

Inactive Member
Account Expired


You can get clues by looking at the title and description returned by the query on an engine and comparing them to the meta tags you get when you follow the link.
You can also compare the file size the engine reports with the size of the page you actually receive.
If you notice a difference, it is possible that the page was cloaked. It is also possible that the page was switched once the high ranking was achieved, or that it's just good old honest updating that the engine hasn't caught up with yet.
Other giveaways can be many different listings that all point to the same page, or many subdomains that bring you to the same index page. These would not necessarily confirm that a site is cloaking, but they are indicators.
I don't believe that there is any way to break into the code or any cloak sniffing script, maybe Brett or Air can confirm that.
I believe that cloaking is nearly impossible to detect if it is done well, and easy to spot if done poorly. In between those two extremes I think all that you can do is make an educated guess.
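The title/description comparison described above can be automated. A minimal modern sketch in Python (the thread predates this tooling, so this is purely illustrative; the SERP title and description would be copied by hand from the engine's result listing, and the regexes are a rough assumption about how metas are marked up):

```python
import re

def extract_meta(html):
    """Pull the <title> and meta description out of raw HTML.

    Assumes name= comes before content= in the meta tag; a real
    parser would be more robust."""
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    desc = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)',
        html, re.I)
    return (
        title.group(1).strip() if title else "",
        desc.group(1).strip() if desc else "",
    )

def looks_cloaked(serp_title, serp_desc, html):
    """Flag a mismatch between what the engine indexed and the live page.

    As noted above, a mismatch is only a clue -- the page may simply
    have been updated since the last crawl."""
    title, desc = extract_meta(html)
    return title != serp_title or desc != serp_desc
```

A mismatch here is evidence, not proof, exactly as the post says: honest updates produce the same signal.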


6:14 pm on July 8, 2000 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 10, 2000
votes: 0

"I don't believe that there is any way to break into the code or any cloak sniffing script, maybe Brett or Air can confirm that."

For cloaking scripts that have plugged the typical holes this is true, unless IP spoofing is employed, but that is neither simple nor easy to achieve, and carries considerable personal risk. So in all practical terms Q's statement stands.

There are cloak-sniffing scripts, but they only work against scripts that serve cloaked pages based on the User-Agent header. Cloaking scripts that rely on the UA alone are either remnants of days gone by, or are used deliberately by people who value being recognized by every search engine over protecting their code.

As Q (as in 007?) points out, those methods are the most typical ways people use to determine whether a page is cloaked.

To those you can add:
-Use the translator at AV; because its requests come from IPs in AV's range, some scripts give up the cloaked page. (This is the method that was used to uncover Green Flash recently.)

-Ditto for Infoseek/GO

-If you are familiar with the structure of some cloaking scripts, you can simply navigate to the cloaked page with your browser, if the script hasn't been protected against it.

-Same thing with servers that have been set up to return "INDEX OF /" (a listing of the directory structure) whenever a directory without a default page is requested. Again, you then simply navigate to the cloaked pages.

-Use a cloak-sniffing script, and if they are serving by User Agent, voila, the page is revealed. (There are a couple of these around; littleman has one, and I believe Brett has one at searchengineworld.com.)

-Then there is the biggest threat to the security of any system: the people running the site the script is installed on. If they were to put your IP address in their script because they believed you were a spider from one of the search engines, then the next time you visited their site you would be served the search engine pages. The evils of personal spiders carrying a convincing search engine agent footprint ...
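The User-Agent check in the list above is the easiest one to automate: request the same URL twice, once as a browser and once pretending to be a crawler, and compare what comes back. A rough modern sketch in Python (illustrative only; the spider User-Agent string here is an assumption, and as noted above this catches UA-based cloaking only, not IP-based scripts):

```python
import hashlib
import urllib.request

BROWSER_UA = "Mozilla/4.0 (compatible; MSIE 5.0; Windows 98)"
SPIDER_UA = "Scooter/2.0"  # an AltaVista-style crawler string, for illustration

def fetch_as(url, user_agent):
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def fingerprint(body):
    """Hash the body so two fetches can be compared cheaply."""
    return hashlib.md5(body).hexdigest()

def ua_cloak_check(url):
    """True if the page served to a 'spider' UA differs from the page
    served to a 'browser' UA.  IP-based cloaking scripts will serve
    both requests the same page and pass this test."""
    return fingerprint(fetch_as(url, BROWSER_UA)) != fingerprint(fetch_as(url, SPIDER_UA))
```

Note that dynamic pages (rotating ads, timestamps) will differ between any two fetches, so a byte-for-byte hash comparison over-reports; a real check would diff the meaningful content.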


4:42 am on July 10, 2000 (gmt 0)

Inactive Member
Account Expired


I found a couple in Google who made a simple error - they only put the cache-stopping code on the cloaked pages. The pages hadn't been cached, and yet the code wasn't on the pages served to me. :)
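That slip can be checked mechanically: if the engine shows no cache for a page, the live copy ought to carry the cache-stopping tag. A sketch of the logic in Python (an assumption on my part that the tag in question is the robots noarchive meta; the regex is likewise a rough illustration):

```python
import re

def has_noarchive(html):
    """True if the page carries a robots meta that blocks caching.

    Assumes name= comes before content= in the tag."""
    metas = re.findall(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)',
        html, re.I)
    return any("noarchive" in m.lower() for m in metas)

def cache_inconsistency(cached_in_engine, html):
    """A page missing from the engine's cache but lacking noarchive on
    the visible copy is suspicious: the crawler may have been fed a
    different, noarchive-tagged page."""
    return (not cached_in_engine) and (not has_noarchive(html))
```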

You can also take a unique sentence from the page you see and use it as a search term - that page should come up tops, unless of course the spider (and therefore the algo) doesn't see that version of the text at all. :)
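Picking a distinctive phrase to paste into the search box can also be scripted. A quick Python sketch (a hypothetical helper of my own devising, not anything the poster used):

```python
import re

def distinctive_phrase(html, words=8):
    """Strip tags and return a run of words suitable for searching
    as a quoted phrase."""
    text = re.sub(r"<[^>]+>", " ", html)
    tokens = re.findall(r"[A-Za-z']+", text)
    # take a slice from the middle, where shared boilerplate
    # (navigation, copyright lines) is least likely to sit
    mid = len(tokens) // 2
    return " ".join(tokens[mid:mid + words])
```

If a quoted search for the returned phrase doesn't surface the page at all, the spider likely never saw that version of the text.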

There are other ways too, but if I told ya I'd have to kill ya :)


