A Realistic Look at Some of the Risks of Cloaking
volatilegx - msg:677747 - 4:21 pm on Jan 21, 2003 (gmt 0)

This post/article assumes you know what website cloaking is and how it works. If you don't, check out Air's excellent Cloaking Primer here [webmasterworld.com]. There are several risks worth mentioning when talking about cloaking, some of which can get a site into a bit of trouble.

The biggest risk, and the greatest fear of any SEO, is that their website will be banned from a search engine if they are caught cloaking. Several search engines, including Google, have publicly stated that they will ban a site if they catch it cloaking. There are at least two ways to reduce or eliminate this risk.

First, you can make the content shown to the search engines closely resemble the content shown to surfers. With this method, you simply optimize the pages for search engine review, stripping out unnecessary tables, CSS, graphics, etc., while keeping the body copy the same or very similar. This keeps the human and spider pages close enough that a human reviewer who compares them may well conclude that no deceptive cloaking is taking place.
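
As a purely illustrative sketch (not something from the original post), that stripping step might look like the following Python script, which keeps the body copy but drops tables, inline styles, scripts, and images. The file name human.html is just a placeholder.

    # Build a lightweight "spider" version of an existing "human" page.
    # Hypothetical sketch: the file name and tag lists are assumptions.
    from html.parser import HTMLParser

    STRIP_TAGS = {"table", "tr", "td", "style", "script", "img", "link"}
    DROP_CONTENT_TAGS = {"style", "script"}  # discard the text inside these

    class SpiderPageBuilder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.out = []
            self.skip_depth = 0  # >0 while inside a style/script block

        def handle_starttag(self, tag, attrs):
            if tag in DROP_CONTENT_TAGS:
                self.skip_depth += 1
            elif tag not in STRIP_TAGS:
                self.out.append(f"<{tag}>")  # drop attributes (inline CSS, etc.)

        def handle_endtag(self, tag):
            if tag in DROP_CONTENT_TAGS:
                self.skip_depth = max(0, self.skip_depth - 1)
            elif tag not in STRIP_TAGS:
                self.out.append(f"</{tag}>")

        def handle_data(self, data):
            if not self.skip_depth:
                self.out.append(data)  # keep the body copy unchanged

    def build_spider_page(human_html: str) -> str:
        builder = SpiderPageBuilder()
        builder.feed(human_html)
        return "".join(builder.out)

    if __name__ == "__main__":
        with open("human.html") as f:
            print(build_spider_page(f.read()))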

A second solution is to host your cloaked pages on a different server than your main website and use a different domain name. If the cloaked pages are banned or penalized, your main website won't be penalized along with them. However, if a search engine employee reviewing your cloaked pages notices that they all link to your main website, or are all forwarded to it in some way, they may dig further, identify the domain name of your main website, and penalize it as well.

Note that neither solution completely eliminates the risks of being penalized or banned. Both are measures to give a little bit of "insurance" to your cloaking efforts. My recommendation is that you use both techniques.

Another risk many cloakers are concerned with is that their competition will find out they are cloaking and either attempt to decloak their pages to steal their optimized code, or report them to various search engines in an effort to get the site penalized.

You can prevent someone from decloaking your cloaked pages by using the IP address recognition method of cloaking. Also, use the meta noarchive tag on your cloaked pages; in many cases this tag will keep search engines from caching your cloaked content. To completely eliminate the risk of being decloaked via a search engine cache, you would need to exclude any engines that cache (and that includes Google) from your database of spider IP addresses. Most cloakers also exclude the IP addresses of known translators, such as AltaVista's Babel Fish.
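
To make that concrete, here is a rough CGI-style Python sketch of my own (not something from the thread): cloaked content goes only to addresses on the spider list, a noarchive tag is injected into the cloaked version, and any caching engines or translators you worry about simply stay off that list. The IP entries and file names are hypothetical.

    # Hedged sketch of IP-recognition cloaking with a noarchive meta tag.
    # The addresses and file names are placeholders, not real spider IPs.
    import os

    CLOAK_FOR = {"66.249.64.1", "10.0.0.5"}   # hypothetical spider addresses
    NEVER_CLOAK = {"193.45.1.10"}             # e.g. a known translator address

    NOARCHIVE = '<meta name="robots" content="noarchive">'

    def page_for(remote_ip: str) -> str:
        if remote_ip in CLOAK_FOR and remote_ip not in NEVER_CLOAK:
            with open("spider.html") as f:
                html = f.read()
            # Ask the engine not to cache this version, so it can't be
            # decloaked later through a cached copy.
            return html.replace("<head>", "<head>\n" + NOARCHIVE, 1)
        with open("human.html") as f:
            return f.read()

    if __name__ == "__main__":
        ip = os.environ.get("REMOTE_ADDR", "")  # set by the web server under CGI
        print("Content-Type: text/html\r\n\r\n" + page_for(ip), end="")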

If the descriptive listings in the search engine results do not match what is on the page they link to, a reader might become suspicious of cloaking. One way to avoid this is to have identical meta description tags for the human and spider pages. Since the meta description tags aren't usually used in search engine algorithms, you can safely do this without affecting rankings.
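
As a tiny illustration (the template strings and wording are invented, not from the post), both versions of a page can pull their description from a single shared constant, so the snippet shown in the results never contradicts what a visitor sees:

    # One shared description string injected into both templates.
    DESCRIPTION = "Hand-made widgets, shipped worldwide since 1998."
    META = f'<meta name="description" content="{DESCRIPTION}">'

    SPIDER_PAGE = f"<html><head>{META}</head><body>Optimized copy...</body></html>"
    HUMAN_PAGE = f"<html><head>{META}</head><body>Full design, CSS, images...</body></html>"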

One of the most pervasive, but least talked about, risks associated with cloaking is the temptation to over-optimize your cloaked pages. Many cloaked sites are banned/penalized not for cloaking, but because they used spammy SEO techniques in their cloaked spider pages that would get anybody banned.

Many novice cloakers use the same template for each of hundreds of cloaked pages, or they rely on a widely used template like WebPosition Gold's. If you are going to use cloaking software to generate hundreds of pages, make sure the pages aren't too similar to each other: vary the size (in kilobytes) of each page, vary the text used on the pages, and so on. Also, use sentences and paragraphs that actually make sense, not random words thrown together in an attempt to hit a certain keyword density.
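
One way to sanity-check a batch of generated pages is to compare them pairwise before uploading. The sketch below is my own illustration (the 0.85 threshold and the generated/*.html pattern are arbitrary assumptions), using Python's difflib to flag near-duplicates:

    # Flag generated pages that are too similar to one another.
    import glob
    from difflib import SequenceMatcher
    from itertools import combinations

    def too_similar(a: str, b: str, threshold: float = 0.85) -> bool:
        return SequenceMatcher(None, a, b).ratio() >= threshold

    pages = {path: open(path).read() for path in glob.glob("generated/*.html")}
    for (path_a, text_a), (path_b, text_b) in combinations(pages.items(), 2):
        if too_similar(text_a, text_b):
            print(f"{path_a} and {path_b} look like near-duplicates; vary them more")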

Linking patterns are another important matter that is often overlooked. Many cloaking software packages will generate hundreds of template-based cloaked pages with just a few mouse clicks... don't be tempted to use those templates to create a "web" of interlinked, cloaked domains all funneling traffic to one central site. Certain engines, such as Google, can recognize such linking patterns and will penalize accordingly; this strategy can get the cloaked domains penalized (though probably not outright banned). Sparse to moderate linking, however, is fine.

Cloaking is of course a risky venture, but then any business involves a certain level of risk. The trick to success is learning how to manage the risks and gain the maximum benefits available.

 

johnhamman - msg:677748 - 4:37 pm on Jan 21, 2003 (gmt 0)

Programmatically, what are some good steps we need to take to avoid risks (at least the risks that can be avoided through programming)? Please list things that can be accomplished universally with most web languages (i.e. ASP, CGI/Perl, ASP.NET, etc.).

volatilegx - msg:677749 - 4:46 pm on Jan 21, 2003 (gmt 0)

Well, technically speaking, I'd say the main thing would be to keep an up-to-date list of the IP addresses of search engine spiders. There are a number of resources for this, both commercial and non-commercial.
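
A minimal sketch of that check, assuming a plain text file with one IP or CIDR range per line (the file name and sample address are placeholders of mine, not anything specified in the thread):

    # Check a visitor's address against a maintained spider IP/CIDR list.
    import ipaddress

    def load_spider_networks(path: str = "spider_ips.txt"):
        networks = []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#"):
                    # Accepts both single addresses and CIDR ranges.
                    networks.append(ipaddress.ip_network(line, strict=False))
        return networks

    def is_spider(remote_ip: str, networks) -> bool:
        addr = ipaddress.ip_address(remote_ip)
        return any(addr in net for net in networks)

    if __name__ == "__main__":
        nets = load_spider_networks()
        print(is_spider("66.249.64.10", nets))  # hypothetical spider-style address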

Another thing to be concerned about is the method of redirection... There are many ways to serve the different content displayed to humans and search engines.

Some scripts simply display one local file for search engines and another for humans. This method is effective and quick, but not very flexible or user friendly.

Some use templates for the optimized text. This is pretty user friendly, but unless the template system of the program is pretty sophisticated, the pages tend to look an awful lot alike.

Some scripts send a "redirect" code to the browser, telling the browser to look elsewhere for the human page. This method is not suitable at all for cloaking, because the URL shown in the browser's address bar will change.

One effective method is to have the script make an HTTP request to grab the "human" code... This is nice because you can set up a cloaking script on one domain, and when a human comes along to look at a cloaked page, they can be shown a page from a different domain (such as your primary home page). Since it's the script making the HTTP request, processing can be done on the HTML before it is displayed, such as adding a BASE HREF tag to make sure graphics and links work properly. Also, the URL in the browser doesn't change.
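
A hedged sketch of that approach (the target URL is a placeholder and the head-tag replacement is deliberately simplistic): the script fetches the real page server-side, inserts a BASE HREF so relative links and images still resolve, and returns it without the browser's URL ever changing.

    # Fetch the "human" page server-side and patch in a BASE HREF.
    from urllib.request import urlopen

    HUMAN_URL = "http://www.example.com/"  # your primary site (placeholder)

    def fetch_human_page() -> str:
        with urlopen(HUMAN_URL) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        base_tag = f'<base href="{HUMAN_URL}">'
        # Insert BASE HREF right after <head> so relative paths keep working.
        return html.replace("<head>", "<head>\n" + base_tag, 1)

    if __name__ == "__main__":
        # Unlike an HTTP redirect, the visitor's address bar still shows the
        # cloaked page's URL; only the content comes from the other domain.
        print("Content-Type: text/html\r\n\r\n" + fetch_human_page(), end="")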
