
Cloaking Forum

    
SEO Cloaking Primer
A fresh cloaking primer
volatilegx




msg:676105
 7:01 pm on Jun 15, 2004 (gmt 0)

A lot of people seem to like Air's post [webmasterworld.com], but I've always felt it skirted the real purpose of cloaking as it relates to SEOs... cloaking to obtain better search engine rankings.

Air's example uses the language setting of the browser to selectively serve content in different languages. SEOs are more interested in selectively serving content based on the identity of the requestor.

Here's a simple example of how SEO style cloaking works:

You have a cloaked web page on your server. This 'web page' is actually a CGI script which reads the IP address of the requestor. It compares the IP address of the requestor against a list of IP addresses known to belong to search engine spiders. If a match is found, the requestor is identified as a search engine spider. If no match is found, the requestor is identified as a 'human'.

The CGI script can then selectively serve different content based on the identity of the requestor. If the requestor is human, it can serve the home page of the domain, or some other web page that is not highly optimized. If the requestor is a search engine spider, it can serve a highly optimized web page.

The point of all this is that highly optimized web pages are served to search engine spiders while human visitors get a different web page... all from the same URL.
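
For illustration, here's a minimal Perl CGI sketch of that flow. The spider IP prefixes and file names are placeholders, not a real spider list:

#!/usr/bin/perl
# Minimal sketch of the CGI approach described above.
use strict;
use warnings;

my @spider_prefixes = ('66.249.', '65.54.');   # hypothetical spider IP prefixes
my $ip = $ENV{REMOTE_ADDR} || '';

# Treat the requestor as a spider if its IP starts with a known prefix.
my $is_spider = grep { index($ip, $_) == 0 } @spider_prefixes;

print "Content-type: text/html\n\n";
serve_file($is_spider ? 'optimized.html' : 'regular.html');

sub serve_file {
    my ($file) = @_;
    open my $fh, '<', $file or die "Cannot open $file: $!";
    print while <$fh>;
    close $fh;
}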

I have used IP addresses in my example as the method of identification, but the User-Agent string can be used as well (with less reliability).
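
A User-Agent check might look something like this (again just a sketch; the bot name patterns are examples only, and the string is trivially spoofed):

#!/usr/bin/perl
# User-Agent variant: less reliable than IP checking.
use strict;
use warnings;

my $ua = $ENV{HTTP_USER_AGENT} || '';
my $is_spider = ($ua =~ /googlebot|slurp|msnbot|teoma/i) ? 1 : 0;

print "Content-type: text/plain\n\n";
print $is_spider ? "spider\n" : "human\n";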

Now that the mechanics of cloaking have been discussed, I'll get into the risk-management issues. Cloaking is risky. Search engines have publicly stated that they frown on cloaking, and that websites can be punished (with penalties or banning) if caught.

To keep your risks to a minimum, there are several strategies you would be wise to practice: keep your cloaked pages on a separate domain from your primary website, maintain accurate and up-to-date IP lists of the search engine spiders, and keep the cloaked pages out of the search engines' caches.

What is meant by keeping web pages on a separate domain? Well, you already have a primary website... it is the site you are trying to promote in the search engines. The last thing you want is for this site to be penalized. Get another domain name and host it on a different server. Preferably, the whois information for this new domain should be different from your primary domain as well. One way to accomplish this legally is to register the new domain under a D.B.A. and have the address be a post office box. This strategy keeps a buffer between your primary domain and your cloaked domain. If the cloaked domain is ever penalized or banned for having cloaked pages, your primary domain won't be affected.

The second strategy involves the IP addresses your cloaked pages use to identify search engine spiders. You need to maintain a complete and accurate list of IP addresses. Search engine spiders are always crawling the web from new IP addresses and your cloaked pages need to be able to recognize the new IP addresses. Also, sometimes search engines abandon old IP addresses and these need to be removed from the list.
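
One way to keep the list maintainable is to load it from a flat file rather than hard-coding it in the script. A rough sketch, with a hypothetical file name:

#!/usr/bin/perl
# Sketch: keep spider IP prefixes in a flat file (one prefix per line)
# so the list can be updated without touching the script.
use strict;
use warnings;

sub load_prefixes {
    my ($file) = @_;
    open my $fh, '<', $file or die "Cannot open $file: $!";
    chomp(my @prefixes = <$fh>);
    close $fh;
    return grep { length } @prefixes;   # skip blank lines
}

sub is_spider_ip {
    my ($ip, @prefixes) = @_;
    return scalar grep { index($ip, $_) == 0 } @prefixes;
}

my @prefixes = load_prefixes('spider_ips.txt');   # hypothetical file name
print is_spider_ip($ENV{REMOTE_ADDR} || '', @prefixes) ? "spider\n" : "human\n";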

Many search engines offer a cached copy of a web page they have spidered. The cached copy defeats the purpose of cloaking by showing any human requestor the highly optimized version of your cloaked page. Make sure that you use the ROBOTS NOARCHIVE meta tag on the optimized version of your cloaked pages and exclude any search engines that do not follow this convention.
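
A sketch of emitting the tag only on the spider-facing version; the $is_spider flag here stands in for the result of the IP/UA check shown earlier:

#!/usr/bin/perl
# Emit the noarchive directive only on the spider-facing page, so engines
# that honor the tag won't show a cached copy.
use strict;
use warnings;

my $is_spider = 1;   # placeholder for the IP/UA check result

print "Content-type: text/html\n\n";
print "<html><head><title>Widgets</title>\n";
print qq{<meta name="robots" content="noarchive">\n} if $is_spider;
print "</head><body>...</body></html>\n";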

Finally, let's discuss an important question, "Should you do it?" There is an ethical slant to this question as well as a practical one.

Many folks claim cloaking is sneaky and unethical. Their claim is that you are lying to the search engines about the content of your website in an attempt to get better rankings under keyword phrases the website doesn't deserve. The other side of this argument is that cloaked web pages are trying to provide relevant results to search engines... after all, what webmaster wants irrelevant traffic flooding their server?

The other slant is practicality. How much time is it going to take you to set up and maintain a cloaking system? Will the traffic it brings be worth the cost of the software (or the time programming your own script) and the cost of the new domain and hosting?

[edited by: volatilegx at 7:21 pm (utc) on June 15, 2004]

 

bakedjake




msg:676106
 7:07 pm on Jun 15, 2004 (gmt 0)

Very cool.

I'd like to add that, in addition to caches, any search engine service should be accounted for, including shopping search, contextual advertising services, translation services, etc.

I've seen some cloaking sites use a reverse approach lately - serve the cloaked content unless the person can be verified as a real human user. It's an interesting approach if done correctly.
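
A rough sketch of what that reverse logic could look like; all prefixes and patterns are placeholders:

#!/usr/bin/perl
# Reverse approach: default to the optimized page and only serve the human
# version when the visitor looks positively human (not on the spider list
# AND sends a browser-like User-Agent).
use strict;
use warnings;

my $ip = $ENV{REMOTE_ADDR}     || '';
my $ua = $ENV{HTTP_USER_AGENT} || '';

my @spider_prefixes = ('66.249.', '65.54.');
my $on_spider_list  = grep { index($ip, $_) == 0 } @spider_prefixes;
my $browser_like_ua = ($ua =~ /mozilla|opera/i && $ua !~ /googlebot|slurp|msnbot/i);

my $verified_human = (!$on_spider_list && $browser_like_ua);

print "Content-type: text/plain\n\n";
print $verified_human ? "serve human version\n" : "serve optimized version\n";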

Remember tier-2 engines as well. A popular cloaking detection method is to go to Gigablast and check its cache for a cloaked site. Some people, however, opt to serve the user version to these tier-2 engines, which defeats that detection method.

volatilegx




msg:676107
 7:14 pm on Jun 15, 2004 (gmt 0)

That's right, I forgot to mention excluding the IP addresses of translation services such as Alta Vista's Babelfish and Google's translation service.

Another thing I forgot was proxies. Google maintains a WAP proxy, through which users of WAP appliances access web pages from a Google IP address. It's wise to exclude these types of proxies from your list of IP addresses.

caine




msg:676108
 7:22 pm on Jun 15, 2004 (gmt 0)

Personally, I would have thought the use of cloaking to deal with language variations and content visibility for SEO would be an optimum solution - it certainly has my interest.

Herath




msg:676109
 9:03 pm on Jun 16, 2004 (gmt 0)

Reading these threads put a big question mark in my head.
Can a simple strategy of blocking countries based on IP address be misjudged as cloaking?

For example, if a visitor's IP is from Nigeria, Singapore, or Sri Lanka, we serve a 'Sorry, we don't ship to your country' page. I wonder if this could be a potential SEO problem for us in the future.
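
For reference, a rough sketch of that kind of country check, here using the Geo::IP module (MaxMind); the country codes and page content are just examples:

#!/usr/bin/perl
# Serve a "we don't ship here" page based on the visitor's country.
use strict;
use warnings;
use Geo::IP;

my %no_ship = map { $_ => 1 } qw(NG SG LK);   # Nigeria, Singapore, Sri Lanka

my $gi = Geo::IP->new(GEOIP_STANDARD);
my $cc = $gi->country_code_by_addr($ENV{REMOTE_ADDR} || '') || '';

print "Content-type: text/html\n\n";
if ($no_ship{$cc}) {
    print "<p>Sorry, we don't ship to your country.</p>\n";
} else {
    print "<p>Regular store page goes here.</p>\n";
}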

johnser




msg:676110
 12:28 am on Jun 17, 2004 (gmt 0)

>>>ROBOTS NOARCHIVE

Note that GGuy said several months ago that 95% of sites using this tag were cloaking, so should there ever be a wholesale crackdown on "cloaking", if you use this tag, it's a red flag crying out for closer inspection.

If you can figure out techniques to not display your cache without using the "noarchive" tag, that may be better for you long-term.

dwilson




msg:676111
 10:40 am on Jun 17, 2004 (gmt 0)

keep your cloaked pages on a separate domain from your primary website

Sorry to be dense, but how in the world does that help your primary domain? If MyCloakedSite.com ranks super-well for "high-priced widgets", how does that bring people to MyPrimarySite.com? If redirects are the answer, didn't you just lose the disconnection between the two that the separate domain & WhoIs info created?

ppg




msg:676112
 12:00 pm on Jun 17, 2004 (gmt 0)

> If redirects are the answer, didn't you just lose the disconnection between the two that the separate domain & WhoIs info created?

Guess so, but presumably if a SE were to ban sites on that premise, it would be pretty easy for people to knock out competitors' sites by cloaking them from another domain...?

volatilegx




msg:676113
 6:08 pm on Jun 17, 2004 (gmt 0)

Sorry to be dense, but how in the world does that help your primary domain? If MyCloakedSite.com ranks super-well for "high-priced widgets", how does that bring people to MyPrimarySite.com? If redirects are the answer, didn't you just lose the disconnection between the two that the separate domain & WhoIs info created?

Some cloaking software packages use Perl's LWP module to grab a page from your primary site (like the home page) and display it under the URL of the cloaked page. This technically isn't a redirect, and since it all happens server-side, there's no indication to the viewer (the human) at all.
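
A bare-bones sketch of that fetch-and-display idea; the URL is a placeholder:

#!/usr/bin/perl
# When the requestor is human, pull a page from the primary site
# server-side and print it under the cloaked URL -- no redirect involved.
use strict;
use warnings;
use LWP::Simple qw(get);

my $html = get('http://www.example.com/');
defined $html or die "Could not fetch the primary page";

print "Content-type: text/html\n\n";
print $html;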

hotwheel




msg:676114
 11:36 pm on Jun 17, 2004 (gmt 0)

It is always funny how the SEO-versus-engines game follows the same path as criminals versus police.

You may be able to keep a step ahead for a while, but is it worth it? Do you really want to wake up every day and wonder if they are on to you yet?

Every reputable and noteworthy search marketer has the same outlook - if a search engine did not exist, would you still do it?

So if all you folks are seriously considering cloaking, fine. You may get ahead of the crowd temporarily, but long term it is just an unhealthy form of search marketing that will not pay off.

BTW - the automated indexing technology may never catch you, but don't think for a minute that your competitors can't report you and the powers that be can't act upon that information. Site bans are done manually.

Marcia




msg:676115
 1:38 pm on Jun 18, 2004 (gmt 0)

>>an unhealthy form of search marketing that will not pay off.

It sure would pay off as a defensive measure when content is being swiped.

>>don't think for a minute that your competitors can't report you and the powers that be can't act upon that information.

That's automatically assuming that the pages that get reported are actually in violation of the guidelines; they may not be at all. And then the powers that be may just decide to take a closer bird's-eye look at those particular searches and do a little surfing, and there's no guarantee that the competitor's site is clean as a whistle. People violate guidelines every day of the week without even realizing they are.

As a matter of fact, how about taking a little side trip down memory lane to this discussion:

Cloaking Gone Mainstream [webmasterworld.com]

bakedjake




msg:676116
 2:36 pm on Jun 18, 2004 (gmt 0)

if a search engine did not exist, would you still do it?

Maybe not. But that's not reality, is it?

If cars didn't exist, I probably wouldn't learn how to drive, either.

There are some instances where cloaking is absolutely necessary. I'm talking about doing it for sites where the technology (generally CMS) that they've used is so far screwed up that they can't get anything but the homepage in the index. What's the downside to cloaking there? What is there to lose? Nothing.

Besides, as for your competitors that might be cloaking to "hide their code": for the good ones, you'll probably never figure out that they are cloaking.

Every reputable and noteworthy search marketer

Not true. There are a few well-known search engine marketers who swear by cloaking.

LeoXIV




msg:676117
 7:22 pm on Jun 18, 2004 (gmt 0)

This topic is quite interesting, I read this thread and the previous 13 pages!

For my website I implemented IP/country-based display of prices, which I assume goes under the category of legal cloaking. However, I can say that since doing this the rankings have sharply decreased in Google! That is somewhat surprising to me, because Google encourages 'relevant results', so I assumed that somebody in Japan seeing prices in yen is more relevant than seeing them in US$.

Maybe I should make all the pages "Robots NOArchive"?!

BTW, if Google forbids illegal (and/or) legal cloaking, just type 'cloaking' into Google and watch the lengthy array of sponsored listings that comes up, which seems quite paradoxical given their rather eagle-eyed editorial process, which I think we are all familiar with :)

volatilegx




msg:676118
 10:06 pm on Jun 18, 2004 (gmt 0)

LeoXIV,

Maybe I should make all the pages "Robots NOArchive"?!

This wouldn't help in your situation because you aren't really cloaking for SEO reasons. You are cloaking for geographic reasons, which is a different animal.

You might want to examine other possible factors as to why your ranking went down. Your version of cloaking doesn't usually trip any kind of alarm with Google.

LeoXIV




msg:676119
 12:33 am on Jun 19, 2004 (gmt 0)

That's true volatilegx.

I was just thinking how Google might classify these different types of cloaking. Maybe they expect everybody to do geo targeting the way they do! As far as I have seen, in Google's own geo targeting the URLs are always unique; for example, what is www.google.com/page1.htm is, in the UK, www.google.co.uk/page1.htm. Maybe if the same URL produces different content as a result of an IP change, in the eyes of Google that violates the integrity of their DBs and their algorithms?

kovacs




msg:676120
 2:56 pm on Jun 20, 2004 (gmt 0)

if a search engine did not exist, would you still do it?

If search engines didn't exist then I'd be out of a job. :D

mykel79




msg:676121
 8:20 pm on Jun 21, 2004 (gmt 0)

as far as I have seen, in Google's own geo targeting the URLs are always unique

That's not quite true. You can choose "go to google in English" and see google.com (and not google.pl for example) in any country. But you still won't see ads that are targeted towards the United States.

astoller




msg:676122
 3:23 pm on Jun 22, 2004 (gmt 0)

volatilegx

Would all this cloaking technology be unnecessary if a client accepted a basic site with, say, one nice top graphic?

Alternatively, why couldn't we just place a div tag near the top of the page that holds our optimised text and a link to our optimised site map? Then further down the page we keep all the sophisticated graphics etc. That way the engine would hit the div tags first and crawl all our optimised areas of the site before having to bother with all the existing rollovers, JavaScript etc. that add sophistication.

volatilegx




msg:676123
 4:25 pm on Jun 23, 2004 (gmt 0)

astoller,

You can go that route, and the search engines will be more than happy to index you.

However, what happens when a competitor views the source of your page and copies your techniques?

Keeping your competitors from viewing your optimized source is the real purpose behind cloaking as used by SEOs. Cloaking does not give you a better chance of getting a good ranking. It merely hides your optimized page from human eyes (including your competitors).

Cloaking is primarily a defensive technique designed to protect your HTML from your competitors' eyes.
