
Should I take the plunge?



9:40 pm on May 15, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I've always been nervous about cloaking. I can see the benefits (esp for preventing pagejacking, which is starting to become a serious irritation) but can't afford to risk a ban.

There's a lot of FUD surrounding cloaking - what's the real level of risk?


9:46 pm on May 15, 2001 (gmt 0)

WebmasterWorld Senior Member agerhart is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Finally, someone who is on the same wavelength as I am......


9:48 pm on May 15, 2001 (gmt 0)

WebmasterWorld Senior Member agerhart is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Cloaking, and everything that comes with the package, sounds great, but doesn't it have to be monitored very carefully so they don't catch on?

I have been weighing the benefits and risks myself, and I can't seem to come to a decision.

you should check this out though: http://webmasterworld.com/forum24/212.htm [webmasterworld.com]


10:02 pm on May 15, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

> can't seem to come to a decision

I know the feeling... I've got a spare IP and several 'disposable' domains that I'm thinking of trying cloaking out with, but I'm still not totally sold on implementing it with any 'live' sites.

> check this out

Yep, great post, but I'm still a total coward :)


10:07 pm on May 15, 2001 (gmt 0)

WebmasterWorld Senior Member agerhart is a WebmasterWorld Top Contributor of All Time 10+ Year Member

>>>>Yep, great post, but I'm still a total coward :)

Believe me, I know what you are talking about; that is why I am such an advocate of themes......they're safe.

If I were you, I would try it out with that extra IP you have lying around. I mean, what can you lose?

I think we need some advice from the experts around here.


10:27 pm on May 15, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

> I mean, what can you lose?

A precious spare IP ;)

Although that is the reason I got the extra IPs - so I could take a few risks...

But really, is there any serious risk of IP banning unless you push it too far?


10:49 pm on May 15, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I haven't seen any troubles with cloaking at all except the paranoia that goes with it. After a while I guess you get used to the pucker factor and things level off. Keep up on the updates for the IPs and you'll be just fine.
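
The "keep the IPs updated" advice is the heart of it: a cloaking script is just a gatekeeper that decides spider-or-human per request. A minimal sketch of that decision, assuming a hypothetical, regularly refreshed spider list (the IPs, user-agent strings, and file names below are all illustrative placeholders, not real data):

```python
# Minimal sketch of spider detection for a cloaking script.
# KNOWN_SPIDER_IPS stands in for the regularly updated IP database
# mentioned above; the entries here are made-up placeholders.

KNOWN_SPIDER_IPS = {"209.85.0.1", "64.68.82.1"}      # hypothetical entries
SPIDER_UA_HINTS = ("googlebot", "slurp", "scooter")  # crawler name fragments

def is_spider(remote_ip: str, user_agent: str) -> bool:
    """Return True if the request looks like a search-engine crawler."""
    ua = user_agent.lower()
    return remote_ip in KNOWN_SPIDER_IPS or any(h in ua for h in SPIDER_UA_HINTS)

def choose_page(remote_ip: str, user_agent: str) -> str:
    # Serve the optimized copy to spiders, the normal copy to humans.
    return "page.spider.html" if is_spider(remote_ip, user_agent) else "page.html"
```

If the IP list goes stale, a new crawler address falls through to the user-agent check — which is exactly why the updates matter.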

SK, you have a terrific understanding of the way a cloaking script should work, so the obvious little snafus I learned the hard way shouldn't hinder you at all.

  • Always remember to "build" the cloaked page in the actual position it will occupy in the web site. Then go into the folder on your hard drive and move the page into the script's folder, where it will become cloaked. Do this and you won't have tell-tale links in the index.html file that look like '../../Google/index.html' anywhere on your page. If you move it inside your editor, you risk an easily overlooked mistake: the editor updating the links to the files as you move the page. That will get you busted big time.
  • Make the cloaked page look just like the human pages, and even consider the file size if you're in a cut-throat category. Use CSS on the cloaked page to "move things around."
  • If you're already ranking well, you must be able to write good copy. The best thing cloaking will do in this situation is allow you to "tweak" for different engines without affecting the other rankings. In fact, this is the best thing about cloaking.

WebGuerrilla

11:03 pm on May 15, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

The first thing to evaluate before taking the plunge is: what exactly are your reasons for wanting to cloak?

Usually, the people who come to me wanting help with cloaking fall into one of these four types.

1. Never Had It, Never Will

Your site is a programming masterpiece. Full of great content, completely dynamic, and extremely appealing to humans. The only downside is you have never had a single visitor arrive at your site from a search engine. Redesigning the site isn't an option because your conversion rates are quite good.

For sites like these, there are no downsides to cloaking. The threat of getting banned from a database you've never been in in the first place isn't something you should spend much time worrying about.

2. You've Got It, But You Might Lose It

You first built your site in '95. Slowly, over time, you became the most relevant and dominant site in your niche. You get a great deal of traffic, but that old, static site has become far too difficult to maintain. It's time to move to a database-driven system with all kinds of cool content management and ecommerce tools. URLs are going to change, and the overall site navigation will be changing as well.

For situations like this, cloaking is also an ideal solution. If you do nothing, you will lose the search engine traffic you worked long and hard for. If you cloak, you can preserve the old site (and listings) while at the same time seamlessly routing human visitors to your new content.

3. It's Payback Time

Like situation #2, you were the first one to move into your neighborhood. When you got started, there were about 50 competing web pages in your niche. Now there are 50,000. All of a sudden, every time you sit down to do a little competitive research, you find some new johnny-come-lately using chunks of your code. Everything from the wording of your H1 tags to the location of your hyperlinks.

Rather than sit by and continue to watch people rip you off, you decide to cloak so you can defend your work and, at the same time, start serving your competitors some seriously poisoned code.

In this type of situation, the risks of getting kicked out of even the strictest search engine are pretty slim.

4. It's Never Enough

You run a site in a fairly competitive niche. You get a decent amount of traffic from most search engines, but you never seem to show up at the top. You figure if you start cloaking, you'll be better able to pinpoint and exploit the algorithm weaknesses (they all have some) of each search engine. You drool over the incredible R&D advantage cloaking provides, and the thought of being in the top 5 keeps you awake at night.

You are obsessed.

You are also the one who will probably get caught and banned. You'll end up giving up steady, long-term traffic for a quick short-term boost.

If you evaluate your situation and find that you fall somewhere in the first three, you shouldn't have much fear of taking the plunge. If you fall into category #4, understand that there are definite risks.


11:58 pm on May 15, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

For anyone who is apprehensive about cloaking but just can't seem to get rid of the thought that they are missing something, I always recommend you start slow. Real slow, in fact.

So slow that at first you serve the exact same pages to both search engines and regular visitors. Only when you arrive at the point where you are comfortable with this setup do you make your first minor move.

The minor move can be a small change to either the regular visitor page or the search engine page, for one engine only. As you get comfortable with that, gradually modify that same page some more. Continue this way, remaining aware of relevancy and everything you already know about spamming; stay on the good side of both.

Over time, as your comfort level grows, start making changes to pages for the other search engines you are targeting.

Toolman makes a good point in saying that after a while you'll start to relax. When that happens, it's time to pay heed to WebGuerrilla: it is easy to slip into mode #4, especially when cloaking. Almost everyone I know who cloaks has at some point experienced the temptation to overdo it; it just seems too easy. Don't be fooled, it isn't easy, and those who succumb to the temptation to overdo it are responsible for half the cloaking horror stories around. The other half, which are collectively referred to as FUD, are from successful cloakers :}
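
The "start slow" setup above can be sketched as a per-engine page map that begins empty, so every engine gets the identical page until you deliberately add your first variant. The engine names and file names here are hypothetical:

```python
# Sketch of the "start slow" approach: identical pages everywhere by
# default, with per-engine variants introduced one at a time.
# All file names below are illustrative placeholders.

DEFAULT_PAGE = "index.html"

# Start with an empty map (same page for everyone). Add entries only
# as your comfort level grows, one engine at a time.
ENGINE_VARIANTS = {
    # "googlebot": "index.google.html",   # the first minor move, when ready
}

def page_for(engine: str) -> str:
    # Unknown or unconfigured engines always fall back to the human page.
    return ENGINE_VARIANTS.get(engine.lower(), DEFAULT_PAGE)
```

The design point is that the safe state is the default: forgetting to configure an engine serves the ordinary page, not a stale cloaked one.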


3:00 pm on May 16, 2001 (gmt 0)

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

>at first you serve the exact same pages to both
>search engines and regular visitors.

That is very sage advice. I try to cloak only "as much as necessary" and not stray too far from the original page content. I'm more of a cloaker for the sake of protecting the original source and giving the spider a bread-crumb trail to follow. With that setup, it doesn't take much of a difference from the original page to achieve the desired results.

Protecting Source:
You can do very simple things to protect your code:

• Manipulating the meta tags and descriptions. I tend to use this approach. a) I want the competition to see the difference between the search engine listing and the page. That way, they know the page may be poison and it is best not to rip it off. b) It literally poisons the page: if the competition steals it, there are no rankings to be found.

  There is also merit to keeping the tags and titles the same, so that the SEs and the competition see the same thing. It keeps the page from saying, "hey, look at me, I'm cloaked". However, it almost invites page theft.

• Changing headers and footers. Manipulating the cloaked page's stock menus can go quite a way. One of my older approaches was to just strip away the headers and footers and leave the bare content for the search engine to find. However, in this era of "link happy" search engines, those header and footer menu links are worth something now.

• Page headings (h1...h6).
  This is something I still do. Simply dropping the main target keyword out of the h1s on the "visitor page" and including it on the cloaked page is most often enough of a change to poison the stolen page.

• Duplicate content.
  One of the older tricks is simply duplicating the page content for the search engine - literal duplication on the cloaked page. Just pump the same page twice and strip the headers/title down to one set. This used to be the killer trick on Ink. They loved it. It doesn't work so well any more, as I think it triggers the scramble spam filters.

• Hidden comments, alt and title attributes.
  Changing the hidden stuff can go a long way with engines that support the tags.

Bread Crumb Trails
With the rise of a quality crawler in Google, cloaking takes on new life as an aid to spidering. Surprisingly and reluctantly, I must admit I cloak for Google more than any other engine, the point of which is to aid crawling. Many sites just don't crawl well (like this one), and dropping a few links on the page to "less than visible" pages can do wonders for Google's ability to crawl a site. At the risk of personal harm, an example of such:

That helps it find all the content. I feel that is an extremely legitimate usage of cloaking.

So, there are simple things you can do to protect yourself and help the search engines find your content in benign ways.
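
The heading swap described above ("cloak only as much as necessary") amounts to a tiny transform: generate the spider copy from the human page, changing only the h1. A minimal sketch, with entirely made-up page content and keyword:

```python
# Sketch of generating a spider variant that differs from the human page
# only in the h1 heading, per the "as much as necessary" principle.
# The HTML and keyword below are placeholder examples.

def make_spider_variant(html: str, keyword: str) -> str:
    # The target keyword appears only in the spider copy's h1; the human
    # page's h1 stays keyword-free, which poisons any stolen copy.
    return html.replace("<h1>Welcome</h1>", f"<h1>{keyword}</h1>")

human_page = "<html><h1>Welcome</h1><p>Widget reviews.</p></html>"
spider_page = make_spider_variant(human_page, "blue widgets")
```

Deriving the variant from the live page, rather than maintaining two files by hand, also helps with the earlier warnings in this thread about stray relative links sneaking into the cloaked copy.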


3:14 pm on May 16, 2001 (gmt 0)

WebmasterWorld Senior Member agerhart is a WebmasterWorld Top Contributor of All Time 10+ Year Member

I think a thank you is in order for all of you guys who put up these great posts.

I think it may be time to jump on the cloaking ship.....


9:14 pm on May 16, 2001 (gmt 0)

When you cloak for Google, how do you keep your cloaked pages from showing up in Google's cache? If Google's spider crawls the page, it sees the cloaked page and, as a result, puts its content in the cache. I stumbled upon some sites where I could see their cloaked pages in Google's cache, completely different from the actual pages.
Is there any way of avoiding something like that, besides disabling Google from caching your page?


10:12 pm on May 16, 2001 (gmt 0)

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

Google is mostly a link thing for me. Clear graphics or traditional graphics.


10:14 pm on May 16, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

<meta name="robots" content="noarchive">

This will prevent Google from storing the page in the cache. You can also use various JavaScripts that will redirect someone viewing the cached page. Of course, that isn't really foolproof, because anyone who really wanted to see it could just turn off JavaScript in their browser.
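
Since the noarchive tag is only needed by the engine doing the caching, one option is to emit it conditionally, so other engines never see a robots meta tag at all. A sketch of that idea (the user-agent check is a simple substring match, an assumption for illustration):

```python
# Sketch: emit the noarchive meta tag only when the requesting
# user-agent looks like Googlebot, so other engines never see a
# robots meta tag on the page at all.

NOARCHIVE = '<meta name="robots" content="noarchive">'

def head_tags(user_agent: str) -> str:
    # Empty string for everyone else: the tag simply isn't served.
    return NOARCHIVE if "googlebot" in user_agent.lower() else ""
```

Scoping the tag this way sidesteps the kind of engine-specific quirk described in the next post, where the tag's mere presence caused trouble elsewhere.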

Black Knight

11:49 pm on May 16, 2001 (gmt 0)

10+ Year Member

The problem with the robots meta tag is that, for a while, AltaVista took its mere presence as a cue not to spider your page. Their thinking at the time was that only us sneaky types use it much, and the only use of it they [AV] liked was excluding a page.

While that did change again, AV are always a very, er, aggressive player when it comes to trying to make the life of an SEO difficult. :)

I have tended to avoid the robots tag ever since, as it took considerable time to regain good positions on good pages that had dropped simply because of the robots meta tag.

