I have been weighing the benefits and risks myself, and I can't seem to come to a decision.
You should check this out though: http://webmasterworld.com/forum24/212.htm
Believe me, I know what you are talking about; that is why I am such an advocate of themes... they're safe.
If I were you, I would try it out with that extra IP you have lying around. I mean, what can you lose?
I think we need some advice from the experts around here.
SK, you have a terrific understanding of the way a cloaking script should work, so the obvious little snafus I learned the hard way shouldn't hinder you at all.
Usually, the people who come to me wanting help with cloaking fall into one of these four types.
1. Never Had It, Never Will
Your site is a programming masterpiece. Full of great content, completely dynamic, and extremely appealing to humans. The only downside is you have never had a single visitor arrive at your site from a search engine. Redesigning the site isn't an option because your conversion rates are quite good.
For sites like these, there are no downsides to cloaking. The threat of getting banned from a database you've never been in in the first place isn't something you should spend much time worrying about.
2. You've Got It, But You Might Lose It
You first built your site in '95. Slowly, over time, you became the most relevant and dominant site in your niche. You get a great deal of traffic, but that old, static site has become far too difficult to maintain. It's time to move to a database-driven system with all kinds of cool content management and ecommerce tools. URLs are going to change, and the overall site navigation will be changing as well.
For situations like this, cloaking is also an ideal solution. If you do nothing, you will lose the search engine traffic you worked long and hard for. If you cloak, you can preserve the old site (and listings) while seamlessly routing human visitors to your new content.
3. It's Payback Time
Like situation #2, you were the first one to move into your neighborhood. When you got started, there were about 50 competing web pages in your niche. Now there are 50,000. All of a sudden, every time you sit down to do a little competitive research, you find some new johnny-come-lately using chunks of your code: everything from the wording of your H1 tags to the location of your hyperlinks.
Rather than sit by and continue to watch people rip you off, you decide to cloak so you can defend your work and, at the same time, start serving your competitors some seriously poisoned code.
In this type of situation, the risks of getting kicked out of even the most strict search engine are pretty slim.
4. It's Never Enough
You run a site in a fairly competitive niche. You get a decent amount of traffic from most search engines, but you never seem to show up at the top. You figure if you start cloaking, you'll be better able to pinpoint and exploit the algorithm weaknesses (they all have some) of each search engine. You drool over the incredible R&D advantage cloaking provides, and the thought of being in the top 5 keeps you awake at night.
You are obsessed.
You are also the one who will probably get caught and banned. You'll end up giving up steady, long-term traffic for a quick, short-term boost.
If you evaluate your situation and find that you fall somewhere in the first three, you shouldn't have much fear of taking the plunge. If you fall into category #4, understand that there are definite risks.
Take it slow. So slow that at first you serve the exact same pages to both search engines and regular visitors. Only when you arrive at the point where you are comfortable with this setup do you make your first minor move.
The minor move can be a small change to either the regular visitor page or the search engine page, for one engine only. As you get comfortable with that, gradually modify that same page some more. Continue this way, remaining aware of relevancy and everything you already know about spamming, and stay on the good side of both.
Over time, as your comfort level grows, start making changes to pages for the other search engines you are targeting.
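To make that staged approach concrete, here is a minimal sketch in Python, assuming a simple user-agent check and hypothetical file names; at the start every engine is pointed at the very same page the regular visitors get, and you retarget one engine at a time as your comfort level grows:

# staged_cloak.py -- minimal staged-cloaking dispatcher (hypothetical names and paths)
# At first, every spider entry points at the same file human visitors receive.
# When you make your "first minor move", change just one entry to its own copy.

SPIDER_PAGES = {
    "googlebot": "pages/widgets.html",   # later: "pages/widgets-google.html"
    "slurp":     "pages/widgets.html",   # Inktomi
    "scooter":   "pages/widgets.html",   # AltaVista
}
DEFAULT_PAGE = "pages/widgets.html"      # what regular visitors always get

def page_for(user_agent):
    """Return the file to serve for this request."""
    ua = (user_agent or "").lower()
    for spider, path in SPIDER_PAGES.items():
        if spider in ua:
            return path
    return DEFAULT_PAGE

print(page_for("Mozilla/4.0 (compatible; MSIE 5.5; Windows 98)"))      # pages/widgets.html
print(page_for("Googlebot/2.1 (+http://www.googlebot.com/bot.html)"))  # pages/widgets.html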
Toolman makes a good point in saying that after a while you'll start to relax. When that happens, it's time to pay heed to WebGuerilla: it is easy to slip into mode #4, especially when cloaking. Almost everyone I know that cloaks has at some point experienced the temptation to overdo it; it just seems too easy. Don't be fooled, it isn't easy, and those who succumb to the temptation to overdo it are responsible for half the cloaking horror stories around. The other half, which are collectively referred to as FUD, come from successful cloakers :}
That is very sage advice. I try to cloak only "as much as necessary" and not stray too far from the original page content. I'm more of a cloaker for the sake of protecting the original source and giving the spider a bread crumb trail to follow. With that setup, it doesn't take much of a difference from the original page to achieve the desired results.
You can do very simple things to protect your code:
There is also merit to keeping the tags and titles the same, so that the search engines and the competition see the same thing. It keeps the page from saying, "hey, look at me, I'm cloaked". However, it almost invites page theft.
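A rough sketch of that idea in Python, with hypothetical function names; the copy served to the spider borrows its title and meta tags straight from the regular page, so the head of the document looks identical from either side:

# spider_copy.py -- reuse the human page's <title> and <meta> tags (hypothetical names)
import re

def head_of(html):
    """Pull the <title> and any <meta ...> tags out of the existing visitor page."""
    title = re.search(r"<title>.*?</title>", html, re.I | re.S)
    metas = re.findall(r"<meta[^>]*>", html, re.I)
    parts = ([title.group(0)] if title else []) + metas
    return "\n".join(parts)

def spider_page(human_html, spider_body):
    """Wrap the spider-only body in the same head the regular visitors see."""
    return "<html>\n<head>\n%s\n</head>\n<body>\n%s\n</body>\n</html>" % (
        head_of(human_html), spider_body)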
Bread Crumb Trails
With the rise of a quality crawler in Google, cloaking takes on new life as an aid to spidering. Surprisingly and reluctantly, I must admit I cloak for Google more than any other engine. The point of which is to aid in crawling. Many sites just don't crawl well (like this one), and dropping a few links on the page to "less than visible" pages can do wonders for Google's ability to crawl a site. At the risk of personal harm, an example of such is a short list of plain links, served to the spider, pointing at the deeper section pages of the site.
That helps it find all the content. I feel that is an extremely legitimate use of cloaking.
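A small sketch of that kind of bread crumb trail, using the same sort of user-agent check as above and made-up URLs; the spider's copy simply gets a handful of plain links appended so the crawler can reach the deeper pages:

# crawl_aid.py -- append a few plain links for the spider only (hypothetical URLs)
CRAWL_AID_LINKS = [
    "/sitemap.html",
    "/archive/index.html",
    "/products/index.html",
]

def add_crawl_aid_links(html):
    """Drop a short list of plain links just before </body> so deep pages get found."""
    links = "\n".join('<a href="%s">%s</a>' % (u, u) for u in CRAWL_AID_LINKS)
    return html.replace("</body>", links + "\n</body>", 1)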
So, there are simple things you can do to protect yourself and help the search engines find your content in benign ways.
While that did change again, AV has always been a very, er, aggressive player when it comes to trying to make the life of an SEO difficult. :)
I have tended to avoid the robots meta tag ever since, as it took considerable time to regain good positions on good pages that had dropped simply because of that tag.
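For anyone following along, the tag being blamed here is the standard robots meta tag; a one-line reminder of how it sits in a page's head (the exact directives vary):

# The robots meta tag as it appears in a page's <head>; a "noindex" directive is
# what can quietly knock an otherwise well-placed page out of the listings.
ROBOTS_META = '<meta name="robots" content="noindex,follow">'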