|Is cloaking still worth the trouble?|
| 3:23 am on Dec 12, 2001 (gmt 0)|
It's a funny question from me, I know. I have my take on the subject. About two years ago Brett declared cloaking dead; he later changed his mind. So, what about today, when we have GoTo dominating the top results on the bulk of the engines, and Inktomi pushing its XML feeds? We only have one major pure spidering engine left, and they are very anti-cloaking because it ruins their cache.
So what do you all think: is cloaking an SEO relic of the twentieth century?
| 4:11 am on Dec 12, 2001 (gmt 0)|
Oddly enough, from my point of view there are days when I think it has run its course, and then suddenly it doesn't seem like it has.
Even though the major focus is on Google as the last big producer of traffic among spidering engines, as long as AltaVista continues to spider, as well as ArchitextSpider (weird or what?), and of course Inktomi, IMO many out there secretly harbor a feeling that a comeback is in the works. Then there's Fast, forever threatening to become a major traffic producer. This keeps interest up in methods of promoting to spidering engines.
Then there is Europe, which is a different landscape from a spidering-engine point of view: most countries have their own versions of the North American engines as well as their own spidering engines. Since those engines are about where ours were three years ago, a well-optimized cloaked page is almost guaranteed top placement and will stymie the competition.
When all of the other uses for selective page delivery are factored in, it isn't dead by a long shot (although a year ago I thought it would be by now). I guess I'm a little jaded; I've seen lots of things come and go. It's inevitable that cloaking as we know it will go the way of the dodo at some point, but now I'm convinced that it won't be through revolution but through evolution: it will simply transform into something else.
But who knows, tomorrow I may feel different.
| 10:13 am on Dec 12, 2001 (gmt 0)|
>It's a funny question from me
I've been thinking about this a lot lately. There are many types of and uses for cloaking; unfortunately, many people associate cloaking with the "brute force" auto-generated doorway site. IMO this has led to a situation where a lot of valuable content is left untouched by the SEs. There are many sites out there containing valuable information hidden behind an unindexable structure. Cloaking would allow much of this information to be made accessible.
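The selective delivery described above can be sketched in a few lines. This is a minimal sketch, assuming the decision is made purely on the User-Agent header; the agent substrings and file names below are made-up examples, not anything from this thread:

```python
# A sketch of user-agent-based selective delivery: known spiders get a
# flat, indexable HTML snapshot of content that is otherwise buried
# behind an unindexable (session- or form-driven) structure, while
# browsers get the normal dynamic page.
# The agent substrings and file names are hypothetical examples.

SPIDER_AGENTS = ("googlebot", "slurp", "scooter",
                 "architextspider", "fast-webcrawler")

def is_spider(user_agent: str) -> bool:
    """Case-insensitive substring match against known crawler agents."""
    ua = user_agent.lower()
    return any(bot in ua for bot in SPIDER_AGENTS)

def page_for(user_agent: str) -> str:
    # Spiders receive the static snapshot; everyone else gets the app.
    return "snapshot.html" if is_spider(user_agent) else "app.html"
```

Agent-only detection like this is trivially fooled by anyone who spoofs a crawler's user-agent, which is why code-protection setups also check the requesting IP, as discussed below.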
>>it won't be through revolution but through evolution
To rank well nowadays, I believe a site needs to be strong, well linked, and with good directory placements. I feel we are starting to see the beginnings of a hybrid approach, where a good basic site with great content is made accessible via cloaking. No matter what the SEs say about cloaking, one thing is for sure: they are content junkies. It's time to get those hybrid sites built and dare them to deprive their users of that content. IMHO.
| 10:41 am on Dec 12, 2001 (gmt 0)|
Hehe. I've changed my mind again, Little. Like Air, it's still a love/hate relationship.
The one thing that has changed for me in the last year is that I carefully target the purpose of cloaking.
When used for code protection (rare these days), I still do it the old-fashioned IP-and-agent way. We/I do have some killer rankings on some SEs that I would like to protect. The big change for me is that it is now a challenge to see how close to the original page the cloaked page can come and still produce the same results.
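A rough sketch of that IP-and-agent check: require both the user-agent and the requesting IP to look like a known spider before serving the protected page. The network range and agent list below are illustrative assumptions, not a real spider database:

```python
import ipaddress

# Illustrative examples only: a real cloaking setup maintains a
# hand-curated database of spider IP ranges and agent strings.
SPIDER_NETS = [ipaddress.ip_network("216.239.32.0/19")]  # assumed example range
SPIDER_AGENTS = ("googlebot", "slurp", "scooter")

def serve_cloaked(remote_ip: str, user_agent: str) -> bool:
    """True only when BOTH the agent and the IP match a known spider,
    so a competitor spoofing a crawler's agent string from an
    ordinary connection still sees the public page."""
    agent_ok = any(bot in user_agent.lower() for bot in SPIDER_AGENTS)
    ip_ok = any(ipaddress.ip_address(remote_ip) in net
                for net in SPIDER_NETS)
    return agent_ok and ip_ok
```

Requiring both signals is what makes this the "code protection" variant: the agent check alone is spoofable, but the IP check keeps the optimized page away from prying eyes.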
When I use simple cloaking for language, browser, or geo targeting, I just lay it out there with pure agent cloaking. The benefit is that I just throw SEs my "lynx optimized" page now.
I'm thinking very strongly about doing that right here at WebmasterWorld. As you know, I've been working on moving WebmasterWorld to a fairly pure CSS system to get the style tags out of the page. To do that, I feel I must use sniffer routines and feed custom CSS to Moz, NN, Opera, and IE. What would be left over without CSS would basically be a "lynx" page. That in turn works well with SEs that dig style-less, bland HTML code (e.g. Google).
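A sniffer routine along those lines might look something like this. The stylesheet names are placeholders and the user-agent substrings reflect the browsers of that era, so treat it as a sketch rather than WebmasterWorld's actual code:

```python
def stylesheet_for(user_agent: str):
    """Return the stylesheet to serve, or None for a bare
    'lynx-style' page (spiders, Lynx, and anything unrecognized)."""
    ua = user_agent.lower()
    if "opera" in ua:        # Opera often identified itself as MSIE too, so test it first
        return "opera.css"
    if "msie" in ua:
        return "ie.css"
    if "gecko" in ua:        # Mozilla / Netscape 6
        return "moz.css"
    if "mozilla/4" in ua:    # Netscape 4.x (IE already caught above)
        return "nn4.css"
    return None              # everyone else gets style-free HTML
```

Spiders such as Googlebot match none of the branches and fall through to the style-free page, which is exactly the "lynx" page the engines get.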
So cloaking still lives, but its purpose is changing. Raw cloaking with stuffed doorways was something I never did much of myself, and I surely wouldn't advise it to the uninitiated.
But who knows, tomorrow I may feel different.
| 9:47 pm on Dec 16, 2001 (gmt 0)|
How extensively I use cloaking is cyclic. As the SEs revise their algos, what each prefers in basic page design seems to diverge and then, over time, reconverge. Presently I have pages that are prominent on most major engines using the same code, so I think little cloaking is needed right now except to protect the code.
| 10:05 pm on Dec 16, 2001 (gmt 0)|
Being the new kid to cloaking in this thread of veteran cloakers ;), I can say I have a different outlook from when I started cloaking a year ago as well. Cloaking works and is relatively safe unless you're in a category where the self-righteous internet police roam. I used cloaking to get where I am today, but now I don't need it to obtain rankings so much as I use it to mess with the internet-police pinheads.
It's a necessary tool on some sites, and I never regret having taken the cloaking plunge.