|Cloaked Page Content and HTML|
| 10:12 pm on Feb 22, 2002 (gmt 0)|
I'm sitting here next to my computer thinking about cloaking and reading the link Air put in the post Robin Nobles, on Cloaking [webmasterworld.com]. And I think to myself, "You know, there are a lot of threads on WebmasterWorld.com about how to optimize your cloaking script and how best to use it. Which is the better way: user-agent cloaking or IP-based cloaking?"
So all this is going through my head as I realize we don't really talk much about cloaked page content. What do people use most, and for which search engine? Is an <img alt=""> tag good to use in a cloaked page, or is it better to use no pics on your cloaked pages at all? Just straight content? These are the sorts of questions going through my head. What counts as spamming in cloaking? How similar should a cloaked page look to a regular page? Should it look similar at all? In other words, how do fellow cloakers create their cloaked pages?
Why don't we talk a little about how everyone else does it?
I'll start with me.
For Google I mostly use <b> tags and <h1> - <h6> tags in the content of the page. I don't use many <img> tags for Google, though. For AltaVista I try to use a few more <img> tags along with the content. The rest is pretty much optimizing it like I would a regular page.
The reason I want to talk a little about this here is that when I started cloaking I was kind of lost on how to make a cloaked page: what constitutes spamming when making one, and all that sort of stuff. I was lost. I think that talking a little about it will help others who are beginning to cloak make good choices, so they won't be walking down a path with no idea which way to take.
Anyway, I hope I'm making sense. I feel like I've been rambling random stuff by now.
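For anyone new to the two detection styles mentioned at the top of the post, here is a minimal sketch of how a cloaking script decides which page version to serve. The bot names, IP prefixes, and return values are made up for illustration; real scripts keep much larger, regularly updated spider lists.

```python
# Minimal sketch of user-agent vs. IP-based spider detection.
# The spider names and IP prefixes below are hypothetical examples.

KNOWN_SPIDER_AGENTS = ("googlebot", "scooter", "slurp")
KNOWN_SPIDER_IPS = ("216.239.", "209.73.")  # hypothetical address prefixes

def pick_page(user_agent, remote_addr):
    """Return "spider" or "human" depending on who is requesting the page."""
    agent = user_agent.lower()
    # User-agent cloaking: look for known spider names in the UA string.
    if any(bot in agent for bot in KNOWN_SPIDER_AGENTS):
        return "spider"
    # IP-based cloaking: match the visitor's address against known spider IPs.
    if any(remote_addr.startswith(prefix) for prefix in KNOWN_SPIDER_IPS):
        return "spider"
    return "human"

print(pick_page("Googlebot/2.1", "10.0.0.1"))   # spider (UA match)
print(pick_page("Mozilla/4.0", "216.239.1.2"))  # spider (IP match)
print(pick_page("Mozilla/4.0", "10.0.0.1"))     # human
```

IP-based detection is harder to fool (user-agent strings are trivial to fake), which is why the thread's "which is best" question keeps coming up.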
| 6:33 pm on Feb 25, 2002 (gmt 0)|
My feeling is to use the same content shown to human visitors, but to leave out many of the things spiders don't need: for example, taking content out of tables, incorporating framed content directly into the page, and so on.
I also use cloaking software to generate links to both cloaked and non-cloaked pages, to randomize text, etc.
| 3:55 pm on Feb 26, 2002 (gmt 0)|
>> I also use cloaking software to generate links to both cloaked and non-cloaked pages, to randomize text, etc. <<
I'm not fully getting what you mean. Could you explain it a little more?
| 4:07 pm on Feb 26, 2002 (gmt 0)|
I've been considering cloaking for some time now but never had the courage to do it. My work to date has been pretty successful without cloaking, but I do quite fancy having a go at it.
I was considering using SSI with conditional statements to differentiate between user agents and/or IP addresses. Is this a sound method?
The only thing that worries me is that a competitor might complain; then it would be simple for Google to compare their index with what they see in the browser.
Am I right to be wary?
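For reference, an SSI conditional along those lines might look something like the fragment below. The spider names and file paths are invented for illustration, and the exact `expr` syntax depends on your server's mod_include version, so check its documentation before relying on this.

```html
<!--#if expr="$HTTP_USER_AGENT = /Googlebot|Scooter/" -->
  <!--#include virtual="/pages/spider-version.html" -->
<!--#else -->
  <!--#include virtual="/pages/human-version.html" -->
<!--#endif -->
```

An IP-based check would test `$REMOTE_ADDR` the same way, though SSI regex matching against long IP lists gets unwieldy quickly compared to a dedicated script.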
| 5:26 pm on Feb 26, 2002 (gmt 0)|
What I mean is to remove some of the extraneous HTML that doesn't help with the rankings. You might also want to reorganize the page a bit: put your best body copy near the top of the page, and sprinkle a few links within the copy to point to similarly themed pages on your site or others' sites. I don't think there is anything wrong with using <img> tags, especially when you include "alt text" in the <img> tag. I also like to use forms and put keywords in some of the form elements. I wrote a piece of cloaking software back in 1996-1997 which I now sell commercially (won't mention its name here), and it includes features that allow me to insert random lines/paragraphs of text, links, etc., which makes it easy to create cloaked pages based on templates without having them all appear alike.
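The template idea described above can be sketched in a few lines. This is a toy illustration, not the commercial software being referred to; the template, phrases, and links are all invented.

```python
import random

# Toy sketch of template-based page generation: fill slots in a template
# with randomly chosen lines so generated pages don't all look alike.
# All phrases and links below are made-up examples.

OPENERS = [
    "Welcome to our widget resource.",
    "Your source for quality widgets.",
    "Everything you need to know about widgets.",
]
LINKS = [
    '<a href="/widgets.html">widget guide</a>',
    '<a href="/faq.html">widget FAQ</a>',
]

TEMPLATE = "<h1>{title}</h1>\n<p>{opener} See our {link}.</p>"

def build_page(title, rng=random):
    """Build one page variant by picking random filler for each slot."""
    return TEMPLATE.format(
        title=title,
        opener=rng.choice(OPENERS),
        link=rng.choice(LINKS),
    )

print(build_page("Widgets"))
```

With a handful of slots and a few choices per slot, the number of distinct page variants multiplies quickly, which is what keeps template-generated pages from appearing identical.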
Welcome to WMW :)
You have the right idea, both in your method of cloaking and in being a little wary. I don't recommend cloaking for Google. The best way to keep a competitor from complaining is to make your meta description and title the same for both the cloaked page and the "human" page. Following that tip makes it very difficult, if not impossible, to tell a cloaked page from one that is not.
| 7:48 pm on Feb 26, 2002 (gmt 0)|
I pretty much follow a setup like volatilegx mentions: stripped-down HTML, a higher text-to-HTML ratio, and it becomes easier to make that text more prominent. In some cases a site that was already ranking well becomes the cloaked site; that allows the redesign to proceed unencumbered by what the search engines will and won't like.
Even silly little things like using an image for the title and each header can make a page look much more professional to human visitors, and if you can show the search engines text versions of those titles and headers, you get the best of both worlds.