Please keep URL responses to e-mail. Thanks.
Discussion surrounding benefits to non-specific sites is most welcome.
> just about all of the search engine representatives came out and publicly stated that as long as the content of the cloaked page was totally on topic, they had no problem with the technology.
Until they put that in writing in their submission guidelines, and clearly explain what is acceptable and what is not, I'd be careful. If what they are publicly stating is true, then it changes the landscape considerably.
I am new to this stuff, so this is probably a stupid question, but while I understand the definition, I really don't understand why you need software to make such a page.
Is it because cloaking hundreds of pages gets tedious, or is there another reason? I understand cloaking as making a page that is better optimised for a search engine and linking that page to the rest of the site without it being seen by users (no links to this page). Is this wrong?
I'm not an expert cloaker, but I believe a simple definition of cloaking is serving pages based on the user agent, or more accurately the IP address, of the machine requesting the page.
This means that Google, Inktomi, FAST and of course humans are all served different pages, each optimised for the individual requester.
Hope this makes sense.
Incywincy has it right... cloaking means serving different content to different viewers of the page. Search engine cloaking usually means keeping a list of IP addresses for search engine spiders. When one of those spiders visits a cloaked web page, the software recognizes it by its IP address and serves it HTML that is optimized to get a good ranking in the search engine. When a human sees the same cloaked page, they get served entirely different HTML.
The purpose is to protect the optimized HTML from being stolen.
Many cloaking software packages also make it easy to create the optimized HTML. They either use templates to quickly create many optimized pages, or they use randomly generated HTML to automatically generate the pages.
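For anyone new to the mechanism, the IP lookup described above can be sketched in a few lines. This is only an illustration: the addresses and file names are made up, and real cloaking packages rely on large, constantly updated spider IP databases.

```python
# Minimal sketch of IP-based cloaking. The spider IPs and file names
# below are hypothetical; real lists contain thousands of addresses.

SPIDER_IPS = {
    "192.0.2.10",  # pretend this is a Googlebot address
    "192.0.2.20",  # pretend this is an Inktomi (Slurp) address
}

def select_page(client_ip: str) -> str:
    """Serve the optimized page to known spiders, the normal page to humans."""
    if client_ip in SPIDER_IPS:
        return "optimized.html"  # lean, keyword-focused HTML for the engine
    return "index.html"          # the full design human visitors see

print(select_page("192.0.2.10"))   # optimized.html
print(select_page("203.0.113.7"))  # index.html
```

The whole trick is that one lookup: everything else is just deciding which file to hand back.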
To get a good list of cloaking software packages, I recommend visiting ODP at: [dmoz.org...]
lloyd: can you provide a link to where you read that about cloaking?
lloyd, can you give us a list of those search engine representatives (search engines) who publicly stated that cloaking was okay?
...At the most recent Search Engine Strategies Conference (presented by Danny Sullivan), just about all of the search engine representatives came out and publicly stated that as long as the content of the cloaked page was totally on topic, they had no problem with the technology. And the search engine representatives said that they understood that when the technology was used responsibly it could even help them improve the accuracy of their search results...
That quote is probably from a year and a half or so ago; most search engines have since done an about-face on that stance (at least publicly).
> That quote is probably from a year and a half or so ago; most search engines have since done an about-face on that stance (at least publicly).
Thanks Air, I was wondering where I saw that statement before. And the recent SES conference is going on right now in San Jose!
Thank you all for clarifying this to me.
Now I do understand!
|Boy does this throw the gates wide open. |
It's all great until you are reported by your competitors while some SE operator is having a bad day.
Can someone please post brief details of anything the SE reps say about on-topic "ethical" cloaking at that conference? I think everyone would be really interested in developments.
Any regular visitors to this forum going to the Central London meeting in Oct?
I recently attended the Search Engine Strategies 2002 conference in San Jose. My impression from the search engine representatives was that cloaking was OK as long as it did not distort the user experience: if it was used as a tool to let the search engines gather the relevant information, it was fine.
They are not going to put anything in writing, and for the most part they do want to discourage this sort of thing because so many have abused it. But most web sites cloak to some degree anyway, because cloaking is serving up different pages based on user agent. Most of the web sites I deal with serve up different pages for IE, Netscape 4.x, Netscape 6, Lynx [text], or WAP [wireless phones]. When a spider's user agent is encountered, which version am I serving it? Probably the most spider-friendly and search engine-friendly pages.
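That kind of user-agent branching can be sketched roughly like this. The match strings and file names are hypothetical examples; real sites key off many more patterns.

```python
# Sketch of user-agent-based page selection as described above.
# The match strings and file names are illustrative only.

def page_for_agent(user_agent: str) -> str:
    ua = user_agent.lower()
    if "googlebot" in ua or "slurp" in ua:
        return "spider.html"      # the spider-friendly version
    if "lynx" in ua:
        return "text.html"        # text-only browsers
    if "msie" in ua:
        return "ie.html"          # check MSIE before Mozilla/4, since
    if "mozilla/4" in ua:         # IE identifies itself as Mozilla/4.0 too
        return "netscape4.html"
    return "default.html"

print(page_for_agent("Googlebot/2.1"))                       # spider.html
print(page_for_agent("Mozilla/4.0 (compatible; MSIE 6.0)"))  # ie.html
```

Serving a spider the cleanest version is just one more branch in logic most sites already have.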
So what I was told by a programmer from Google was that they look at sites that are grossly deceptive, have stuffed keywords, have inbound links from link farms, etc.; those they will ban or lower the ranking for.
This is just my impression, but I heard it over and over again. The search engines are mainly concerned with the user experience. Are your keyword phrases relevant to your content? Is your content something of value to the user?
Webmaster, Programmer, Analyst
|The purpose is to protect the optimized HTML from being stolen. |
That statement is about 10% true. After interviewing with a well-known company in Ohio for an SEO management position, I knew full well that the company was interested in cloaking for placement's sake ONLY. They couldn't have given a rat's posterior about code; it changed weekly. It was so unethical, and the goal was to be that way, that I felt physically ill when I left. Even the ultra-high salary couldn't take away the bad taste in my mouth, and turning down the "opportunity" to work with them was relatively easy. Cloaking abuse is nothing more than "bait and switch" tactics that deceive consumers.
>> The purpose is to protect the optimized HTML from being stolen.
> That statement is about 10% true.
No, I'd say it's at least 80-90% true. Spam can be, and is, done every day without cloaking. Cloaking does not in any way improve rankings over any other entrance-page technique.
Anybody who uses cloaking to obtain a ranking for an irrelevant keyword is not only playing a dangerous game with the search engines, risking the banning of their whole domain, but is also a fool. Searchers aren't going to stay at a site they aren't looking for.
This is my first post here and I find this discussion very interesting. I am currently using a cloaker for hiding my affiliate links from would-be affiliate link thieves. I think you all know what I mean. I purchased my "cloaker" software because I was getting very tired of and aggravated by others capturing my affiliate code, putting it into their browser, and replacing it with their own affiliate I.D. I actually caught someone doing this that I had personally sent a link to. Not only did he get a discount on the product, but I lost $25.00 in profit from him doing so. Once I got the cloaker, I noticed a big difference in my conversion rates when placing ads and such. I really don't know how any affiliate can go without it. Thanks for reading my post.
Welcome to WebmasterWorld!
You know, your post is almost deserving of an entirely new thread. What a gem!
I never would have thought of that!
It's a good...no, a great tip for affiliates.
I saw the product that Majorhitz refers to and unfortunately can't remember the name now. I do remember that it looked very promising. Note that it doesn't cloak in the way we've been discussing in this thread... I wouldn't call what it does cloaking... rather it converts a URL into escaped characters that are difficult to decode unless you know what you're doing.
Thanks Scott and Volatile for responding to my post.
I know I can't leave the URL for the software here, since I advertise it (it would be considered spam), but if you'd like to see it in action, you could always email me.
[edited by: Air at 2:24 am (utc) on Aug. 23, 2002]
I'm very sorry...I just realized that I was in the wrong thread. This forum is brand new to me (just joined today). Again, my apologies.
|That quote is probably from a year and a half or so ago, most search engines have done an about-face on that stance (at least publicly) |
For the record, Air is correct. Those comments came from SES Aug. 2000. (If there is an article attributing those comments to the most recent SES in San Jose, I'd appreciate it if someone would send it to me via Sticky Mail).
Most engines have learned that it isn't a good idea to make those kinds of statements in public, although later, in the bar, most SE engineers will admit that that is how they treat cloaking.
Cloaked content does get reviewed on a case-by-case basis, and many times it is allowed to stay in the database.
Yes, the quote isn't from the most recent conference. However, as WebGuerrilla points out, it's still pretty much correct.
Google flat out does not allow cloaking. Do it, and if they catch you, you could find yourself thrown out. That doesn't mean they'll catch you, but they have the most unambiguous policy of any of the crawlers. Having said this, there are the odd rumors that they've allowed the occasional special case for a few selected web sites. I've not yet been able to confirm this.
The rest -- Inktomi, FAST, Ask Jeeves and AltaVista -- do allow cloaking. All of them. This is because they all have "trusted feed" programs where XML content can be fed directly into the search engine.
In an XML feed, you don't send the crawler a page. Instead, you send it something you can envision like a spreadsheet:
URL Title Meta Description Body
[page1.html...] Women's Shoes Best women's shoes on t... We have
[page2.html...] Men's Shoes Best men's shoes on the... We have
[page3.html...] Sport Shoes Best sport shoes on the... We have
Each page has different elements sent to the search engine that get stored in its index. Title tags, description tags, body copy, whatever is allowed, you can set up. The "real" page can be completely different -- indeed, it will be, because a real page would have a ton of surrounding HTML code.
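A single record in such a feed might look roughly like this. This is a hypothetical layout; each engine's trusted-feed schema had its own element names and requirements, and the actual content shown is invented for illustration.

```xml
<!-- Hypothetical trusted-feed record; actual schemas varied by engine -->
<record>
  <url>http://example.com/page1.html</url>
  <title>Women's Shoes</title>
  <meta-description>Best women's shoes on the web</meta-description>
  <body>We have a wide selection of women's shoes...</body>
</record>
```

The engine indexes what the feed says, not what the live page serves, which is why this counts as sanctioned cloaking.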
Having said this, all the other crawlers will universally say that the content of the XML feed should be representative of the page the user will see. The page that says it is about women's shoes better be about women's shoes, and so on.
So XML cloaking is tolerated. After that, I think it is generally true that cloaking done by those in CPC-based paid inclusion programs gets the next level of acceptance, then those in flat-fee paid inclusion programs a little less. Finally, those who are cloaking without paying any fee are most likely to find themselves pulled because the cloaking was seen as a "spam" violation.
It's important to stress that the more the cloaked page deviates from the "real" page, the more likely you are to face problems -- and while paying gets you more tolerance, it is not a get out of jail free card to do whatever you want.
Overall, cloaking does not automatically equal spam (except at Google). Paid programs allow forms of search engine-approved cloaking. If in doubt, explain why you need to employ cloaking and get clarification from the company selling you inclusion. If you have a good reason, you'll probably be able to do it.
We were talking about this thread in email. Quite a few of us found it really strange. Three of us who have cloaked in the past or do cloak now all said the same thing: we've had problems cloaking with every search engine but Google.
Not to reiterate Lloyd's post at the beginning of the thread, but I also am looking into cloaking. I'm actually looking to talk a client out of it; I don't think it fits his needs, nor is his competition all that rigorous. Could anyone point me to a site that is currently cloaked in a competitive category?
Not sure if that came out right. I want to express to the client that those who are cloaking have a competitive reason to do so, which is not comparable to his situation.
That's strange that Google was the only search engine that didn't mind cloaking, especially if you were doing paid inclusion with the other search engines.
Well cloaked sites suit all parties - users, search engines and clients. However, cloaking well is NOT the easy way to do SEO. There are a lot simpler, safer and easier ways to get a site up there.
>If anyone is willing to send me some examples I would be most grateful...
You aren't GoogleGuy by any chance? ;)
<Could anyone point me to a site that is currently cloaked in a competitive category?>
Google comes to mind. I guess that the search engine business is quite competitive. :)
Cloaking discussions usually turn into a debate on ethics, which is absurd in my opinion. Google doesn't like optimization, let alone cloaking, and the whole myth that cloaking is cheating is equally absurd. But regardless of the views the engines espouse, cloaking pays.
I reported one of my own sites that utilized cloaking; the entire reason for the site's existence was to see how search engines respond to reports of sites that use cloaking. The results? Not one engine dropped the site from its index, including Google. The cloaked pages were relevant to the SERPs. (The IP addresses of all visitors were scanned closely for two months, but the data wasn't conclusive; we couldn't determine what IPs the cloak checkers came in on, though we did get all the standard IPs from the engines.)
Sullivan already mentioned that the difference between the human-viewable page and the SE page is a determining factor, but I don't think enough emphasis was placed on it. Keeping the pages as similar as possible is extremely important.
If you decide to cloak a site and your disposable income is dependent on the site, have a back-up plan. Cloaked sites do get dropped (not often) and if it's the first time you've cloaked a site you will make mistakes. If the arena is highly competitive your competitors WILL spot your mistakes.
After you finish cloaking and optimizing your site, the dividends are more than just higher positions in the SERPs. People aren't snagging your affiliate codes, your code is protected (somewhat, as the pages should remain similar), you are free from any design constraints (including those scripts that are necessary for e-com sites), and the code presented to the engines is much cleaner (think ease of spidering and keyword proximity).
The cons, of course, are worrying about keeping that IP list updated and being dropped from the index or relegated to the cellar in the SERPs. Of course, being relegated to the cellar can happen to sites that DON'T cloak and are considered "safe" by their authors. Didn't the PR0 campaign do just that?
The worry about cloaked sites somehow deceiving the engines is groundless. I have yet to have an e-com site ask me to serve up irrelevant pages, and why on earth should they? They want traffic that converts to sales, not traffic for the sake of traffic. They certainly aren't going to hand me 5k to cloak their site for widgets if they sell wodgets. They want the surfers who are interested in buying the very best wodgets available.
I love it when I get cloaking requests. It means I don't have to worry about 10k worth of scripting code on the page for those dynamic menus and I can make sure that the content is the focus and served up as perfect spider food rather than worrying about what plate to serve it up on.
Does cloaking pay? Sure does. Are there risks? You bet. That's why there's cost/risk benefit analysis.
Great post, ghost. I couldn't agree more. The argument that kills me the most comes from those who are categorically against any form of cloaking whatsoever. Do you mind sticky-mailing me the domain you reported? Excellent experiment.