

Cloaking Forum

Cloaking Gone Mainstream
Languages, Agents, Doc format, - cloaking is everywhere.
Brett_Tabke

WebmasterWorld Administrator; brett_tabke is a WebmasterWorld Top Contributor of All Time and 10+ Year Member



 
Msg#: 411 posted 10:26 pm on Jan 30, 2003 (gmt 0)

Cloaking has taken on so many new meanings and styles over the last few years that we are left scratching our heads as to what cloaking really means. Getting two people to agree on a definition is nearly impossible with all the agent, language, geo-targeting, and device-specific page generation going on today. It is so prevalent that it is difficult to find a site in the Alexa top 500 that isn't cloaking in one form or another.

This all came up for us in mid-December when, right at the height of the Christmas ecommerce season, a friend's European site was banned or penalized by a search engine. After numerous inquiries, it was learned that the surprising reason for it was cloaking. I was asked to take a look at the site and figure out where there was a problem. The site owner didn't even know what cloaking was, let alone practice it.

I determined that his off-the-shelf server-language and browser content delivery program was classifying search engines as text browsers and delivering them a text version of the page. In its default configuration, this five-figure, enterprise-level package classified anything that wasn't IE, Opera, or Netscape as a text browser and generated a printer-friendly version of the page that was pure text.

We explained to the SE just what the situation was; they agreed and lifted the penalty after we said we'd figure out a way around the agent part. Unfortunately, the package had the agent support all but compiled in, and the vendor was surprised when we informed them about it. What was even better was looking around some Fortune 500 companies that run the same software and finding three entire sites that were in effect "cloaked" - they didn't have a clue.

In the end we solved the problem with another piece of software that would swap the user agent that the site delivery program was seeing. Yep, we installed cloaking software.
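The workaround described above - rewriting the user agent before the delivery package sees it - can be sketched like this. The spider list, the fallback browser string, and the function name are all illustrative assumptions, not the actual product's behavior:

```python
# Hypothetical sketch: rewrite a known spider's user agent to a mainstream
# browser string, so a downstream delivery package serves the full HTML
# page instead of its printer-friendly text version.

KNOWN_SPIDERS = ("googlebot", "slurp", "msnbot", "teoma")

# A UA string the delivery package already treats as a graphical browser.
FALLBACK_BROWSER_UA = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"

def normalize_agent(user_agent: str) -> str:
    """Return a browser UA for spiders; pass everything else through."""
    ua = user_agent.lower()
    if any(spider in ua for spider in KNOWN_SPIDERS):
        return FALLBACK_BROWSER_UA
    return user_agent
```

The irony Brett points out is visible in the code: this is textbook agent detection, installed precisely to stop accidental cloaking.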

So let's have a little rundown of the current state of cloaking in its various forms:

We've talked a bit about agent-based cloaking recently [webmasterworld.com].

Search Engines Endorse Web Services Cloaking:

Cloaking has become just varying shades of gray. We now have instances where search engines themselves endorse cloaking (XML feeds) and in some instances are giving out cloaking software to deliver those XML feeds.

That has resulted in pages intended (cloaked) for one search engine being indexed by another search engine. There have been occasions where this endorsed content has been banned or penalized by another search engine.

Geographic IP Delivery:

Language translation has been a hot topic for the last year. Most major sites now deliver content geographically in one form or another. Hardly a month goes by without someone screaming that they can't get to Google.com because they are transparently redirected to a local TLD. You will also find those same search engines custom-tailoring results for that IP address (eg: personalized content generation). You can see the effect yourself by changing your language preferences on a few search engines that offer the feature.
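Geo-IP delivery of this sort boils down to a lookup from visitor IP to a local host. A minimal sketch, with a toy table standing in for a real geolocation database (the CIDR ranges and hostnames below are illustrative, not actual assignments):

```python
# Toy geo-IP redirect: map a visitor's IP to a local-TLD host.
# A production system would consult a full IP-to-country database.
import ipaddress

GEO_TABLE = {
    "81.0.0.0/8": "google.de",    # illustrative range, not a real mapping
    "212.0.0.0/8": "google.fr",   # illustrative range, not a real mapping
}

DEFAULT_HOST = "google.com"

def local_host(ip: str) -> str:
    """Pick the host a visitor gets transparently redirected to."""
    addr = ipaddress.ip_address(ip)
    for cidr, host in GEO_TABLE.items():
        if addr in ipaddress.ip_network(cidr):
            return host
    return DEFAULT_HOST
```

The design point is that the decision happens before any content is served, which is why the redirect feels invisible to the visitor.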

One Browser Web:

The recent history of major browsers is summed up in IE 4-6 and Netscape 3-7. There is also a large second tier of browsers: Opera, Lynx, iCab, and Mozilla.

All of these agents support different levels of code and standards. They also have inherent bugs related to page display. If you are a web designer, you could get a degree in the various browser differences of CSS and HTML alone.

Just when we are starting to think in terms of a one-browser web, along comes a whole new set of browsers to consider: set-top boxes, cell phones, PDAs, and other mobile devices. These all have varying degrees of support for XML, XHTML, CSS2/3, and the web services protocol blizzard (eg: .NET, SOAP... et al).

We've not even begun to talk about IE7, which is rumored to be in final internal beta testing. Then there is Apple's new browser and the growing horde of Mozilla-based clones. When you put it in those terms, our one-browser web seems like a distant dream.

Delivering different content to these devices is a mission-critical operation on many sites. Generating content for mobile devices is a vastly different proposition than delivering an XML feed to a search engine, or a CSS-tricked-out page for a leading-edge browser.

Given that the combination of visitor IP and user agent can run into hundreds of possibilities, the only valid response is agent and IP cloaking.

Off the shelf cloaking goes mainstream.

There are many off-the-shelf packages available today that include cloaking in one form or another. The perplexing part is that many sites are cloaked in ways you wouldn't even know about. There are several major forum packages that cloak in some form or another.

I was at a forum this morning that was agent cloaking, and another that was language cloaking. In both cases, the webmasters don't even know that it is taking place - let alone have the tech knowledge to correct it.

Welcome to 2003 - Modern Era Of Search Engines.

This isn't the web of 98-99, where people would routinely get whisked away to some irrelevant site unrelated to their query. Today's search engines are vastly improved, with most engine algorithms putting Q&A tests on every page they include. Those range from directory inclusion requirements, inbound link count and quality, to contextual sensitivity and even a page's reputation.

In this modern era where search engines now routinely talk about their latest off-the-page criteria algo advancements, it's clear that traditional SE cloaking has little effect. It comes down to one simple fact: those that complain about SE cloaking are simply overlooking how search engines work. The search engines have done a fantastic job at cleaning up their results programmatically and by hand.

The most fascinating thing about this new mainstream cloaking is the situation where a site simply classifies a search engine as a graphically challenged browser. In that case, cloaking becomes mostly an agent-based proposition. The trouble starts when you add language delivery to the equation, or even delivering specific content as part of a search engine program.

All of these wide-ranging factors combine to produce on the order of 10^4 page-generation possibilities. In that situation, it almost becomes a necessity to put spiders into the all-text browser category and deliver the same page to the SEs that you deliver to cell phones or the Lynx browser.

Thus, we've come full circle on search engine cloaking. We no longer cloak to deliver custom content to search engines, we now cloak for the search engines to keep them from getting at our cloaked content for visitors.

<edit> cleaned up some typos and syntax errors</edit>

[edited by: Brett_Tabke at 6:15 am (utc) on Feb. 3, 2003]

 

korkus2000

Msg#: 411 posted 2:06 pm on Feb 3, 2003 (gmt 0)

"Cloaking is getting a search engine to record content for a URL that is different than what a searcher will ultimately see, often intentionally."

IMO this is a better definition, and not so exclusionary of new forms of cloaking that we may not even have seen yet.

flowilu

Msg#: 411 posted 5:52 pm on Feb 3, 2003 (gmt 0)

I know in my original naivety here at WW I had the impression there was some kind of greater good I had to adhere to because of the collective nature of the web and its analogy to the collective of all people. Forget that. This is commerce, plain and simple.

Absolutely right.

GoogleGuy

Msg#: 411 posted 6:19 pm on Feb 3, 2003 (gmt 0)

I'll chime in with my personal definition, which I think is pretty simple. I think of cloaking as showing different content to a search engine than you would show to a typical user. Google considers that deceptive, and may remove pages/sites that cloak. This has been our policy forever, and is in our webmaster guidelines. Plenty of people have said similar or identical things lately. Alan Perkins wrote a good piece recently about this, for example. I would treat Googlebot just like a regular user.

Brett_Tabke

Msg#: 411 posted 6:32 pm on Feb 3, 2003 (gmt 0)

>This is commerce plain and simple.

Well, we still want to encourage deep, quality sites and quality content, but we also need to defend the rights of site owners to develop and deliver that content as they see fit, without regard for the search engines. After all, most of us never submitted our sites to these search engines - they came to us after our content. Every website wants to protect its intellectual property from cachers and other page jackers. Cloaking is a legitimate means to do that.

The central idea here is that cloaking has gone mainstream in its many forms, and the argument about it is nothing but an intellectual exercise and historical footnote. It's time for everyone to join us in the present.

Cloaking as we knew it is a matter for the SEs to deal with internally, and none of our business. If you have a problem with their search results, then send their webmaster a letter of complaint and wait for a reply.

startup

Msg#: 411 posted 6:33 pm on Feb 3, 2003 (gmt 0)

GG, "I would treat Googlebot just like a regular user". Many don't and haven't been affected. The playing field is not even in this matter. Many very large sites employ techniques that should get them removed, but we know they are still listed.
Trying to define what is acceptable for just one SE (Google) is almost impossible. Google does allow cloaking, even if they have stated otherwise.

Namaste

Msg#: 411 posted 6:41 pm on Feb 3, 2003 (gmt 0)

I agree with GG & Alan. However GG, the point that korkus made about not keeping up with tech issues is valid, and sites do run the danger of wrong-siding Google. This brings us back to the point of you being ultra-careful in your banning/blacklisting of sites.

The search engine model is fundamentally flawed. They have based their entire service on the repackaging of other people's property. Search engines are nothing more than powerful value-added resellers.

I think this is a very unfair thing to say about search engines. There are two kinds of value adders: those that genuinely bring value and those that don't. A search engine like Google is adding tremendous value; your example of a stock broker is the other kind.

I think it is time that search engines increase their contact with webmasters/programmers and publish even more detailed guidelines. This should assist many in better working with, instead of against, search engines.

nowhere

Msg#: 411 posted 6:53 pm on Feb 3, 2003 (gmt 0)

From Webopedia:

Cloaking - Also known as stealth, a technique used by some Web sites to deliver one page to a search engine for indexing while serving an entirely different page to everyone else.

On a different note (or is it the same?); I'm glad to finally know that UA delivery isn't cloaking. I can now sleep well at night knowing that redirecting surfers whose browser name isn't Googlebot to Commission Junction is perfectly acceptable to the Google search engine. I guess my laziness of not wanting to bother with all those I.P. number thingy's paid off.

webwhiz

Msg#: 411 posted 6:53 pm on Feb 3, 2003 (gmt 0)

GoogleGuy said:

I think of cloaking as showing different content to a search engine than you would show to a typical user.

And Danny Sullivan said this:

"Cloaking is getting a search engine to record content for a URL that is different than what a searcher will ultimately see, often intentionally."

Is this the same thing?

dannysullivan

Msg#: 411 posted 7:43 pm on Feb 3, 2003 (gmt 0)

Is this the same thing?

Well, I used more words :)

But yes, I agree perfectly with what GG posted. For instance, if you used CSS to show a search engine content that would typically be hidden from a typical user, using a typical browser, that would be a form of cloaking. As such, Google would reserve the right to come down on you for it, if spotted.

WebGuerrilla

Msg#: 411 posted 7:45 pm on Feb 3, 2003 (gmt 0)


This has been our policy forever, and is in our webmaster guidelines.

I think the word "forever" is a bit of a stretch. During Google's first three years of existence, many public statements regarding cloaking/spam were made that were 180 degrees opposite of the current policies.

makemetop



 
Msg#: 411 posted 7:53 pm on Feb 3, 2003 (gmt 0)

>I think of cloaking as showing different content to a search engine than you would show to a typical user.

As in all tortuous debates, even this statement is not really clear (with all due respect, GG).

Let us take it as read that serving deceptive content ('blue widgets' for a search for 'red widgets') via any means is disapproved of by SEs. However, for the sake of debate: if I had a web page with an animated Flash movie displaying the words 'I'm showing everyone the same content' on it, and then checked to see if a browser did not have Flash enabled and displayed a standard HTML page saying the same thing - is that giving Google the same or different content? How about if I did it by detecting the UA - is it different content then? Or how about by IP - is this different content?

If it is, how is it different? Both the user and the SE see textually exactly the same thing. Or is different content different HTML - even though the user sees exactly the same thing as the SE sees in image terms?

In the Alan version, I believe the flash detection and UA examples would not be cloaking; in Danny's, I understand all three would be cloaking (although they may or may not be approved). Any chance of a Google definitive answer on this example? I fully understand if not - but surely this does indicate how all interpretations of the 'rules' and definitions are currently indistinct and fuzzy, including my own - and can mean different things to different people!
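The Flash example above can be sketched server-side. How the has_flash flag gets set (UA sniffing, a cookie written by client-side detection, an IP rule) is precisely where the definitions diverge; everything in this sketch is illustrative:

```python
# Same visible message, two different HTML renderings. Whether choosing
# between them counts as "different content" is the question under debate.

MESSAGE = "I'm showing everyone the same content"

def render(has_flash: bool) -> str:
    """Serve a Flash embed or a plain-HTML fallback for the same message."""
    if has_flash:
        # Different markup, same visible content as the fallback below.
        return ('<object data="banner.swf">'
                f'<param name="alt" value="{MESSAGE}"></object>')
    return f"<p>{MESSAGE}</p>"
```

The two branches return different HTML but identical visible text, which is exactly the ambiguity the post is probing: "different content" by markup, "same content" by what the user reads.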

Key_Master

Msg#: 411 posted 8:10 pm on Feb 3, 2003 (gmt 0)

Very good points makemetop.

I'd like to point out that since it's "ok" to feed different browsers different pages, then by that definition Googlebot could be considered a browser and treated no differently, as long as the content of the pages remained untouched.

Let's assume we have a section reserved for Netscape, a section for MSIE, and a text only section. Each section dishes out the same content, just in a format the browser can understand. Do we feed Googlebot the text version or what? Factors other than content (e.g. code bloat) do have an impact on the ranking of a page.

Ron_Carnell

Msg#: 411 posted 8:49 pm on Feb 3, 2003 (gmt 0)

Brett talked about all the mainstream applications in his first post here. Those are all wonderful things that can help Webmasters make their sites be the best they can be. But by calling them cloaking, it unfortunately muddies the waters and makes people then think that cloaking in general is okay.

The Mississippi is, by and large, a muddy river. While I applaud Jill's intentions, I don't think we can make the Mississippi less muddy by redefining the meaning of silt. She wants to be able to give her non-technical readers a non-technical answer, and I think that's really cool. I just don't think it's possible. I believe it was Einstein who said, "Everything should be made as simple as possible, but not simpler." When a sixth grade student asks WHY nothing can travel faster than the speed of light, there is no non-technical answer available except "Einstein says so." It's clear from Jill's posts, and of course she has readily admitted, that she doesn't really comprehend the technical side of cloaking. So she wants to be able to say, "Alan says so."

Alan, on the other hand, does understand the technology involved. I get the impression his goal is to resolve the all-too-obvious conflict between what he knows the technology can do and what the search engines have publicly said about cloaking. The tourism bureau has told him how beautiful the Mississippi is, so he insists it must not have any silt clouding its waters. That doesn't seem to be going over too well, probably because most of us realize we still can't see the bottom of the river.

Danny, I think, is much closer to reality, and even tries to give Jill the simple answer she seeks. However, in adding the adjectives "approved" and "unapproved" to the SE's own definition of cloaking, he again muddies the waters by putting words into their collective mouths. And, he doesn't really address Alan's problem with the conflict between what the technology is and what the search engines publicly say about it.

But, then, when did we start believing what the search engines say?

The little road that runs through my local village has a posted speed limit of 25 miles per hour. If the sheriff catches me going faster than that, I'm in danger of getting a ticket. If I ask why I'm getting a ticket the first answer will probably be "because it's the law," but if I dig more deeply, someone will eventually explain that speeds in excess of the posted limit invariably result in far more traffic collisions. From my point of view, it would be much more fair if the law gave tickets only for collisions (hey, I can handle my car just fine at 40 mph!), but I'm realistic enough to realize that isn't likely to happen. The posted speed limit certainly doesn't prevent all car accidents, but they've learned through experience that it prevents more than if they didn't have the speeding laws.

The search engines, especially Google, are passing speeding laws in hopes of preventing car accidents. They know, because it's pretty obvious, that cloaking can be used to fool their algorithms. I'm sure they also know that cloaking can be used for legitimate reasons, too. But, by and large, their algorithms can't really tell the difference between the two and, frankly, they have little real reason to care. It is much easier for them to simply say "No cloaking," just as it is easier (and, yea, probably safer) for my local constable to say "No speeding." The search engines know, just as the police do, that enforcement is going to be selective. It has to be, because neither the search engines nor the police have the resources to do anything else. Both, it seems, hope that their laws will at least curb "most" of the problems.

The title of Alan's article is "Why Cloaking Is Always A Bad Idea." Change the morally-shadowed "Bad" to "Potentially Dangerous" and I see no reason why Cloaking needs to be redefined. As Danny said, unless you have specific permission to cloak, you run the very real danger of getting a speeding ticket. Even if you have a good reason for cloaking, be it Flash or geo-targeting, you are STILL breaking the SE law against cloaking and running a risk. Does that make it a bad law? Probably. But do the search engines really care? They want to eliminate the spammy cloaking and I suspect they're perfectly willing to try frightening off ALL of the cloaking in the process. Getting rid of the spam helps them. Getting rid of the non-spam doesn't hurt them. What would you do?

If you get a speeding ticket unjustly, as all too often happens, you can fight it in court with a fairly good chance of success. It's not necessarily fun, but at least you have a set court date to argue your case. If you are penalized for cloaking unjustly, as Brett described in his opening post, you can also appeal. But there is no court date and getting someone to listen to you is, at best, questionable. Does that mean you should never run the risk of cloaking? Like speeding, I suspect it depends on just how badly you feel the need to go fast. And on how much the speeding ticket will end up costing you. We each run a cost-benefit analysis every time we get behind the wheel.

Alan, I think you can redefine all the words you want and will still fail at erasing the inconsistencies between what the search engines say and what they actually do. The only ones who can erase those inconsistencies are the search engines, and I doubt any of us will hold our breath while we wait for that to happen.

Jill, I think you need to tell your readers what cloaking generally means (both Danny's and GG's definitions are good) and then explain that the only SIMPLE answer is to avoid it entirely. Tell them there are much more complex answers but, like the sixth grader who wants to know WHY nothing can travel faster than the speed of light, they'll need to brush up on some high school algebra before they can understand them. Then, point them towards this thread. :)

homegirl

Msg#: 411 posted 9:23 pm on Feb 3, 2003 (gmt 0)

WebWhiz asked "Is this the same thing?"
Does:
I think of cloaking as showing different content to a search engine than you would show to a typical user.
[Googleguy definition]
=
Cloaking is getting a search engine to record content for a URL that is different than what a searcher will ultimately see, often intentionally.[Danny Sullivan definition]

Danny's already responded to this. However, I think Danny's definition is a more useful one (and it's finer-grained). Picture the scenario in which different content *is* shown to the search engines (as a response to their particular limitations). The content isn't spam, isn't keyword-stuffed, but provides the search engine with info as to what the page or even the overall site is about. The search engine processes this but properly directs the searcher to the relevant page (be it flash-intensive, in frames, whatever). This scenario is cloaking, according to Google. However, it's not meant to mislead either the search engines or their searchers. As opposed to Scenario #2, in which technology is employed to serve the search engines a different page with content that is irrelevant or poorly relevant to the site's purpose and theme as a whole.

That's an important distinction. I can see the search engines getting upset over returning irrelevant results to their users. [Results due to their search engines recording different content.] The key here is determining what might be considered deceptive and why. In the first scenario I describe above, the fact that different content is served to the search engines- is not in itself intended to be deceptive. And if the search engines direct their users to the appropriate page for their search, the users do not themselves feel deceived.

While I wish we could treat search engines as typical users, the truth is they're not because they are as limited (currently) as their algorithms. A human could look at a flash site and see if it's a good match for their search; most search engines cannot unless there's a corresponding html site. And so on.

Practicality: don't cloak (according to Google's definition) unless you're willing to hazard the risks.

But for now, I prefer Danny Sullivan's original definition since it explicitly states what's problematic for the search engines: what's indexed is different than what's seen by the searchers (and this gulf can make the search engines look bad- and cause their users to go elsewhere for better matches/higher relevancy). Also, it leaves flexible what may be penalized in the future as search algorithms improve (in terms of the variety of content to spider and index).

GoogleGuy

Msg#: 411 posted 10:21 pm on Feb 3, 2003 (gmt 0)

WebGuerrilla, thanks for correcting that. I believe Google released its first webmaster section in June 2001, and I think we've been pretty consistent at least since then.

I like the way that Danny said it too. I'm honestly just trying to make sure Google's perspective is heard on this issue. If someone decides to cloak, they should be informed about how search engines view cloaking.

startup

Msg#: 411 posted 10:49 pm on Feb 3, 2003 (gmt 0)

"I'm honestly just trying to make sure Google's perspective is heard on this issue."
GG, this is very misleading.
"Google considers that deceptive, and may remove pages/sites that cloak."
"May remove" is where the confusion starts. Google does not treat all cases of cloaking equally.

volatilegx

Msg#: 411 posted 3:25 pm on Feb 4, 2003 (gmt 0)

I don't think that's misleading at all... "may remove" means Google has the option to remove at their discretion. Sounds perfectly reasonable to me.

Alan Perkins

Msg#: 411 posted 8:40 pm on Feb 4, 2003 (gmt 0)

Sorry to have been away from this for a couple of days.
Cloaking is getting a search engine to record content for a URL that is different than what a searcher will ultimately see, often intentionally.[Danny Sullivan definition]

Hmmm, what IS the intent? :)

Here's a summary of my position:

XML feeds: Not cloaking
Geo-IP Delivery: Not cloaking
Delivering different content to different browsers: Not cloaking
Bait and switch: Not cloaking
Meta tags: Not cloaking
Noframes content: Not cloaking
ALT text: Not cloaking

All of the above methods may be used to deliver unwanted content to search engines - none of them are cloaking. All the above techniques have uses outside of search engines. Cloaking does not.

Cloaking is

a) detecting a search engine
b) delivering content which the SE "thinks" is designed to be seen by humans
c) actively preventing humans seeing the same content
d) all in an attempt to rank higher.

The "in an attempt to rank higher" is important, because certain things (such as prevention of spider abuse) are necessary to protect your site. Banning any visitor by IP, including a spider, would not be cloaking.
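Alan's four-part test can be restated as a predicate in which all four conditions must hold; the flag names are hypothetical, chosen only to mirror points a) through d):

```python
# Alan's definition as a conjunction: it is cloaking only when every one
# of the four conditions is true. Dropping any condition changes the verdict.

def is_cloaking(detects_search_engine: bool,
                serves_se_only_content: bool,
                blocks_humans_from_it: bool,
                aims_to_rank_higher: bool) -> bool:
    return (detects_search_engine
            and serves_se_only_content
            and blocks_humans_from_it
            and aims_to_rank_higher)
```

Under this reading, banning a spider by IP to stop abuse fails the fourth condition (no attempt to rank higher), so it is not cloaking, consistent with the paragraph above.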

Here's why search engines don't like cloaking:

1) Search engines use two broad sets of criteria to rank pages: on-the-page criteria and off-the-page criteria
2) On-the-page criteria are based on what the SE believes some or all users will see, i.e. the content on the page
3) Cloaking involves delivering content to a SE that NO user will see, when the SE thinks that users will see that content (i.e. thinks it is on-the-page that the user sees)
4) The SE therefore assigns a ranking partly based on on-the-page criteria which are meaningless, since no user will see the content.

Here's why XML feeds are not cloaking:

1) The search engine knows what they are
2) The search engine *can* therefore weight them accordingly in the ranking algo (whether they *do* weight them accordingly is another matter)
3) i.e. XML feeds are known to be off-the-page and can therefore be treated accordingly - like any meta data.

keeper

Msg#: 411 posted 11:18 pm on Feb 4, 2003 (gmt 0)

This has been an awesome thread! Permit me to chime in with my opinion :)

Personally Danny's description resonates with me the most.

Alan, trying to divorce the word "cloaking" from the actual technology is going to confuse a lot of people. Indeed, your assertion that cloaking = intent to deceive a search engine, but bait and switch is OK, is really confusing me.

If I serve a text-only page to text-only browsers, let's say "to customise the user experience" (something that was previously mentioned, and that you outline as #3 in your "ok list" above: "Delivering different content to different browsers: Not cloaking")
(note: for argument's sake, it is exactly the same text, only formatted differently)
I am not cloaking, because my intent is to customise the user experience.
But a large by-product of my activity is fantastic search engine rankings, because my text pages have keywords in the title, description, and heading tags.

On the other hand, I could perform the exact same technical setup of my content, but my intent is to gain search engine rankings, and the by-product is a customised user experience.

How then does Googlebot (an automated program) know whether I am wearing a halo or horns?

Your position not only muddles my perception of what cloaking is - a technology - but it is also rendered practically useless in real-world applications.

The only way to determine intent is hand editing by search engine staff, and in the example I described, it would be difficult to determine even if they did have the resources to carry it out... who really knows my intent but me, anyway? And if you carry out the cloaking correctly, you are not decreasing the value of Google or any other engine as an Internet navigation tool (in fact, in most cases you actually increase it by providing content that a spider can "read"), your client can be happy that they receive increased traffic, and the surfer is happy to have found what they are looking for.

All without having to know my intent.

Alan Perkins

Msg#: 411 posted 11:32 pm on Feb 4, 2003 (gmt 0)

Alan, trying to divorce the word "cloaking" from the actual technology is going to confuse a lot of people

The word IS divorced from the technology, and that is why people who don't think it is are confused! ;)

The technology we are talking about is content delivery. My understanding of the word "cloaking" is using content-delivery technology to deliver unsolicited unique content to search engines. Most search engines don't like this, for the reasons I mentioned above. Google and Inktomi clearly state on their sites "Don't cloak". Their understanding of the word seems to match mine, at least from what they publish. :)

Nobody who isn't an SEO (or at least hasn't been exposed to SEO influences) even uses the word "cloaking". Try searching www.w3.org for the word, for example. The word is only used within our little industry on the Web. Ask Tim Berners-Lee what he thinks of cloaking, and he's likely to say "What's that? Klingon technology?" But he understands all about content delivery...

jeremy goodrich

Msg#: 411 posted 11:49 pm on Feb 4, 2003 (gmt 0)

The word IS divorced from the technology

To who? All the people that cloak that I know of don't get confused when they are talking about cloaking. We've been talking about it for years, here - the forum is even called 'cloaking', mate - and as far as I know, nobody has gotten confused by the word.

:) Why else would you be posting in a cloaking forum talking about cloaking if the word wasn't clear?

Alan Perkins

Msg#: 411 posted 11:52 pm on Feb 4, 2003 (gmt 0)

Indeed, your assertion that cloaking = intent to decieve a search engine, but bait and switch is ok, is really confusing me

I never said it was OK. Just that it wasn't cloaking. I'm trying to keep rights and wrongs out of this. Just trying to define a word so that we all mean the same thing when we say it. :)

Why else would you be posting in a cloaking forum talking about cloaking if the word wasn't clear?

Because it isn't! When, say, Inktomi answers the question "Is cloaking permitted?" with the one word "No", do they mean that XML feeds aren't permitted, Geo-IP isn't permitted, etc etc etc. This was the kind of contradiction that Jill was getting questions about.

korkus2000

WebmasterWorld Senior Member korkus2000 us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 411 posted 12:04 am on Feb 5, 2003 (gmt 0)

Alan, how many sites have you had banned or penalized by Google or any other engine and the reason given was cloaking? It seems that people in this thread have been penalized and banned for cloaking by using "techniques" that you don't consider cloaking. I think if we are trying to come up with a basic definition that beginners can follow without confusion, then we need to add these "techniques" that you don't consider cloaking. If it is detrimental to their web site's health, then don't mislead them.

For most of us here, we are not beginners. I know several people in this thread who have a PR0 and/or banned sites to prove they are graduates from the school of hard knocks. With most issues, there is no easy answer.

I just get the feeling you have a hidden agenda here. Why are you and Jill taking on the world about a definition? We can either agree that the search engines make the definition by how they enforce their rules (notice I didn't say by what they publish on their sites), or go by the common usage among the people who use the word. If we are doing something else, we are trying to call a stick a rock again. ;)

Alan Perkins

10+ Year Member



 
Msg#: 411 posted 12:14 am on Feb 5, 2003 (gmt 0)

Alan, how many sites have you had banned or penalized by Google or any other engine and the reason given was cloaking?

None. But then I don't cloak.

I just get the feeling you have a hidden agenda here. Why are you and Jill taking on the world about a definition?

Hidden agenda? No. Jill's subscribers asked Jill questions, Jill asked me to write an article, I wrote one, people disagreed with it, I defended it, that's it. :)

NFFC

WebmasterWorld Senior Member nffc us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 411 posted 12:30 am on Feb 5, 2003 (gmt 0)

>Nobody who isn't an SEO (or at least hasn't been exposed to SEO influences) even uses the word "cloaking".

lol, it's a very common word and used a great deal in the "real world". I think the attempt to nail it down to a very narrow definition is wrong; let people define their own meaning based on their own experiences and aims.

In the study of ideas, it is necessary to remember that insistence on hard-headed clarity issues from sentimental feeling, as it were a mist, cloaking the perplexities of fact. Insistence on clarity at all costs is based on sheer superstition as to the mode in which human intelligence functions. Our reasonings grasp at straws for premises and float on gossamers for deductions.

Quote Source [www-gap.dcs.st-and.ac.uk]

Alan Perkins

10+ Year Member



 
Msg#: 411 posted 12:44 am on Feb 5, 2003 (gmt 0)

My fault. :) I said:

The word is only used within our little industry on the Web

I meant: Within the Web technology industry as a whole, the word "cloaking" only has meaning to SEOs (or people influenced by SEOs).

I know the word has a meaning outside of the Web. :)

I think the attempt to nail it down to a very narrow definition is wrong, let people define their own meaning based on their own experiences and aims.

I think the meanings of words are important. Why should "cloaking" be any different?

startup

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 411 posted 2:22 am on Feb 5, 2003 (gmt 0)

Alan, you can't answer Jill's question. The SEs will not let us know what is acceptable and what isn't.
If Ink says no, you can't use Geo-IP, and Google says you can, how are you going to use the technology (whatever you want to call it) on a site listed in both Ink and Google? If you do use the "technology" on the same site in the above example, what is that called?
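(For context on the Geo-IP delivery being discussed: the server maps the visitor's IP address to a region and serves localized content. The sketch below is hypothetical; the network-to-country table uses documentation address ranges and is made up for illustration, where a real site would query a geolocation database. As the post above notes, the mechanism is the same for every engine even when their policies differ.)

```python
# Hypothetical Geo-IP delivery sketch using only the standard library.
import ipaddress

# Toy mapping of network ranges to country codes (illustrative only).
GEO_RANGES = {
    ipaddress.ip_network("192.0.2.0/24"): "US",
    ipaddress.ip_network("198.51.100.0/24"): "DE",
}

def country_for(ip: str) -> str:
    """Look up the country for an IP, falling back to 'default'."""
    addr = ipaddress.ip_address(ip)
    for net, country in GEO_RANGES.items():
        if addr in net:
            return country
    return "default"

def page_for(ip: str) -> str:
    """Pick a localized page path based on the visitor's country."""
    return {"US": "/en-us/", "DE": "/de-de/"}.get(country_for(ip), "/en/")

print(page_for("198.51.100.7"))  # /de-de/
print(page_for("203.0.113.1"))   # /en/
```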

webwhiz

10+ Year Member



 
Msg#: 411 posted 4:32 am on Feb 5, 2003 (gmt 0)

It sounds like now you guys are mixing together the definition of cloaking with search engine "spam."

In your broad definition of cloaking, i.e. saying that all content delivery is cloaking, or that trusted feed is cloaking, those things do not automatically equal spam.

It seems that you're saying that anything that might get you penalized must be cloaking, however. And that's just not true, of course.

Cloaking (using Alan's definition) equals spam.

Spam, however, doesn't always equal cloaking. Yes, you can use those other technologies to spam, but that doesn't mean you're cloaking.

You can spam with trusted feed, still, it doesn't mean it's cloaking. You can spam with UA delivery, but it doesn't mean that it's always cloaking.

Alan was not trying to define spam (this time around...he does have another paper on that, if anyone wants to go some more rounds!). He was simply trying to define cloaking.

With his definition, cloaking always is a bad idea (as the title of the article said). But other types of what some of you call "cloaking" aren't always a bad idea, and they're not necessarily spam either (although they can be used to spam).

BUT...it's correct that if the search engines are penalizing for things they shouldn't be penalizing for, such as UA delivery, then we definitely need to warn people about that too. Still doesn't mean that we *need* to say that it's cloaking. Just that the engines may be confused and *think* that you're cloaking...so use your own best judgement if you use those technologies.

Hopefully, the engines will catch up soon and know real cloaking when they see it. If they would agree to Alan's definition, it would make things a lot easier for everyone!

4eyes

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 411 posted 1:42 pm on Feb 5, 2003 (gmt 0)

Google considers that deceptive, and may remove pages/sites that cloak.

GG - the key word here is 'may'.

Can you clarify why you chose to use this word instead of 'will'?

If you really intend to act against cloaking, you should perhaps rephrase this as:

Google considers that deceptive, and if we discover that a site is cloaking, we WILL remove pages/sites that cloak.

...but I guess you thought long and hard before deciding not to use that wording.

I am interested in your reasoning here.

korkus2000

WebmasterWorld Senior Member korkus2000 us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 411 posted 1:52 pm on Feb 5, 2003 (gmt 0)

saying that all content delivery is cloaking

No one has said that.

Cloaking is getting a search engine to record content for a URL that is different from what a searcher will ultimately see, often intentionally.

I believe that is more along the lines of what is being said.

It sounds like now you guys are mixing together the definition of cloaking with search engine "spam."

This is what you and Alan are doing. People are trying to argue they are separate entities. You say cloaking = spam. Most in this thread are trying to make a distinction between the two.

Just that the engines may be confused and *think* that you're cloaking

Now you say the engines are confused by not accepting your definition. Very odd argument.

Hopefully, the engines will catch up soon and know real cloaking when they see it. If they would agree to Alan's definition, it would make things a lot easier for everyone!

Hopefully Webster's will start calling a stick a rock! I think agreeing to Alan's definition will hurt beginners more than clear anything up.

You guys are on a marketing campaign to change a word's meaning. In Atlanta we had the same thing during the Olympics. People wanted to make Atlanta have a different meaning to people. It didn't change a thing; it just wasted taxpayers' dollars. If you want your own definition, create a new word. Just don't get upset as, through usage, the definition changes.

Alan Perkins

10+ Year Member



 
Msg#: 411 posted 2:35 pm on Feb 5, 2003 (gmt 0)

You say cloaking=spam. Most in this thread are trying to make a distinction between the two.

I was trying to keep spam out of this. But, since Google and Inktomi both categorically state that cloaking is not wanted, it seems they at least think it is spam. The question is, do they think of cloaking in the same terms as Brett's first post in this thread? I don't think they do. Inktomi, for example, even states:

If the purpose is to serve alternate pages to different human users, based on locality, browser, machine type etc., we do not consider that cloaking.

I think agreeing to Alans definition will hurt beginners more than clear anything up.

This isn't my definition. We are talking about what SEOs have traditionally meant when using the word "cloaking", and what search engines mean now when they use the word.

This forum was once called "Cloaking - Stealth", and was in "The SEO World" section of WebmasterWorld. When did that stop?

You guys are on a marketing campaign to change a words meaning.

No, not me. :) Look in the WebmasterWorld Cloaking Forum Charter [webmasterworld.com]:

Cloaking is delivering one version of a page to a search engine spider while serving users another version.

That's a little bit different to some of the things that have been described in this thread as cloaking.

Now look in the WebmasterWorld Cloaking Forum Library [webmasterworld.com] and you'll find a thread from Oct 2, 2000 called Comment on cloaking from a SE [webmasterworld.com]. I participate in that thread, and I'm using exactly the same definition of cloaking then as I'm using now, 2.5 years later. Note how everyone else in that thread is talking about IP delivery and UA delivery to search engines. That's what cloaking was and still is, as far as I am concerned. And I think, from their web sites, that Google and Inktomi at least agree.
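(To round out the mechanics: IP delivery, the other technique mentioned in the post above, branches on the visitor's IP address rather than the easily faked User-Agent header, checking it against a maintained list of known spider addresses. The sketch below is hypothetical; the addresses are documentation-range examples, not real spider IPs.)

```python
# Hypothetical sketch of IP delivery. Serving different content on the
# spider branch is what the forum charter defines as cloaking:
# "delivering one version of a page to a search engine spider while
# serving users another version."

KNOWN_SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}  # illustrative only

def is_spider(remote_ip: str) -> bool:
    """True if the requesting IP is on the maintained spider list."""
    return remote_ip in KNOWN_SPIDER_IPS

def serve(remote_ip: str) -> str:
    """Pick which page version to deliver for this request."""
    return "spider-page" if is_spider(remote_ip) else "visitor-page"

print(serve("192.0.2.10"))   # spider-page
print(serve("203.0.113.5"))  # visitor-page
```

The design difference from UA delivery is simply trust: an IP list is harder for an ordinary visitor to spoof than a User-Agent string, which is why IP delivery was the traditional "stealth" technique this forum was named for.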

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
Home ¦ Free Tools ¦ Terms of Service ¦ Privacy Policy ¦ Report Problem ¦ About ¦ Library ¦ Newsletter
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved