This 129 message thread spans 5 pages.
|Cloaking Gone Mainstream|
Languages, agents, doc formats - cloaking is everywhere.
Cloaking has taken on so many new meanings and styles over the last few years that we are left scratching our heads as to what cloaking really means. Getting two people to agree on a definition is nearly impossible with all the agent, language, geo targeting, and device specific page generation going on today. It is so prevalent that it is difficult to find a site in the Alexa top 500 that isn't cloaking in one form or another.
This all came up for us in mid December when, right at the height of the Christmas ecommerce season, a friend's European site was banned or penalized by a search engine. After numerous inquiries, it was learned that the surprising reason for it was cloaking. I got asked to take a look at the site and figure out where there was a problem. The site owner didn't even know what cloaking was, let alone practice it.
I determined that his off-the-shelf server language and browser content delivery program was classifying search engines as a text browser and delivering them a text version of the page. In its default configuration, this 5 figure enterprise level package classified anything that wasn't IE, Opera, or Netscape as a text browser and generated a printer friendly version of the page that was pure text.
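A minimal sketch of the default logic described above (the function and template names are hypothetical, not taken from any real package): the software only recognizes a short list of "graphical" browsers, so every search engine spider silently falls through to the text version.

```python
# Hypothetical reconstruction of the package's default behavior.
# Only three browser families are recognized as "graphical";
# everything else - including Googlebot - gets the text version.

GRAPHICAL_BROWSERS = ("MSIE", "Opera", "Netscape", "Mozilla/4")

def choose_template(user_agent: str) -> str:
    if any(token in user_agent for token in GRAPHICAL_BROWSERS):
        return "full_page.html"
    # Lynx, cell phones, AND search engine spiders all land here
    return "printer_friendly_text.html"
```

A spider announcing itself as `Googlebot/2.1` matches none of the tokens, so the engine indexes a page no browser user ever sees - agent cloaking the site owner never asked for.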
We explained to the SE just what the situation was; they agreed and took off the penalty after we said we'd figure out a way around the agent part. Unfortunately, the package had the agent support all but compiled in, and the vendors were surprised when we informed them about it. What was even better was looking around some Fortune 500 companies that run the same software and finding three entire sites that were in effect "cloaked" - they didn't have a clue.
In the end we solved the problem with another piece of software that would exchange the agent that the site delivery program was seeing. Yep, we installed cloaking software.
So let's have a little rundown of the current state of cloaking in its various forms:
We've talked a bit about agent based cloaking recently [webmasterworld.com].
Search Engines Endorse Web Services Cloaking:
Cloaking has become just varying shades of gray. We now have instances where search engines themselves endorse cloaking (xml feeds) and in some instances are giving out cloaking software to deliver those xml feeds.
That has resulted in pages intended (cloaked) for one search engine being indexed by another search engine. There have been occasions where this endorsed content has been banned or penalized by another search engine.
Geographic IP Delivery:
Language translations have been a hot topic for the last year. Most major sites now deliver content geographically in one form or another. Hardly a month goes by when someone doesn't scream that they can't get to Google.com because they are transparently redirected to a local tld. You will also find those same search engines custom tailoring results for that IP address (eg: personalized content generation). You can see the effect yourself by changing your language preferences on a few search engines that offer the feature.
One Browser Web:
The recent history of major browsers is summed up in IE4-6, and Netscape 3-7. There is also a large 2nd tier of browsers: Opera, Lynx, Icab, and Mozilla.
All of these agents support different levels of code and standards. They also have inherent bugs related to page display. If you are a web designer, you could get a degree in the various browser differences of CSS and HTML alone.
Just when we are starting to think in terms of a one browser web, along comes a whole new set of browsers to consider: set top boxes, cell phones, PDAs, and other mobile devices. These all have varying degrees of support for XML, XHTML, CSS2/3, and the web services protocol blizzard (eg: .net, soap... et al).
We've not even begun to talk about IE7, which is rumored to be in final internal beta testing. Then there is Apple's new browser and the growing horde of Mozilla based clones. When you put it in those terms, our one browser web seems like a distant dream.
Delivering different content to these devices is a mission critical operation on many sites. Generating content for mobile devices is a vastly different proposition than delivering an xml feed to a search engine, or a css tricked out page for a leading edge browser.
Given that the combination of visitor IP and user agent can run into hundreds of possibilities, the only valid response is agent and IP cloaking.
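The device targeting described above can be sketched as a simple dispatch table (all the patterns and class names below are illustrative assumptions, not a production rule set). Note that spiders are just one more device class alongside phones and text browsers:

```python
# Illustrative agent dispatch: map user agent tokens to a device class.
# Rule order matters - more specific tokens are checked first
# (e.g. "Windows CE" before "MSIE", since PDA agents contain both).

DEVICE_RULES = [
    ("Googlebot",  "text"),   # spiders get the all-text rendering
    ("Slurp",      "text"),
    ("Lynx",       "text"),
    ("Windows CE", "pda"),
    ("MSIE",       "desktop"),
    ("Opera",      "desktop"),
    ("Netscape",   "desktop"),
]

def classify(user_agent: str) -> str:
    for token, device_class in DEVICE_RULES:
        if token in user_agent:
            return device_class
    return "text"  # safest fallback: plain text renders everywhere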
Off the shelf cloaking goes mainstream.
There are many off-the-shelf packages available today that include cloaking in one form or another. The perplexing part is that many sites are cloaked in ways you wouldn't even know about. There are several major forum packages that cloak in some form or another.
I was at a forum this morning that was agent cloaking, and another that was language cloaking. In both cases, the webmasters don't even know that it is taking place - let alone have the tech knowledge to correct it.
Welcome to 2003 - Modern Era Of Search Engines.
This isn't the web of 98-99 where people would routinely get whisked away to some irrelevant site unrelated to their query. Today's search engines are vastly improved, with most engine algorithms putting Q&A tests on every page they include. Those range from directory inclusion requirements, inbound link count and quality, to contextual sensitivity and even a page's reputation.
In this modern era where search engines now routinely talk about their latest off-the-page criteria algo advancements, it's clear that traditional SE cloaking has little effect. It comes down to one simple fact: those that complain about SE cloaking are simply overlooking how search engines work. The search engines have done a fantastic job of cleaning up their results programmatically and by hand.
The most fascinating thing about this new mainstream cloaking is the situation where a site just classifies a search engine as a graphically challenged browser. In that case, cloaking becomes mostly an agent based proposition. The trouble starts when you throw language delivery into the equation, or even delivering specific content as part of a search engine program.
All of these wide ranging factors combine to result in about 10 to the 4th power of page generation possibilities. In that situation, it almost becomes a necessity to put spiders into the all-text browser category and deliver the same page to the SEs that you deliver to cell phones or the Lynx browser.
Thus, we've come full circle on search engine cloaking. We no longer cloak to deliver custom content to search engines, we now cloak for the search engines to keep them from getting at our cloaked content for visitors.
<edit> cleaned up some typos and syntax errors</edit>
[edited by: Brett_Tabke at 6:15 am (utc) on Feb. 3, 2003]
The problem imo is that google yet again has taken an industry word and given it its own meaning.
customer : "I get so much spam it's unbelievable"
webmaster : "yes white text on a white background"
customer : "no... in my outlook mail box"
|It's still cloaking, it's just sloppy cloaking.. |
Actually, I'd call it sloppy coding. With CSS there's no excuse for these types of redirects. If your market caters to older browsers or handhelds, offer a text version.
Using Flash is not (and never was) an excuse for cloaking. Offer a text version to your visitors.
|The term "cloaking" is used to describe a website that returns altered webpages to search engines crawling the site. In other words, the webserver is programmed to return different content to Google than it returns to regular users, usually in an attempt to distort search engine rankings. This can mislead users about what they'll find when they click on a search result. To preserve the accuracy and quality of our search results, Google may permanently ban from our index any sites or site authors that engage in cloaking to distort their search rankings. |
Seems pretty clear to me.
Rule of thumb: if you need to know a search engine's IP address or some details from its HTTP request (e.g., its user agent name) in order to deliver content, you are *probably* cloaking. If you don't need that information, then you are certainly not cloaking. [or almost certainly, anyway]
However, checking that it's a robot by examining the IP or HTTP request, and serving content to the robot that you have no intention of showing to searchers, is cloaking. Search engines classify it as spam - at best unwanted, at worst irrelevant and deceptive.
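The behavior described here - branching only after positively identifying the requester as a spider - can be sketched as follows. The agent names and IP prefixes are purely illustrative assumptions, not a real spider database:

```python
# Hedged sketch of the behavior being called cloaking: the server
# must positively identify a search engine spider before branching.
# All names and prefixes below are illustrative, not real data.

KNOWN_SPIDER_AGENTS = ("Googlebot", "Slurp", "Scooter", "FAST-WebCrawler")
KNOWN_SPIDER_PREFIXES = ("64.68.", "216.239.")  # hypothetical ranges

def serve(user_agent: str, ip: str) -> str:
    is_spider = (
        any(name in user_agent for name in KNOWN_SPIDER_AGENTS)
        or ip.startswith(KNOWN_SPIDER_PREFIXES)
    )
    if is_spider:
        # content no ordinary visitor is ever shown
        return "optimized_page_for_engines.html"
    return "normal_page.html"
```

The litmus test is in the structure itself: delete the spider lookup and the page logic collapses, which is exactly the "you needed to know it was a search engine" condition in the rule of thumb above.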
Brett is using the term Cloaking as what it means to the webmaster community. webwhiz, you are trying to create your own definition and make everyone agree with you. You can't start calling a stick a rock and make everyone do the same. It may make perfect sense to you to call a stick a rock, but not everyone will agree with you.
I heard a discussion the other day about stop words, poison words, and adult words. Their meanings have changed over time and are being used differently. It is the accepted definition that wins out. To see what Cloaking means to the majority of web people, look at Brett's original post. If you want to have your own definition, I suggest you create a new word. I think you should look at georgeek's suggestion in message 8.
<added>Welcome to WebmasterWorld Alan Perkins</added>
|..."cloaking" is used to describe a website that returns ... different content to Google than it returns to regular users, usually in an attempt to distort search engine rankings. |
It all depends on what you mean by distort, eh? ;)
"With CSS there's no excuse for these types of redirects"
In my example I wasn't using it in reference to serving pages to older browsers or handhelds. I was using it in reference to serving different optimized pages to different SEs.
Welcome to WebmasterWorld [webmasterworld.com] Alan!
Your argument and definition is extremely interesting, if only because it's so outstandingly flawed and uniquely contradictory.
if you need to know a search engine's IP address or some details from its HTTP request (e.g., its user agent name) in order to deliver content, you are *probably* cloaking. If you don't need that information, then you are certainly not cloaking. [or almost certainly, anyway]
You don't sound particularly certain of your own arguments...?
By this reasoning, if I'm carrying an axe, I'm probably an axe m*rderer, right?
Oooohhhh please don't go there. That's outrageous...
Nothing personal intended in picking on your argument here Alan, it's nice to have someone with such an interesting and unique perspective on board ;)
|Brett is using the term Cloaking as what it means to the webmaster community. webwhiz you are try to create your own definition and make everyone agree with you. |
All Jill was trying to do was clear up some of the confusion around the word "cloaking". She gets lots of letters from her subscribers saying things like "Google says don't cloak, but I read somewhere that Google cloaks." I offered to write an unequivocal article for her. Of course, I didn't expect everyone to agree with it straight off the bat...but it does explain the apparent contradiction between what search engines say and what search engines do.
I use many of the expressions Brett used in his opening post. I don't consider *any* of them cloaking. When search engines say don't cloak, they mean "don't hide stuff from us that we need to see in order to rank this page". In other words, don't have one page for visitors and another page for search engines - because search engines need to see what visitors will see in order to rank it properly.
Dave and Korkus, I think it's actually SEOs that have taken the word "cloaking" and created their own meaning for it.
I've been told that tech people don't use the word cloaking for any of the stuff you guys call cloaking.
That's cloaking. Jill was arguing the point too finely. Cloaking involves using the IP address or features of the HTTP request (e.g. the User-Agent field) in order to identify and deliver unique content to a search engine spider.
|In my example I wasn't using it in reference to serving pages to older browsers or handhelds. I was using it in reference to serving different optimized pages to different SEs. |
|Cloaking involves using the IP address or features of the HTTP request (e.g. the User-Agent field) in order to identify and deliver unique content to a search engine spider. |
You're wrong, cloaking involves using the IP address or features of the HTTP request (e.g. the User-Agent field) in order to identify and deliver unique content to a specific IP or User-agent.
|Dave and Korkus, I think it's actually SEOs that have taken the word "cloaking" and created their own meaning for it. |
|I've been told that tech people don't use the word cloaking for any of the stuff you guys call cloaking. |
I believe I referred to Google's definition above and Brett's post. Aren't you saying by your second line that it is an SEO term? If a group evolves the usage of a word, the new meaning agreed on by the majority of the group is the definition.
<added>What Key_Master said</added>
|That's cloaking. Jill was arguing the point too finely |
Yes, by my own personal definition and the definition you gave, that would be cloaking. But by the definition Jill gave (which I was responding to) it would not be.
|It would be user agent delivery, I believe, not cloaking. Do you need the search engine's IP to do it? If not, then yep, perfectly acceptable. |
You may think that. How could you possibly be right or wrong? Where's the definitive dictionary of SEO?
All I'm saying is, that isn't what Google or Inktomi means when they say "Don't cloak".
I call the technologies you referred to IP Delivery and User Agent Delivery. They have many applications other than cloaking. Cloaking is the application of one or both of those technologies to identify and deliver unique content to a search engine spider.
No disrespect intended Alan and Jill, but it doesn't really matter what you think cloaking is, we're at the mercy of the search engine's definition and your attempt to redefine it is just muddying the waters that Alan's article was supposed to be clearing up.
1. The people that run those search engines don't have the resources to check millions of sites to see if they are redirecting by IP or UA.
2. Not a single engine has a TOS that says UA-based redirects are fine but we frown on IP redirects. Notice I didn't even use the word cloaking, just a nice legal IP or UA redirect, or is it?
3. The argument that somehow UA redirects are okay by the engines while IP redirects are not is absurd. People that were using UA redirects ONLY were getting tossed from indices all over the net. The result? UA and IP delivery combined.
4. Both Alan and Jill, you are basing your argument on the intent of the person using the UA delivery. While that is very noble, the search engines don't look at intent.
5. >>and serving content to the robot that you have no intention of showing to searchers
If I have many international visitors and I run a world politics site and I use IP delivery to deliver different yet relevant content to visitors and bots, I'm not deceiving anyone. I'm serving up relevant content based on geographic location that is of interest to my site's surfers. It is still cloaking Alan, but under your UA argument it should be acceptable because it serves a good purpose.
All both of you are really saying is that "cloaking can be a good thing, but please, don't call it cloaking because a whole slew of webmasters are under the impression that all cloaking is evil and if people think we cloak we'll lose clients. Please, use our definition". If it walks like a duck and quacks like a duck...
|If it walks like a duck and quacks like a duck... |
There are red wine splashes all over my monitor DG, I'm very unhappy with you. ;)
Fantastic argument. Spot on.
|No disrespect intended Alan and Jill, but it doesn't really matter what you think cloaking is, we're at the mercy of the search engine's definition and your attempt to redefine it is just muddying the waters that Alan's article was supposed to be clearing up. |
Not redefine it. Just define it more clearly.
|1. The people that run those search engines don't have the resources to check millions of sites to see if they are redirecting by IP or UA. |
|2. Not a single engine has a TOS that says UA-based redirects are fine but we frown on IP redirects. Notice I didn't even use the word cloaking, just a nice legal IP or UA redirect, or is it? |
If you use the test I gave you, you can tell whether it's legal or not.
|3. The argument that somehow UA redirects are okay by the engines while IP redirects are not is absurd. |
Agreed, I think? UA or IP makes no odds. Are you cloaking or not?
|4. Both Alan and Jill, you are basing your argument on the intent of the person using the UA delivery. While that is very noble, the search engines don't look at intent. |
They do, as Brett pointed out in the first post. That site wasn't cloaking; on closer inspection it was clear what the intent was, and it was reinstated.
|If I have many international visitors and I run a world politics site and I use IP delivery to deliver different yet relevant content to visitors and bots, I'm not deceiving anyone. I'm serving up relevant content based on geographic location that is of interest to my site's surfers. |
|It is still cloaking Alan, but under your UA argument it should be acceptable because it serves a good purpose. |
No, the way I define cloaking, that's not cloaking.
|All both of you are really saying is that "cloaking can be a good thing, but please, don't call it cloaking because a whole slew of webmasters are under the impression that all cloaking is evil and if people think we cloak we'll lose clients. Please, use our definition". If it walks like a duck and quacks like a duck... |
Nope, cloaking (as I see it) is an application of a technology (IP delivery/agent-based delivery). The technology can be used for good or bad. The application of the technology for cloaking is *not* something I would do or recommend.
Search engines also use the same definition. I really don't know when or why people started applying the word to things like geo-IP delivery but it wasn't a good idea, IMO. It's caused a lot of confusion.
>>If you use the test I gave you
Doesn't matter what your litmus test is. Once again, we're at the mercy of the engines.
>>They do as Brett pointed out in the first post
Actually they had to be told what was up and luckily someone listened. That doesn't happen often.
>>the way I define cloaking, that's not cloaking
Heya! That's the point we're really arguing, isn't it? Once again, doesn't matter how you define it. What really matters is what the engines will accept. If I steal a car and get caught, and I tell the judge that by my definition I wasn't stealing, I was merely "reallocating assets and redistributing the wealth", do you think the judge would buy it?
You see, the search engines happen to be the judge and the jury. Using your definition you could hurt a good number of people that suddenly think that the engines will allow you to use UA and/or IP delivery as long as the intent is not to deceive. Doesn't work that way and you know that so why are we discussing that approach as if it is viable?
>>I really don't know when or why people started applying the word to things like geo-IP delivery
I don't care about when but I know why. SEs were tossing sites that used any form of UA/IP delivery. We agreed that the search engines simply don't have the resources to distinguish intent. What makes you think they will spend the time to see how a site is applying UA/IP delivery?
It's fine that you don't recommend using cloaking technology but you can't just redefine the word to suit your needs.
>>cloaking (as I see it) is an application of a technology (IP delivery/agent-based delivery
Exactly, now explain to me how you are going to use a different technology for geo-IP delivery. The technology is certainly the same by your definition; you just assign a different name to it.
At the risk of offending almost everyone, let me suggest that much of this discussion is useless hair-splitting. There is no universally accepted definition of cloaking, as much as some people would like one. Furthermore, even if there WAS a definition that most people accepted, it would still be irrelevant in the real world - Google et al get to define what cloaking is, or what "unacceptable" cloaking is if they determine that certain kinds of cloaking are allowable. We would be far better off discussing which techniques seem to pass muster and which don't, instead of beating semantics to death. Unfortunately, like crosslinking, there's likely to be a big gray area where some sites get stomped and others don't.
|Exactly, now explain to me how you are going to use a different technology for geo-IP delivery. The technology is certainly the same by your definition; you just assign a different name to it. |
No. You don't need to know you're dealing with a search engine in order to do geo-IP delivery. You don't even need to know that search engines exist, although the problems that people have deeplinking to geo-IP-delivered pages are shared by search engines.
For Geo-IP, you don't need to detect that it is a search engine by its IP address or its user agent name or anything like that - if you needed to detect that in order to deliver unique content, you'd probably be cloaking.
When it comes to Geo-IP, you just deliver to a search engine the same as you would to anyone who came from that part of the world. This creates problems, but then Geo-IP creates problems for more than just search engines.
|much of this discussion is useless hair-splitting |
Very true. However, some people are here (so it seems) to assert that they are right, and ignore that which you just mentioned -> it's useless hair splitting, and time would be best spent learning the art of SEO in more productive fashion.
To the point of the original post by Brett which started this thread, nobody can argue that cloaking isn't mainstream. Sure, you can argue the definition - which confuses lots of people - but you cannot deny that cloaking is mainstream.
And I see that my comment about the site search, and MSN.co.uk cloaking was ignored by those that wish to create their own definition. :) Which is just fine by me.
If MSN is doing it, and I don't think anybody would deny that MSN makes $$$, then I, for one, will continue using it as a technique.
>>much of this discussion is useless hair-splitting
True, though I've never seen useful hair-splitting. ;)
However there are still some statements that are at odds.
>>For Geo-IP, you don't need to detect that it is a search engine by its IP address or its user agent name or anything like that
>>When it comes to Geo-IP, you just deliver to a search engine the same as you would to anyone who came from that part of the world (bold face is mine)
Those statements are contradictory. Mind telling me how you plan on "just deliver to a search engine the same as you would to anyone who came from that part of the world" without detecting IP? You are talking about IP delivery, aren't you? The same IP delivery you said in the first statement wasn't needed? You realize that bots from different countries have different IPs?
Something about your statements leads me to believe that your experience with IP delivery has been limited to writing about it without actually going through the process of implementing it. Nothing wrong with that but there are some subtle things you can do with IP delivery that I think you're missing.
In the end Alan, you can use your definition and I'll use mine. I've seen the entire world tied up in discussion over the definition of the word "is". :)
|We would be far better off discussing which techniques seem to pass muster and which don't, instead of beating semantics to death. |
Excellent suggestion, rogerd. I'd like to see a moderator start a thread on this. If it's possible to do so without beating semantics to death.
|Mind telling me how you plan on "just deliver to a search engine the same as you would to anyone who came from that part of the world" without detecting IP |
You get its IP. What you don't then do is look its IP up in a database of search engine IPs, say "Ah that's a search engine" and deliver a page designed only for the search engine to see - that would be cloaking.
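The distinction being drawn here can be sketched in a few lines (the prefixes and file names are illustrative assumptions, not real IP allocations): geo-IP delivery maps every visitor, spider or human, to a regional page, with no lookup against a list of search engine IPs anywhere in the code path.

```python
# Sketch of geo-IP delivery as distinguished from cloaking: the IP is
# used only to pick a region, never to ask "is this a search engine?".
# Prefixes are illustrative placeholders, not real allocations.

REGION_TABLE = [
    ("194.", "europe"),
    ("210.", "asia"),
]

def page_for(ip: str) -> str:
    for prefix, region in REGION_TABLE:
        if ip.startswith(prefix):
            return f"index_{region}.html"
    return "index_default.html"
```

A spider crawling from an unlisted address gets `index_default.html` - exactly what any human visitor from that address would get. Add a spider-IP lookup that branches to a special page, and by this definition it becomes cloaking.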
You can look at this as hair splitting. IMO the meanings of words in an industry are pretty important. Cloaking is given a very broad meaning by many members here, it seems. Broader than the definition given by many search engines.
I'll leave you with that. :)
|Thus, we've come full circle on search engine cloaking. We no longer cloak to deliver custom content to search engines, we now cloak for the search engines to keep them from getting at our cloaked content for visitors. |
I find this quite hilarious. Cloaking, as Brett uses it here, is a very rich term and ultimately definable in terms of functionality. That this sort of functionality is going mainstream- to serve more and more qualified content- will probably be better understood several years from now.
In this discussion, I've seen at least 3 distinct groups (no doubt, there are more):
1. Webmasters and techs who don't see the search engines as an embraceable partner- but one whose limitations constrain them (adversely) or require challenging in order to grow. "We have to live together but the search engines can't dictate the choices we make for our sites and our audience."
2. SEOs who prefer a more amicable relationship with the search engines. "We're all in it together; help us to help you and our clients (and we'll follow your preferred guidelines)"
3. Those in either camp who prefer the language be kept morally-neutral and less "loaded" (search engines aren't moral authorities; neither are we), are aware of the risks of some forms of cloaking, try to keep an independent mind on how the changes in technology prompt responses on our part to better solve a problem/satisfy client needs."Ok, the search engines aren't 'friends' but they're an important tool we work with."
I'm all for clarity so the discussion's helped me to better understand where the misunderstandings arise (my own included)- and what assumptions need to be questioned.
I also think that while it's true that the SEs aren't sophisticated enough to take "intent" into consideration, they are more highly sensitive (whether they're slow to catch it or not) to "deception". I've seen sites that use "inviso-text" that are still highly-ranked, and I'm guessing it's not simply due to the popularity (which is there) but also because the "inviso-text" is part of the interactivity that goes on (and is kinda fun), leaving it up to the viewer whether or not they want to know what's revealed in the "inviso-text". No intent to deceive. There for users. Not deceiving either users or search engines.
The better SEs are going to have to develop better algorithms to consider "context" of technology or technique. And if cloaking (in the general, "rich" sense used above) does indeed go mainstream, that seems the better response (were I a search engine). It leaves open to the rest of us who make decisions (or recommendations) about such technology- what the better choices are (for our clients, ourselves, and repercussions with the SEs).
|I've never seen useful hair-splitting. |
He, he, I guess "useless hair-splitting" WAS a little redundant, dg, though perhaps members of the Hair-splitters Union might disagree. :)
I'm still wondering who appointed you as the anti-cloaking Sheriff of the Web? :)
I guess it gets speaking jobs at the conferences though.
BUT, if you are going to define cloaking, in the terms that Google does, please do it correctly? Their take is simply this: presenting content to Google's spiders, that is different than what the end user views in a browser! It has absolutely ZERO to do w/ IP delivery, though that would be one method of cloaking.
I see so much "hidden text" stuffed into style sheets, and embedded HTML, it makes someone doing IP delivery, look like a Choir Boy! But according to you, that's not cloaking? Sure it is... It's presenting content that the search engine spider crawls, but the end user "never" sees in a browser.
I'll come clean.. I cloak! :)
Have been since 1997. Long before there was even a definition for it, and you were on the soapbox. We like to call it "IP delivery" though, because "cloaking" sounds so back room...
Jill and you like to preach, and I'm sure that goes well for conferences and consulting jobs. But, your position reminds one of University studies. They teach you in college how things should work in a "perfect world", then you graduate, hit the real world, and find out that most everything you learned in college, isn't applicable in the "real world"..
I live in the "real world".. (Hypothetical example): An ad agency calls and says a major pharmaceutical company has an allergy product that has been prescription only for the past 7 years, and the FDA is now going to allow it to be sold "Over the Counter". The pharmaceutical company spent "5 figures" on their web content, doesn't want us to touch their content, and doesn't have great rankings for general "allergy" terms. But, wants to let the world know that their product now doesn't require a prescription. They want 300,000 visitors in 120 days, and it's not even allergy season! You think we're going to beg the pharmaceutical company to allow us to tweak their web site, and watch their rankings for allergy terms slowly rise to the top over the next six months? HARDLY! We don't have the time, they don't have the time. They want visitors NOW!
We're going to register a mirror domain, do a solid keyword analysis, target keywords that are "RELEVANT" to the client's content, and go to work, fulfilling that goal of 300K visitors before June 1.
This is the way we work. If Google doesn't like the methodology we are employing to deliver targeted traffic to the client's web site, so be it. They can choose to not index our content. But, this rarely happens.
And Google is the only major engine without an inclusion program.
You can say Inktomi takes a major stand against "cloaking", but guess what? We have full control of Titles/Descriptions/Meta content, for every URL we submit through Inktomi's "Index Connect" inclusion program. The body content matches the client's target URL, but I can guarantee you, we tune Titles, descriptions and meta tag content to best match, keyword phrases that are going to deliver quality traffic, in numbers!
Same goes for AV's Trusted Feed, FAST, and Teoma..
Welcome to the Real World, My World! Where waiting for something to happen, equates to "You Lose"! I cloak, I deliver targeted visitors to my client's relevant content, they compensate me, everyone is happy!
This is an international webmaster forum - and I've just read your post. I then looked at the two listed URLs in your profile. I live in the real world too - and I think that the 'motivation as to why' - as explained by Brett in his initial post, and others - is the difference to long term exclusion or not, after a 'human' appeal.
Both your listed URLs are PR Zero. One is gray bar - one is white bar. I don't work for a search engine - what PR were these sites last time you looked?
Which came first - cloaking or PR Zero? Have you tried to have the issue resolved?
Will a cloaked PR zero site beat an uncloaked PR6 site on a competitive keyword search?
I suppose that my answer would then be:
"join me on the dark side Luke..."
"get lost dad!" :)
paraphrased from an old Star Wars movie....
I fulfil none of the duties of a Sheriff.
I was on one cloaking panel in about 15 SES conferences - as have people like Mikkel, Greg Boser, John Heard, Fantomaster, etc. The cloaking sessions no longer even run.
Google's definition of cloaking involves programming a webserver to return altered content to search engines. The other techniques you talked about return the same content to humans as search engines. Google doesn't define that as cloaking. The techniques you described were spam, but not cloaking.
Actually the one that is "white" is a brand new site... :)
Thank you for informing me that there were community members from other countries. I had to put on my bi-focals to realize that in some posts people used words I wasn't familiar with.
I know I'm getting slower these days, and I appreciate you pointing that out, with the math question..
My question to you is, how big will my bank account get in the 12 months I delivered traffic to a client, while you are watching PR finally hit the radar? :)
This thread really isn't about whether it is ethically, or morally "correct" to cloak, it is about the definition of cloaking.
I've yawned my way through more than one Alan session at an SES conference, and he still doesn't get the picture?
My coming "out of the cloaking closet" was simply that. I just had to tell somebody...
"Son, A Man's got to know his limitations" :)
Stolen directly from Clint Eastwood
One should not judge any member by the PR of the site/sites in their profile or their post count.
Judge the member by the quality of his/her posts.
Since he brought up "bank account", I'd be willing to bet rezone makes more $ on SEO/Traffic Management/Site Marketing or whatever you want to call it than anyone who has ever posted here @ WebmasterWorld.