If you use personalized delivery to protect your code, and to serve relevant information to each search engine independently, then I have no problem with it. In fact, we welcome it. The more relevant, targeted and correctly built web pages we can crawl, the better – our users will get better results.
We will do anything we can to fight bad cloaking – people trying to manipulate our index. We will not tolerate that. We will do anything we need to protect the quality of our product – our life.
Detecting cloaking or personalized delivery is not difficult, but working out what is right and what is wrong is not an easy task.
So how will we do this? I am not sure yet, and if I knew I wouldn't tell exactly how, but I have a few ideas ... ;)
One of the things I believe we will need to do very soon is to detect who is cloaking and who is not. That's step one. We could then make a note on every one of those sites and put them on a special list for manual, automated or semi-automated analysis to find out what is good and what is bad cloaking.
I don’t think we would be able to check every page all the time so the analysis would only be based on small samples. So bad cloaking would still be able to pass through but at least we would catch some of the bad guys.
This way it would not be so risk-free to cloak anymore – at least not in our search engines.
Basically, if you use personalized delivery the right way then you will have no problems with us, but if you cloak the wrong way then we will do whatever we can to stop you - ban all your sites or give them an extremely low ranking factor.
Only target your pages to relevant keywords. Do not use personalized delivery unless you really know what you are doing.
I don’t expect to run into any serious problems with major SEO companies. They all want to run an honest business and only target relevant traffic for their clients. They can’t afford to do it the wrong way. If they do we will get mad at them and their clients will end up with low quality visits. So I don’t believe they will.
In fact, I will not only let “the good guys” use personalized delivery but I will go into a direct dialogue with the most professional ones of them. They know a lot about search engines that we can benefit from and I will help them do what they do even better. Helping the “good guys” is my way of trying to stop the bad ones.
I will speak at the IMS2000 seminar in Stockholm on the 26th of October and hope to bring some new exciting tools and statistics with me that I think you will all find useful in your hunt for more relevant traffic :-)
Thank you for sharing your thoughts on this important and controversial issue, it still leaves me with a nagging question though.
It seems that you accept that by its very nature "personalised delivery" will serve different content to the user than that which the spider sees. Where do you draw the line? What is an acceptable "difference"?
Is a page of random text with keywords inserted at the "appropriate" density, targeted at a keyword that is relevant to the site, an example of "good cloaking"?
Well, NFFC, you did promise me some tough questions and you sure know how to pick them <g>
There is no doubt that we will need to draw a line. As I said, I don't want to ban all cloaking - just the bad kind. I am also sure that every SE will find its own line and it will not be the same.
What I am offering is an open debate about where to draw that line and how to ensure that only "good cloaking" gets through to our indexes and gets ranked. I am not sure about it yet and I need your input on this.
What do you think, NFFC?
How should we determine relevancy if it was up to you?
What would you consider relevant?
When is it spam?
Of course everybody wants to say this is spam. I know I do, but if I really stop and consider it, there can be a very strong argument that it is not spam. If the surfer is not being misled and is actually being given a relevant site, then where is the problem?
I know it's not as easy as just saying that misleading cloaking is bad - it's a very blurry line.
>What do you think, NFFC?
It doesn't matter, it is what you think that is important.
>How should we determine relevancy if it was up to you?
My sites should be the most relevant.
>What would you consider relevant?
>When is it spam?
When it's somebody else's site. ;)
To be more constructive, my definition of "good cloaking" is that which seeks to protect the code, in which the content delivered to both spider and user is "substantially" the same. [You could even come up with an acceptable percentage.]
For me this rules out Flash sites, sites that consist mainly of graphics, and dynamic sites. If these types of sites want/need to rank well then let them make the effort to create static HTML alternative versions; the decision is theirs.
The difficulty you face is in setting the limits of what is acceptable, people will always push at the envelope, you need to clearly and unequivocally define the boundaries.
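That "acceptable percentage" idea can be made concrete. Below is a minimal sketch, not anything any SE actually runs: strip the markup from the spider copy and the user copy of a page and compare what text remains. The 90% threshold is an arbitrary illustration.

```python
import difflib
import re

def visible_text(html: str) -> str:
    """Crudely strip tags and collapse whitespace to approximate the visible text."""
    text = re.sub(r"<[^>]+>", " ", html)
    return " ".join(text.split()).lower()

def substantially_same(spider_html: str, user_html: str, threshold: float = 0.90) -> bool:
    """Return True if the two versions share at least `threshold` of their text."""
    ratio = difflib.SequenceMatcher(
        None, visible_text(spider_html), visible_text(user_html)
    ).ratio()
    return ratio >= threshold
```

A page that differs only in markup (graphics, layout, code protection) would pass; a page whose spider copy says something entirely different would fail.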
I take the first part of your post, NFFC, as an example of how spammers think <g>. People that want their sites to turn up - be relevant - for every kind of search are by definition spammers.
I much prefer the second part of your post:
"To be more constructive, my definition of "good cloaking" is that which seeks to protect the code, in which the content delivered to both spider and user is "substantially" the same."
The challenge, of course, is to define "substantially the same".
One of the problems is that many dynamic sites change so often that even if we visited them from two different IPs within seconds, the pages would not be the same. We would need to hit them at exactly the same second - and even then the randomization that some sites use would confuse us. Some sites also use IP detection to serve different language versions - even some of the large SEs do that (e.g. Lycos).
So far I have not been able to come up with an automated way to detect this - we still need to do some manual work.
I do not agree with you regarding Flash, graphic or dynamic sites. Our users want to find them, so we need to be able to crawl them and understand them in order to be able to rank them. That's another challenge :)
>I take the first part of your post, NFFC, as an example of how spammers think
It is difficult for some web site owners to understand how an SE works, they all think that their sites are the most relevant. I get comments such as "but my prices are cheaper than theirs, why should the other site rank better".
This brings us to what I think should be your ultimate concern: stopping spam. We all may have different definitions of spam, but my point is that it doesn't matter whether it is cloaked spam, dupe doorways or anything else - spam is spam. If you go the route of singling out cloaked domains for special attention, the danger is that you in some way make it easier for spam to "slip" through.
>I do not agree with you regarding Flash..That's another challenge
It was a rant after all. The challenge shouldn't be yours; that cost should be borne by the web site owners if they want to be listed well at your SE.
Personalised delivery is here to stay. There are just too many good reasons for implementing it. It might be serving pages according to language. Lycos is doing it today, as are many others. A German surfer using the URL lycos.com will be redirected to lycos.de. Others redirect according to the browser you use to make the visual experience as pleasant as possible. Professional SEO companies use personalised delivery or cloaking to protect their knowledge from competitors - their own and their clients'. Professional SEOs don't try to achieve good rankings under unrelated search terms for their customers, because that would bring traffic of low quality and degrade the value of the very service that makes life good for an SEO specialist.
There will always be people that try to take advantage of others, in this matter as in others. Those of us that are serious about our business, search engines as well as SEO companies, have a mutual interest in keeping the search engine indexes as clean and relevant as possible. Only if we work together is this possible.
Maybe there is a need for an organisation to host search engines and SEO companies in their mutual efforts to mature their respective businesses. By putting up basic rules for the trade and co-operating in their enforcement, we can make sure that this issue is solved in an orderly fashion.
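The language-redirect case mentioned above is the least controversial form of personalised delivery, and it is simple to picture. The sketch below is purely illustrative: the mirror list and the `country_of` lookup are hypothetical stand-ins; a real implementation would use an IP-to-country database.

```python
# Illustrative regional mirrors in the style of the Lycos example above.
REGIONAL_MIRRORS = {
    "DE": "http://lycos.de",
    "FR": "http://lycos.fr",
    "SE": "http://lycos.se",
}
DEFAULT_SITE = "http://lycos.com"

def country_of(ip: str) -> str:
    """Hypothetical stand-in for a real IP-to-country lookup."""
    sample = {"62.104.0.1": "DE", "193.10.0.1": "SE"}
    return sample.get(ip, "US")

def redirect_target(ip: str) -> str:
    """Send each visitor to the regional site for their country, if one exists."""
    return REGIONAL_MIRRORS.get(country_of(ip), DEFAULT_SITE)
```

Note that this is IP-based delivery in exactly the same technical sense as cloaking - the difference is intent, which is the whole point of the thread.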
Mikkel, Thank you for bringing up this question. :)
We get a lot of very entertaining (not to say funny) e-mails from users here at Kvasir, I promise you <g>. Some write me and tell me (more or less order me) that they want their site to turn up at the top when users search for this and that. I most often answer with a URL for SEF, I-Search, this forum or my own forum in Danish and tell them that it's no problem to do that - they just need to read a few hundred thousand posts and articles on the subject and they will rank very well :)
Like most SEs, we have not paid enough attention to the cloaking issue in the past. We will now. That does not mean that we will drop the focus on other kinds of spam. We will keep focusing on every kind of spam - any attempt to manipulate our indexes and destroy our business.
Please feel free to drop in at the European Search Engines Forum at any time. By Mid-October I will roll out a pan-European effort to chart the entire European SE scene and all able hands will be needed.
But it would be great if you could get some other people from other SEs to enter into this topic.
I just hope that a clear definition of what cloak spam is can be agreed upon by the various SEs. I think we all agree that cloak spam (and spam in general) is the common enemy ... the rest is just competition :)
This might be wishful thinking but it sure will let us all sleep a whole lot better.
I like the attitude of that Redzone dude ;)
I don't cloak nearly as much as I used to, but it still has its place. Having been the victim of page jacking thousands of times, there is little question that cloaking is almost mandatory on top competitive keywords. It is nice for an SE to realize that we are almost forced to do it not merely to raise rankings, but to protect the code.
It really didn't hit home with me until someone swiped 300 pages of content from SEW. Within a month my rankings on that site were gone. I started searching for my rare obfuscated keywords and found over two dozen copies of the site on the net - all cloaked. I assume they were trying to understand my theming approach to broad-based rankings.
So, I cloak not because I want to, but because I have to.
I understand why you and others need to cloak. That's why I find it important that we work out a way for you to do what you do and still be able to stop spammers. However, there is still a long way to go ... :)
A lot of questions have come up ...
If cloaking is OK why isn't invisible text too?
What if someone copies my pages and then cloaks them?
How do you ensure that cloaked pages are not pure spam?
...etc etc ...
We need to answer these and many more questions before we can claim to have a good solution - but at least things are moving now :)
>If cloaking is OK why isn't invisible text too?
There's nothing wrong with invisible text; invisible spam is the problem. A few years ago, I often saw pages containing a laundry list of words in invisible or nearly invisible text. Usually, the words were unrelated to the content of the page and apparently included to attain good rankings on a variety of keywords. I also saw pages where a single keyword was repeated hundreds of times. I suspect banning pages for invisible text was much simpler than determining if the text was relevant to the page. And it's not a bad assumption, since invisible text is unlikely to be useful except to those seeking high rankings.
As in redzone's first rule, if it isn't relevant, ban it! :)
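The crudest form of the trick described above - text whose `<font>` color matches the `<body>` background - is easy to detect mechanically, which may be why SEs banned on it rather than judging relevance. A toy check, assuming era-typical `bgcolor`/`color` attributes (it would miss tiny fonts, near-matching colors, and stylesheet tricks):

```python
import re

def has_invisible_text(html: str) -> bool:
    """Flag pages whose <font> color exactly matches the <body> bgcolor."""
    bg = re.search(r'<body[^>]*bgcolor=["\']?(#?\w+)', html, re.IGNORECASE)
    if not bg:
        return False
    background = bg.group(1).lower()
    for color in re.findall(r'<font[^>]*color=["\']?(#?\w+)', html, re.IGNORECASE):
        if color.lower() == background:
            return True
    return False
```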
I hope that other SE's will join you in here as Brett's forum seems to be an accumulation of the very top and very serious SEO specialists.
Personally, I know I will learn a bunch from them and we all can learn a bunch from you and all other SE's while working together for this cause of which we all have so much in common.
Thank you sincerely.
I have been hearing those arguments for a long time. I would not dismiss them outright, but consider that each of those statements is born of equating cloaking with spam as though they were joined at the hip.
Cloaking is first and foremost a delivery mechanism, nothing more. Unless that separation is made, those arguments will always be stumbling blocks to an open discussion that leads to SE's being able to index a wider range of relevant content, and publishers of that content being able to offer it for indexing.
The "me too" argument of those wishing to use invisible text for this purpose is irrelevant; it is a separate argument, and raises considerations that must be contemplated by the SE's which are entirely different from cloaking.
The issue of page theft raises an entirely different set of questions. How do we stop it? How are pages stolen in the first place? Can stolen pages be submitted to search engines? If cloaking was entirely wiped off the face of the earth, would page theft increase or decrease?
The last point is one that I think exists only because the myth persists that cloaking hides something from the search engines. This leads many to believe that the webmaster is free to spam at will with impunity while cloaking. The reality is that the "cloaked" page submitted to the search engine carries no disguise, it is the same page that would be submitted regardless of the delivery mechanism. It is subject to the same spam filters and the same algorithm that every other page is. If invisible text gets you banned, then delivering invisible text to the search engine via cloaking will too. If too many repetitions of keywords carries a penalty, then delivering too many keywords via cloaking is penalized too.
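The point above - that the page delivered to the spider passes through the same filters as any other page - can be illustrated with a crude keyword-repetition filter. The 25% density cutoff is an invented number, not any real SE's threshold; the point is only that the filter has no way of knowing, or caring, how the page was delivered.

```python
from collections import Counter

def repetition_penalty(text: str, max_density: float = 0.25) -> bool:
    """Return True if any single word dominates the page's text."""
    words = [w.lower() for w in text.split() if w.isalpha()]
    if not words:
        return False
    _, count = Counter(words).most_common(1)[0]
    return count / len(words) > max_density
```

A keyword-stuffed page trips this whether it was served directly or via cloaking, which is exactly Air's argument.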
It isn't my intention to refute each argument that is raised against cloaking. I do so above only to ensure that what looks to be a promising and exciting dialogue with the Search Engines is not thwarted. It is unheard of to engage in this type of dialogue with the search engines. Let's forget for a moment that it is about cloaking, instead let's use it as a precedent that will later create a forum to address some of the other concerns that have been raised and IMO unnecessarily tied to cloaking.
Who knows, in the end the search engines may decide that cloaking should in fact be eradicated from the face of the earth. But wouldn't it be great to have reached that position through a collaboration of the SEO community and the Search Engines?
Exactly my point :)
In response to my question at IMC2000 last Thursday, the AltaVista and Inktomi representatives agreed to your proposal that they should participate in open discussion on cloaking. Will you contact them or should Brett send out invitations quoting you?
BTW: I hope I quoted you correctly in my article, posted in this forum. If not, please post the corrections you feel are necessary, I won't mind. This is an important matter and getting it right is more important than any possible bruises to my ego. ;)
Only to make it clear:
I think we all agree that we (SEOs) want the same as them (SEs): relevant results. And it's not important whether one can see the source code or not - it's important that the results relate to my search keyword.
I am looking forward to that time in the future when a spammer would be banned at once from all SEs.
I think your quotes look just fine, rencke.
I will contact AltaVista and Inktomi and invite them to participate in the discussions. However, you must understand that this is new to most SEs and they will not feel 100% secure about it at first. So I believe that part of the discussions will have to take place in more private settings with specially invited people in the beginning – but I will try to have them participate here too, and on I-Search and Search Engine Forums, when relevant discussions take place.
The most accepted justification for cloaking is prevention of copyright theft or page-jacking. As long as search engines mainly use on-the-page ranking criteria, I guess this will always be the case. But one thing's for sure: savvy page jackers and copyright thieves will, themselves, always be cloakers. Copyright theft is against the law and copyright theft, page jacking and particularly bait and switch should be against “Internet law” and policed by “the community”. Cloaking makes the policing more difficult. Universal cloaking makes the policing almost impossible. This is one reason why cloaking should be outlawed.
Cloaking can also be used to protect reverse engineered code. I don't agree with reverse engineering, and here's why. As a search engine, I would want to be able to differentiate myself from my competitors. It's already the case that different search engines attract different profiles of visitors. But some SEOs (or rather, their clients) want to be at the top of every search engine listing, destroying a search engine's ability to offer truly objective relevant results. These clients prefer to kludge the search engine into a pure marketing device, but this model is cracked. The search engine spends megabucks on creating and building a brand, the SEO earns kilobucks reverse engineering the search engine and getting their client to the top regardless of true relevance to the search term. This path can only end in clients paying the search engines directly. If all search engines went this way, all the nice fees charged for SEO consultancy would be replaced by small commissions on paid placement spends. So reverse engineering will ultimately do long-term damage to the SEO business.
The role of an SEO should be to educate their clients in the issues and help their clients to structure a site to be well indexed by search engines. I think this goes as far as working out their clients’ keywords and ensuring they appear on appropriate pages, advising on issues such as frames, Flash, CGI, dead links, stale links and taking account of all the information given by the search engines themselves. Much more than this would be reverse engineering. Hot news today is that reverse engineering is illegal – take a look at [nytimes.com ]:o!
Perhaps the best example of the problem with cloaking is "authoritative links". A link from your page to a known authority on your search terms will get your site a boost on many search engines, right? Especially if that authoritative site links back to your page. If you cloak, then on your uncloaked pages the authoritative links make no sense because the search engine is not reading your uncloaked pages. But maybe you put them there to convince the authoritative site to link back to you. Which they do - on their uncloaked pages. But if they cloak, why should they link back to you on their cloaked pages? They are already authoritative, you’ve just made them a bit more so, but why should they make you more authoritative by association with them, which could ultimately lower their own ranking? And, logically, why should you link to an authoritative site from a cloaked page anyway? You are only doing it to gain a boost from the authoritative link. So you are tweaking the algorithm severely. Authoritative links on cloaked pages are as bad as invisible text on uncloaked pages – meaningless except to get a ratings boost. Authoritative links on cloaked pages are spam.
Will cloaked pages containing authoritative links be given a spam penalty? They should be. Therefore, SEOs will have the choice of admitting a page is cloaked, omitting authoritative links, and taking the relevancy drop, or denying the page is cloaked, including authoritative links and taking the risk.
One issue that has been raised in other threads – the idea of separate submission forms for cloaked and uncloaked pages. As well as causing confusion, I think an obvious guerrilla tactic would be to submit competitors’ cloaked pages through the uncloaked submission form, causing (in Mikkel’s words) severe penalties. The problem of authenticating the user of an Add URL page is a much better one for search engines to solve.
I’ve thought about this a lot over the years, but one thought came to me while writing this post. Universal cloaking would make reverse engineering almost impossible, and that might be a good thing. OTOH, it might force the blinkered to use hundreds of URLs and, effectively, reverse engineer off their own cloaked pages rather than the Web as a whole. This would be a waste of the search engine’s time and resources. Universal cloaking would also require mass upgrading of servers, which would take several years. By then, search engine algorithms should have moved away from being heavily reliant on on-the-page criteria so cloaking should cease to be the issue that it is today. The challenge to Mikkel and other search engines is to be better so that those SEOs who find cloaking to be essential now will not find it to be so in future. Advocating universal cloaking is not the answer, it's surrender.:)
You make some compelling arguments, and very well.
>But one thing's for sure: savvy page jackers and copyright thieves will, themselves, always be cloakers.
Yes it is unfortunate, but you are right, well ranking pages that are stolen are often subsequently cloaked by the page-jacker.
I am unclear on whether you are suggesting that both cloaking and SEO reverse engineering are a manipulation of the search engines' algorithm - do I read that correctly?
On the subject of cloaked authoritative links, I can suggest what I do when cloaking: I always place the link on both the cloaked page and the human-visible one.
>But if they cloak, why should they link back to you on their cloaked pages?
Well, because it's the right thing to do, and because any SEO worth their salt will immediately know that they are not receiving credit for the link in the search engines.
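The dual-link practice described above is also checkable: extract the outbound links from both versions of a page and confirm the spider copy carries nothing the human copy lacks. A small sketch of that check (the attribute-scraping regex is deliberately crude):

```python
import re

def links_of(html: str) -> set:
    """Collect the href targets on a page, lowercased for comparison."""
    return {href.lower() for href in re.findall(r'href=["\']?([^"\'> ]+)', html, re.IGNORECASE)}

def links_match(spider_html: str, user_html: str) -> bool:
    """True when the spider copy carries no links the human copy lacks."""
    return links_of(spider_html) <= links_of(user_html)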
>The challenge to Mikkel and other search engines is to be better so that those SEOs who find cloaking to be essential now will not find it to be so in future. Advocating universal cloaking is not the answer, it's surrender.:)
Very well said, IMO it is only through the dialogue that is being contemplated between SEO's and the Search Engines that this can happen. It just so happens that cloaking is providing that impetus. Any solution that advocates universal cloaking is not the answer.