If you use personalized delivery to protect your code, and to serve relevant information to each search engine independently, then I have no problem with it. In fact, we welcome it. The more relevant, targeted and correctly built web pages we can crawl, the better – our users will get better results.
We will do anything we can to fight bad cloaking – people trying to manipulate our index. We will not tolerate that. We will do anything we need to protect the quality of our product – our life.
Detecting cloaking or personalized delivery is not difficult, but working out what is right and what is wrong is indeed not an easy task.
So how will we do this? I am not sure yet, and if I knew I wouldn't tell you exactly how, but I have a few ideas ... ;)
One of the things I believe we will need to do very soon is detect who is cloaking and who is not. That's step one. We could then make a note on every one of those sites and put them on a special list for manual, automated or semi-automated analysis to find out what is good and what is bad cloaking.
I don’t think we would be able to check every page all the time so the analysis would only be based on small samples. So bad cloaking would still be able to pass through but at least we would catch some of the bad guys.
This way it would not be so risk-free to cloak anymore – at least not in our search engines.
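The sampling idea above can be sketched very roughly in code. This is only an illustration of the concept, not any engine's actual method: fetch each sampled page as both a spider and a browser would see it, then flag pages whose two copies differ substantially. The helper names and the similarity threshold are my own assumptions.

```python
import difflib

def normalize(html: str) -> str:
    # Crude normalization: lowercase and collapse whitespace so purely
    # cosmetic differences don't trigger a false positive.
    return " ".join(html.lower().split())

def looks_cloaked(spider_copy: str, browser_copy: str, threshold: float = 0.7) -> bool:
    # Flag a page when the copy served to the spider differs substantially
    # from the copy served to a browser. The 0.7 cutoff is an arbitrary
    # illustration, not any engine's real value.
    ratio = difflib.SequenceMatcher(
        None, normalize(spider_copy), normalize(browser_copy)
    ).ratio()
    return ratio < threshold

# A sampled site serving keyword-stuffed text to the spider only:
spider_page = "widgets widgets cheap widgets buy widgets best widgets"
browser_page = "<h1>Welcome</h1><p>Our product catalogue is here.</p>"
print(looks_cloaked(spider_page, browser_page))   # True - flags the mismatch
print(looks_cloaked(browser_page, browser_page))  # False - identical copies
```

A flag like this would only shortlist a site; deciding whether a flagged difference is "good" or "bad" cloaking is the hard manual or semi-automated step described above.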
Basically, if you use personalized delivery the right way then you will have no problems with us, but if you cloak the wrong way then we will do whatever we can to stop you – ban all your sites or give them an extremely low ranking factor.
Only target your pages to relevant keywords. Do not use personalized delivery unless you really know what you are doing.
I don’t expect to run into any serious problems with major SEO companies. They all want to run an honest business and only target relevant traffic for their clients. They can’t afford to do it the wrong way. If they do we will get mad at them and their clients will end up with low quality visits. So I don’t believe they will.
In fact, I will not only let “the good guys” use personalized delivery but I will go into a direct dialogue with the most professional ones of them. They know a lot about search engines that we can benefit from and I will help them do what they do even better. Helping the “good guys” is my way of trying to stop the bad ones.
I will speak at the IMS2000 seminar in Stockholm on the 26th of October and hope to bring some new exciting tools and statistics with me that I think you will all find useful in your hunt for more relevant traffic :-)
> I am unclear on whether you are suggesting
> that both cloaking and SEO are a
> manipulation of the search engines
> algorithm, do I read that correctly?
Cloaking is not a manipulation of the algorithm, but it can be used to hide a manipulation. The term SEO covers a wide spectrum of activities, some of which include manipulation of the algorithm – but to be an SEO does not necessarily mean you manipulate the algorithm. By the way, some manipulation, along the lines I mentioned before, is good – it's reverse engineering I have a problem with.
> On the subject of cloaked authoritative
> links, I can suggest what I do when
> cloaking. I always place a link on both
> the cloaked page and the human visible
Very good, but let's be clear. You don't do that to ensure the same content is seen by search engine and human, you do it to get the maximum boost possible. In other words, you wouldn't include links that you knew would worsen your position. Would you? And in search engine terms, I think it is spam - which I would define as anything done purely to artificially manipulate a ranking. Since humans don't see the page, there can be no need for external links. In fact, if you have a deep submission policy or a site map, you don't need internal links on a cloaked page either.
> Any solution that advocates universal
> cloaking is not the answer.
I agree that universal cloaking is not the answer. I think zero cloaking is the answer. We will gravitate one way or the other. If search engines permitted cloaking now, they would struggle to ban it again in future - so this is a big deal.
By the way, I think personalised delivery is fine, but any user on any platform should be able to see any version of a page if they request it and their platform supports it. Those users include users in any country (subject to legal restrictions), and those platforms include PCs, WAP phones, TVs and search engines - all identified by their User-Agent.
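The "personalised delivery" model described here – every platform getting a suitable version, identified by User-Agent, with any client free to request any variant – amounts to a simple, open dispatch on the User-Agent header. A minimal sketch; the agent substrings and variant names are purely illustrative assumptions:

```python
def pick_version(user_agent: str) -> str:
    # Map a User-Agent to a page variant. The substrings below are
    # illustrative examples; the key point is that the mapping is open,
    # and any client may request any variant its platform supports.
    ua = user_agent.lower()
    if "wap" in ua or "up.browser" in ua:
        return "wml"          # WAP phone
    if "googlebot" in ua or "scooter" in ua:
        return "plain-html"   # search engine spider
    return "full-html"        # default: desktop browser, TV, etc.

print(pick_version("Mozilla/4.0 (compatible; Googlebot/2.1)"))  # plain-html
print(pick_version("Nokia7110 (WAP)"))                          # wml
```

The contrast with IP cloaking is that nothing here depends on a secret list of spider IP addresses – the same request always gets the same answer, whoever makes it.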
I can agree, if I can suggest that no amount of manipulation is good. Under the model you propose, only some methods of manipulation are to be frowned upon. IMO, any technique that analyzes a page for search engine suitability, thus increasing its chances of better placement, is to be shunned. Likewise for automatic engine submitters, because their only purpose is to submit large quantities of pages; otherwise it would be convenient to submit by hand and let the engines find the rest.
>Very good, but let's be clear. You don't do that
>to ensure the same content is seen by search engine
>and human, you do it to get the maximum boost possible.
No, I do it on the human visible page so that the site I am linking to rightfully receives traffic from the link. I place a matching link on the cloaked page so that the site I am linking to rightfully gets credit for it at the search engines. The boost is a byproduct for anyone that enters into reciprocal linking agreements.
>In other words, you wouldn't include links that
>you knew would worsen your position. Would you?
Whatever links appear on my human page appear on the equivalent cloaked page too. You are right though, I try to stay away from linking to sites that would worsen my position. I would imagine you do as well?
>I agree that universal cloaking is not the answer.
>I think zero cloaking is the answer.
You may be right.
I read the article and am at a complete loss to understand how you can draw a conclusion regarding reverse engineering from what it says – even for the US, and much less for the rest of the world.
It seems to me that a low-level US government agency has made an interpretation of US law that would have to be upheld by the US Supreme Court first of all, and then turned into an international convention after that – we're talking 8 years minimum for this, whatever it is they have decided upon.
My colleagues and I analyze the search results in order to be able to present more relevant information to the SEs, not to spam them.
As for relevancy, I would not position any client for a search phrase not mentioned on their site.
Alan, you have a lot of good points there. But you are not as neutral as you try to make out. You represent a software package that is competing with SEO. I have not tried it, so I will not pass any judgement, but there are people who classify it as spamming software...
With that in mind, I understand why you think that all SEO is spam and manipulative, or whatever wording is used.
The whole point is that it is not the technique used, but the words that you are targeting. If you are relevant, it is OK to cloak or use other ways to rank high. I cloak solely to protect the hard work I put into reverse engineering. Back in '97 I had my code ripped off by a competitor – that will not happen again.
You are a good moderator.:)
> if I can suggest that no amount of manipulation is good
“Manipulation” is a bad word. If an SEO helps a client to structure their page to make its theme/keywords more clear, this should help the client, the search engines and Web users. When you get to the level of keyword density analysis, this is IMO reverse engineering (unless the search engine tells you on a help page what keyword density it is looking for!).
If, at the search engine, the act of performing a search establishes a context, and that context is preserved and built upon by a well structured page following a click through to that page, then things are working. But this can all be achieved without IP cloaking. IP cloaking is only done for bad or to prevent bad – but doing it to prevent bad is surrender by the SEO, and agreeing to it is surrender by the search engine. I know cloaking is a Star Trek term, but I prefer the Star Wars analogy - anyone who cloaks has moved to the "dark side" of The Force. Darth Vader wears a cloak!
Now to the question of authoritative links and spam. Answer these questions in order, and let me know which one you answer “No” to, and why.
1) Is spam “anything done purely to artificially manipulate a ranking”?
2) Is there any reason for putting authoritative links on an IP cloaked page, other than to artificially manipulate a ranking of either your site or the site you are linking to?
I hope you see my point. The “better way” I think will involve some sort of agent based delivery where, as I said before, any user on any platform should be able to see any version of a page if they request it and their platform supports it.
Hello. I guess your response shows the problem with law. Something that can be seen from more than one perspective is open to interpretation. Here is my summary of the relevant points as I see them:
“The United States Copyright Office on Friday endorsed a new federal law making it illegal to ... use so called reverse-engineering to understand how a piece of technology works. The statute goes into effect immediately. The ruling, issued by the Library of Congress, which oversees the copyright office, will be in effect for three years, during which the copyright office will continue to examine its effect.”
I’m no lawyer, but I would be surprised if a search engine (that wanted to) couldn’t find a lawyer to use this ruling to prosecute anyone reverse engineering their algorithm. True, it’s only in the US - but that’s where a lot of the search engines are. An interesting debate is whether someone outside the US is breaking US law if they do something on a US search engine that is illegal in the US – but that’s another thread...
Hello. I hope the above answers a lot of your points. I am not against SEO generally, in fact I'm very much for it, but I am against bad SEO, which I agree needs to be very well specified. The biggest problem with IP cloaking is that it is so often used to hide bad SEO. Maybe we can move the debate onto defining what is SEO, what is good SEO, and what is bad SEO...
Alan, you are stretching the purpose of this law beyond reasonable limits. I can envisage an extension of the international copyright convention to protect books and music. But the algo of an SE? No way.
Under international law, national law ends at the border. No country has jurisdiction in another country. If what you do is legal in your own country, and illegal in another, it is the law of the country where you do it that applies as long as there is no binding treaty between the two countries covering the activity.
E.g. US pornographers are distributing their wares to a number of countries where the stuff is strictly forbidden, but there is no way to get at them. Because what they do is legal in the country where they operate.
Returning to the main topic, I think it's time the debate moved on to defining the activities that might come under the umbrella term "SEO". Then Mikkel and other SEs can work out whether they are happy with each activity. Finally, an informed decision can be made about cloaking, which is in itself passive, but can be used to do wrong or to protect against wrong. Cloaking seems very analogous to a weapon and, in these wild early days of the Web, it is up to us as a self-governing society to decide whether we want to sanction such a weapon.
So, why don't some SEOs pitch in by attempting to classify what they do?
I like open debate, and I believe that, in the last months the community here has managed to do some incredible things, getting dialogue going with search engines, discussing cloaking "mainstream" with Ink, AV, and others.
All of us here are either involved in the SEO business, the search business, or a related field. We are interdependent, therefore, none of us can say, "I'm not biased." I'll admit mine freely, and then move on.
Cloaking will exist as long as there is no other way to rank well for good content without getting ripped off by some clown with less than perfect ethical standards. But aside from cloaking, how about sites that utilize link spamming techniques that have nothing hidden about them at all? These sites don't get penalized, but they degrade the web, because they don't provide the quality information that we are all after.
In the end, I bow before the wisdom of those who've been here longer, and in the business for many years. I appreciate their advice, posts, and the fact that because of them, this forum exists. I suggest, in my humble opinion, people who arrive here, pushing specific products of their own, respect the community. And don't try to convince us you have no agenda, we all do. That is why we are here.
Absolutely. But is that comment aimed at me? Because I'm not here to push a product (I haven't even mentioned one), and I do respect the community, enough to spend the time giving the contribution I have so far. I honestly believe that if SEOs go on cloaking and reverse engineering, much of SEO will pretty much die, and search engines will end up along the lines of Goto (paid placement) or Northern Light (research), neither of which gives an SEO the earning potential of the current range of engines.
I'm only here because Mikkel invited me here at IMS2000. Mikkel assured me this was a professional forum. So please don't get personal just because I take an interest and because my point of view isn't exactly the same as yours.
> And don't try to convince us you have no
> agenda, we all do. That is why we are
I set out my agenda in the first line of my first post. I have an interest in search engine users getting the truly most relevant results for each query they perform. That's why I wanted to contribute to this discussion.
Now, it really is time to move on...
Nobody gets too "personal" here – a great bunch of individuals. Have you participated in other marketing forums with the same professional level of response?
IMO cloaking will not be what turns SE's into Goto "look-a-likes", it's the need for revenue streams and profitability that are driving the SE's this direction. With banner ad CPM costs decreasing, the revenue stream has to come from somewhere else.
Usage of information has to be paid for "somehow"..
AOL, SNAP, Metacrawler are just three current examples of portals looking for alternative revenue streams to be able to continue to provide search index information to users at no cost to them.
Relevant information has been a hot topic, and many condemn the SEs for not producing relevant results. I always tell those individuals that it's time to take another "reality check". The average consumer using a search index is satisfied if they find one or two listings in the top 10 for their "search phrase". They click, they move on, they got what they came for – simple as that.
Anyone that worries about all 10 results of the first page being totally relevant to the search term, is either an information professional (librarian, document analyst,etc), or not ranked in the top 10 for that search term... :)
This what Mikkel said, this is what we are discussing:
"What I am offering is an open debate about where to draw that line and how to secure that only "good cloaking" is getting through to our indexes and getting ranked. I am not sure about it yet and I need your input on this."
I lurk here most of the time, saving my responses for when I feel I can really add something. That said, I have valued your comments. I disagree with some of your beliefs, but I'm glad you're here, neutral or not. All of us are biased, if for no other reason than that we've already drawn some conclusion about something in life, but that doesn't make our comments valueless.
Redzone, right on brother...some people just get too caught up in the little things. If you can't see the forest for the trees, go to "grandmother's house" or something, just quit bumping into me;-) (just a stupid joke there, forgive that) I agree that anyone thinking that all 10 results should be "completely relevant" (whatever that means) should try to place themselves in an SE's position and see if they can program a better algo!!
Mikkel, I think you made a big mistake holding the debate in this forum, despite the fact that many of its inhabitants are skilled and well informed. If you wanted balanced opinions, then a forum with the title "Cloaking - Stealth" wasn't the place to get them. I would guess a few people you invited last week are lurking with horror.
Thanks for your sentiments, Dave and Redzone, and don't worry - I can handle it.
I don't mind reasoned arguments, but I still don't think my fundamental question has been answered:
Cloaking without spamming implies that you should not use authoritative links. Will cloakers give up using authoritative links? No. Why? Because they will lose out to uncloaked pages. The whole point of cloaking is to hide "optimised code". Nobody seems too worried that page-jackers might steal their "unoptimised code", also known as their Web site, and the reputational damage that might cause.
I think we've strayed a long way from where I started this debate, and most of it has been in the wrong direction. I think you might at least reach an answer more quickly if I wasn't getting in the way. And I can get back to my product development! So goodbye, good luck to you all, and Happy Halloween, cloakers!;)
I agree that cloaking can be used for good/evil, and I think that, done properly, it doesn't hurt a search engine, but improves relevance. That is what we're all after, in the end.
Perhaps our solution is already here, pooling the talents of people on all sides of the fence. I don't think cloaking needs to be stopped, but I do know that abuse needs to stop. The very fact so many of us are in this kind of business shows that people from all sides are interested in solving the problem, and coming together, like this, is only the first step.
Anyone see Dragon, the Bruce Lee story? The actor drops a pebble into some water, with the other actor playing his wife, and as they watch the ripples, he says, "see. It has begun."
Anyway, I edited out the comment. My point was just that you are not as neutral as you first stated.
(sorry henki, I was in the middle of splitting the thread and was locking it just while you posted henki - brett)
>Cloaking without spamming implies that you should
>not use authoritative links. Will cloakers give
>up using authoritative links? No. Why? Because
>they will lose out to uncloaked pages.
Alan I think on this one we are re-ploughing the same field, so I will refer you back to my previous answer. Unless there is a new point to be made?
Thank you for that Air, sometimes the obvious is obscured, your words have changed my view regarding "personalised delivery".
If nothing is hidden from the SE's how can they object?
First of all, I don't think we finished off ploughing that field.
Answer these three questions in order. Which one do you first answer “No” to, and why.
1) Is spam “anything done purely to artificially manipulate a ranking”?
2) Is there any reason for putting authoritative links on an IP cloaked page, other than to artificially manipulate a ranking of either your site or the site you are linking to? (You said before something like the linked to site "rightfully gets credit", which is IMO an artificial manipulation of the rankings since the ONLY reason the link is there is to boost a ranking)
3) Are authoritative links on IP cloaked pages spam?
Next question: you cloak to prevent page-jacking of your optimised code, but how much effort do you put into discovering page-jacking of your unoptimised code, i.e. your Web site? Isn't this almost equally important?
Next question. If search engines provided you with the resources to combat bad cloaking, page jacking and bait and switch, such that you could never suffer from the bad effects of cloaking if you didn't use it, would you still cloak? Suppose search engines even notified you by e-mail if they found pages substantially similar to yours? Would you still cloak? Why?
What if search engines reversed the way they currently treated duplicate sites, so the oldest survived rather than the youngest?
Am I getting there yet or are you firmly in Vader's grasp...?
"Artificially Manipulate" is a very broad term... Please constrict your definition a bit.
You seem to really be hung up on "authoritative links" as the only key to SEO today.. I don't use a one, and have thousands of top 10's across several SE's.
If the same "authoritative link" would appear on the page the search engine user views when they click through, then "no", it's not spam IMO.
When you refer to un-optimised pages, remember most of us represent "end clients", and most of mine have attorneys on board to handle those issues. They would rather have me concentrating on getting them targeted SE traffic than playing traffic cop... I've never turned in one so-called spammer in almost five years of SEO. I believe in "karma", and the world is definitely round, so what goes around comes around... :) Besides, my time is more valuable functioning in positive areas, representing my clients.
The SE's won't take the time/resources/expense to install this functionality, so it's up to me to protect my investment/technology/income. I've been doing "personalized delivery" for longer than the nasty word "cloaking" has been around. (Such a cheap sounding definition for what we do).. :)
I'm lost on your idea of the way the major SE's currently handle duplicate web sites.... In most cases, the "oldest" are the survivors...
Now, I've got a question for you.. I've spent thousands of hours building technology, automation, reporting to handle thousands of keywords, for unlimited clients, and protect my code... What do you have that is going to make me want to throw my system in the "recycle bin" ??? :)
>I think it is spam - which I would define as anything done purely to artificially manipulate a ranking.
and, the way I read it, his arguments all stem from that definition, attempting to clarify what "artificially manipulate a ranking" means. And I think the definition needs work. It seems to me that including meta tags on a page or adjusting keyword density could fall under a liberal interpretation of Alan's definition, and so the definition is unsatisfactory to me...
Personally, I like redzone's definition from an earlier post:
1. The content is targeted to the optimized keyword phrase
2. One listing per SE per keyword phrase
Meet these criteria and it's not spam.
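redzone's two criteria are concrete enough to state mechanically. Here is a toy sketch – the function, its inputs, and the substring test for "targeted" are all hypothetical simplifications; real enforcement would obviously be far harder:

```python
def passes_criteria(page_text: str, keyword_phrase: str, listings_per_se: dict) -> bool:
    # Criterion 1: the content actually targets the optimized phrase.
    # (A substring test is a deliberately naive stand-in for "targeted".)
    targeted = keyword_phrase.lower() in page_text.lower()
    # Criterion 2: at most one listing per search engine for this phrase.
    one_each = all(count <= 1 for count in listings_per_se.values())
    return targeted and one_each

page = "A guide to hand-made oak furniture, from workshop to home."
print(passes_criteria(page, "oak furniture", {"AltaVista": 1, "Inktomi": 1}))  # True
print(passes_criteria(page, "oak furniture", {"AltaVista": 3}))               # False
```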
But my intent with this post is to try to foster understanding, not promote a particular definition. I submit that we need to agree on what spam is before discussing ways to accomplish/avoid it.
>Am I getting there yet or are you firmly in Vader's grasp...?
I may be Vader, or is it Luke? Hard to say, both are extremes representing good and evil. I prefer something in between;)
I largely agree with redzone's responses, so I'll address the following, rather than present similar arguments.
>.....Suppose search engines even notified you by e-mail
>if they found pages substantially similar to yours?
>Would you still cloak? Why?
Yes, I would. The engines cannot determine relevancy yet; they can only create a mathematical representation of relevancy when they encounter text that has been formatted using hypertext markup. That language was never intended to represent anything but text documents, and search engines are built to be efficient at indexing those types of documents.
The web is a much more expansive visual medium than just plainly formatted text. I could not imagine going back to designing a site to please the engines and be less than satisfied with the aesthetic elements that are pleasing to people, or even to have to think about it.
There is also no substitute for being able to build a relevant page for a search engine without having to touch the human-visible one. This is a fact that many SEOs face. Some clients simply will not let you touch their site, even if it is only to make changes of the type you have stated to be acceptable optimization.
I won't kid you, I also like targeting each specific engine with relevant pages in the style they like to index. IMO this is necessary because each engine is at a different point of evolution with their algorithm. I don't believe they have done this to differentiate themselves from each other; rather, it is a reflection of their history, financing, and talent.
So let's presume that this discussion is still about discussion with the search engines. If it isn't going to be about cloaking, and let's presume that as a start it can only be about one thing, what would that one thing be, that begins to bring the SE's and the SEO's together in discussion to establish some open policies?
<Some clients simply will not let you touch their site, even if it is only to make changes of the type you have stated to be acceptable optimization>
In an ideal world, SEOs would be brought in at the design stage of preparing a web site. This has not happened to me yet - clients (both large and small) come to me often angered by what they feel has been Internet hype. Me telling them to redesign their sites has (in the main) fallen on deaf ears - many web designers seem to feel we are practicing 'black arts' - and, when I have been able to explain why and how things should be done to optimize sites this is rarely implemented by the web design team.
I, too, have also had code taken and web design companies work with me for 3 months then set themselves up as SEOs - but that is not really my main justification for considering cloaking.
My primary reason is that a typical client is often a well-respected name in a particular field; they should be found on a search engine for a search phrase in that field, but often are not. If someone uses that term and reaches my client's site, which provides a satisfactory solution to the query, surely all parties benefit: the user, the search engine – and my client. The very act of submitting to a search engine is an attempt to manipulate the listings; how is writing copy which appeals to both SE and user any different, provided both are given a relevant and satisfactory response to the searcher's query?
I cannot change the client site until the client respects my knowledge, and that can't happen until I have proved what I can do, so the only way I can provide this service is by creating a 'cloaked' site where I try to adhere to the basic rules which I think are 'ethical' and are echoed elsewhere by others – these are to optimize only for terms truly relevant to the target site (and I have refused many terms for this reason) and to serve one result per SE per term.
I am excited by this potential dialogue between SEs and SEOs and do feel that we should somehow create a 'code of understanding'. In some ways I consider that our industry must be a little like advertising in the early days of TV - the equivalent of spam being subliminal advertising and outrageous boasts. Eventually rules and standards were laid down - and the same will happen here.
This is our opportunity to forge these guidelines and hopefully build an industry which is respected by both clients and the SEs - but, as always this is IMHO. The experience of people in these forums is far beyond mine and I have read the opposing arguments with interest and understand both sides of the discussion. An alternative to what I currently do would always be considered.
You don't use authoritative links? aka reciprocal links? How do you get on at Google?
"oldest" survivors - the chief complaint I hear about page jacking is "Those guys ripped off all my pages, SE re-spidered, sees duplicate sites, thinks my site's moved and WHAM! - my listings are replaced by the page jackers'". I can see that might happen, unless SEs corrected their algo.
Thanks for your input! I think only by working on very tight definitions can we move forward. My definition includes the word purely, which is designed to make it very tight. The Meta Keywords tag has probably been the most abused tag ever, but I think it has its place in pointing out to a search engine key concepts/themes/keywords that the page is designed to express. If I were writing a SE, I might choose ONLY to index a page using the keywords in the META Keywords tag, when spidering highly commercial, competitive areas of the Web. There is absolutely no reason, on a page not designed to be read by humans, to have a link to an external site - other than to gain a boost from that link. This is directly analogous to putting invisible text on a page that can be seen by humans.
"The engines cannot determine relevancy yet"? Who does, then? Using what criteria? I don't think the engines want you to determine their relevancy for them. If that's what this discussion is about, then we will end up at Goto. Which happens to be one of my favourites.
Let's look at it another way. The engines cannot determine relevancy - yet. How are they to improve, when every time they change their algo, SEOs tweak their optimised code to preserve the status quo?
Some great and thoughtful comments. I think a lot of work needs to be done in client education and building trust in SEOs. This can only be achieved if all is open. Remember I don't represent a search engine, but I(personally) have no problem with agent-based delivery. What I have a problem with is IP cloaking. Everything should be open for everybody to inspect - that's respect for the community. And, rather than sanctioning IP cloaking, search engines need to find another way to penalise page jacking, copyright theft and bait and switch.
There are more reasons to cloak than just to hide – targeting is another. I run an SE and I don't think that targeting pages to our spider is wrong. In fact, I believe that the more targeted pages our spider finds, the easier it will be for us to build a good index – as long as it's not spam.
If we get better pages for our spider then we will be able to serve more relevant results on more different queries to more happy users giving us more traffic to send to you ...
... and we all end up happy :)
I don't care what webmasters choose to show us or other SEs, or whether they use cloaking to do so. As long as we can see the page (we can) and are able to decloak it (we are), then I don't see why it should be such a big issue. Basically, I still believe the real challenge is spamming – not cloaking.
Do you think that there would be no more pages stolen if there was no cloaking?
It seems to me, Alan, that you have focused on a few details that you defined yourself – details that I don't even think have much to do with cloaking.
This thread was an attempt to try and give this fine community and everyone involved in cloaking an opportunity to speak on the subject. I invited you here, Alan, because I welcome your input too even though I disagree with many of your conclusions :)
Whether we – at SOL/Kvasir/Ereka – will allow cloaking or not by the end of the day, I don't know yet; we will finally be the ones to make our own decision, but we will base it on what is said here and in the other forums where this issue is being discussed.
I think the engines want help by way of having pages presented to their spider in a format that allows them to categorize and rank what the page is about. If that page contained mostly images, or film clips, without help it would never show up in the search engines. Likewise for pages using flash, or SE unfriendly design in general.
>How are they to improve, when every time they change
>their algo, SEOs tweak their optimised code to preserve >the status quo?
What can I say Alan, humans are competitive, so are search engines. If no one optimized, there would just be a different status quo.
Let me leave you with a thought. If the search engines were participating in this discussion, would we have captured the moment and advanced the capability of search engines, publishers, and/or humankind in general with our discussion thus far?
Let me formalise my position:
1) No problem with Agent-Based delivery.
2) Big problem with IP cloaking because it can be abused, and distinction of abuse from non-abuse is made more difficult by the cloak.
3) IP delivered (for non-cloaking purposes) pages are probably unsuitable for indexing by current search engine technology. Anyway, an IP contains far less information than a User-Agent, and there are very few uses for pure IP delivery other than security.
4) If we operate in a totally open environment, there is far less chance for abuse than in a cloaked environment.
5) Search engines should be free to calculate relevancy and should provide tools to police and report abuse. Being open and honest should not be a disadvantage.
So, a question for SEs. If being open and honest is currently a disadvantage, what are you proposing to do about it?
Even though you don't think so, many sites use IP delivery for reasons other than cloaking, so I stand by my statement above.