| 7:01 pm on Aug 2, 2004 (gmt 0)|
|Is this Ethical & the proper use of cloaking? |
I assume you mean according to Google's definition? Maybe. It's tough to know how they really stand on cloaking, because they seem to tolerate it in so many sites, yet completely decry it on their pages outlining what spam is. My feeling is that it would be acceptable to them.
However, this raises the question: can their cloaking detection methods tell the difference between acceptable and unacceptable cloaking? Probably not.
| 4:00 pm on Aug 8, 2004 (gmt 0)|
Here is something to think about: I would personally be irritated if I were searching for something and was asked to pay for it before I even saw it. So in that sense I think Google would be against this, since it would leave searchers unhappy. Technically it is cloaking, although it seems borderline. Given how much cloaking Google tolerates, though, I would think this would get by, and I doubt people would report it, because it doesn't seem like blatant cloaking.
Just some thoughts. Good luck.
| 3:47 pm on Aug 9, 2004 (gmt 0)|
I would somehow be upfront about letting people know that this is not free content; perhaps show something in the title/description.
Another idea would be to let SEs index some general content, let people see that same general content, and offer them the additional content for a fee.
My dilemma is that I would like to do this in an ethical manner for SEs and our customers.
| 4:06 pm on Aug 9, 2004 (gmt 0)|
This may be (a little) off topic, but I notice that Google News carries stories that require a subscription to the news service to access.
It seems to me that Google usually displays a warning that a subscription is required, too. I wonder if there is a meta tag used to inform Google of that, or some other criterion that Google picks up on?
I think you ought to email Google and ask for an opinion for your situation.
| 5:43 pm on Aug 9, 2004 (gmt 0)|
Google News is where I got this idea from, so I don’t think it’s off topic.
If you search for something contained within Webmasterworld’s subscriber only forums, you can usually find it on Google but only paid customers can get to the content. Webmasterworld does this by having “blurbs” on the index pages.
That is one of the approaches I was considering.
Another would be detecting whether a bot or a user is visiting and showing each the appropriate content. The advantage of this approach is that the SEs would have all of our content; the disadvantage is that this IS cloaking and could get me banned.
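For what it's worth, the bot-or-user detection described above usually comes down to sniffing the User-Agent header. Here is a minimal sketch; the token list and function names are my own illustration, not any sanctioned method, and as the post says, serving different content this way IS cloaking and is fragile besides, since anyone can send a fake user agent.

```python
# Illustrative 2004-era user-agent sniffing. The spider tokens below
# are examples only; this is exactly the risky cloaking approach
# discussed above, shown here just to make the mechanism concrete.

KNOWN_BOT_TOKENS = ("googlebot", "slurp", "msnbot", "teoma")

def is_search_bot(user_agent):
    """Return True if the User-Agent string looks like a known spider."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

def select_content(user_agent, full_page, teaser):
    """Serve the full article to spiders, the teaser to everyone else."""
    return full_page if is_search_bot(user_agent) else teaser
```

Note that nothing stops a human from sending `Googlebot/2.1` as their user agent and reading the full page for free, which is one practical reason (besides a ban) this approach is shaky.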
| 7:12 pm on Aug 9, 2004 (gmt 0)|
I didn't mean your post was off topic. I meant my reply might be off topic :)
| 3:55 pm on Aug 10, 2004 (gmt 0)|
|Google News is where I got this idea from, so I don’t think it’s off topic. |
I was under the impression that Google News operates on the basis of subscriber-submitted news feeds. That is more secure, since anyone can fake a user agent.
IMHO the only proper use of "cloaking" is that achieved by personalisation for automatically logged-in users; they will see somewhat different content for the same URLs.
| 7:27 pm on Aug 10, 2004 (gmt 0)|
|IMHO the only proper use of "cloaking" is that achieved by personalisation for automatically logged-in users; they will see somewhat different content for the same URLs.|
-- So, if SEs see one version of a page with less content, and a customer who is logged in can see more/full content, would that be acceptable?
| 12:08 pm on Aug 11, 2004 (gmt 0)|
|So, if SEs see one version of a page with less content, and a customer who is logged in can see more/full content, would that be acceptable?|
I think that would depend on whether the customer sees fuller but still relevant content compared to the version shown to SEs. Say Newspaper X has short abstracts of articles on its front page, but getting the full article requires purchasing access and logging in. As long as the final content is relevant to what is advertised openly, it should be fair play. Either way, a spider can't log in and verify that the content is different. What should matter is that the content is not changed depending on user agent.
| 2:00 pm on Aug 16, 2004 (gmt 0)|
Thanks for sharing your thoughts on this. It will help us make a better decision to keep our customers & SEs happy.
| 10:18 pm on Aug 18, 2004 (gmt 0)|
I know some will not agree with me, but it is ethical as long as you think it is. There is no law against cloaking; you are not doing anything illegal, you are just exploiting flaws in the system without actually causing harm to anyone. If you think it is unethical and morally cannot stand to see it, or feel bad every time you look at it, remove it. When it all boils down, it's really entirely up to you.
| 6:35 pm on Aug 21, 2004 (gmt 0)|
Have you considered offering most of the article for free (enough to get readers hooked :), with the remainder of the article available to members only?
If you give, say, half and half, the first half lets people get the gist of the article and also gives the spiders enough content to index.
Just a thought... at least it does not require cloaking methods.
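The half-free, half-members idea above can be sketched in a few lines. The function name and the word-boundary split are my own illustration; any real implementation would probably split on a paragraph or sentence boundary instead.

```python
# Hypothetical sketch of the teaser split suggested above: the first
# fraction of the article is public (and indexable), the remainder is
# shown to members only. Splitting on words keeps the teaser readable.

def split_article(text, fraction=0.5):
    """Return (free_teaser, members_only_remainder), cutting the
    article at a word boundary near the requested fraction."""
    words = text.split()
    cut = max(1, int(len(words) * fraction))
    return " ".join(words[:cut]), " ".join(words[cut:])
```

The same teaser would be served to spiders and to non-member visitors alike, which is why this approach avoids cloaking entirely.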
| 9:31 pm on Aug 21, 2004 (gmt 0)|
I have done exactly what you are considering. It had worked wonders since September 2002, but last week I got banned and had my entire site dropped from Google. It was a very profitable period, but it sure does suck being out of the game now. I had about 40,000 subscriber pages indexed, and they generated lots of traffic. The main keyword, which had been my main income stream before, suddenly became all but irrelevant - I got hits directly to my articles rather than to my main page. MUCH more potent, as users looking for a specific article are more likely to pay than users going to the more generic main page.
I personally consider indexing subscriber pages completely ethical. If I hadn't done it, some of my users would never have found my pay-only pages. Since I have unique content which my users are willing to pay for... I don't see the harm. For me it's just like the subscription pages on Google News. If Google had an option to submit subscriber-only pages, I would have used it.
The technique I used is described here:
Sticky me if you have any comments. I wish I had used an SEO consultant the first time around. I presume he might have convinced me to move the cloaked pages to another domain/IP, which just might have prevented my getting dropped.
| 5:53 am on Sep 6, 2004 (gmt 0)|
I have seen sites where, if the user is not logged in (a bot included), they are given only a portion of the page. This gives content for the SE and information for the user if they want to join/log in.
If an SE has full access to your site, the pages could be cached and users would read the cache. The above approach is not cloaking, but simply checks whether a user is logged in. This seems like a reasonable solution you might try.
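The login-check approach above can be sketched as follows. The function and parameter names are my own illustration; the point is that every visitor at a given URL, spider or human, gets the same page unless they are logged in, so nothing depends on the user agent.

```python
# Minimal sketch of login-based gating as described above: one URL,
# one decision, based only on session state (logged in or not) --
# never on the User-Agent. Names here are illustrative assumptions.

def render_page(article, excerpt_chars, logged_in):
    """Show the full article to subscribers; everyone else (spiders
    included) gets the same excerpt plus a sign-up prompt."""
    if logged_in:
        return article
    return article[:excerpt_chars] + "... [Log in to read the full article]"
```

Since spiders see exactly what a logged-out human sees, what gets indexed (and cached) is only the excerpt, which also addresses the cache-leak concern above.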