Home / Forums Index / Marketing and Biz Dev / Cloaking
Forum Library, Charter, Moderator: open

Cloaking Forum

    
Is this Ethical & the proper use of cloaking?
Aman




msg:678118
 5:44 pm on Aug 2, 2004 (gmt 0)

I have a site with thousands of pages of content that my customers pay for. I would like to let Google and other SEs crawl my content so people can search for it, but once they click through, they would need to log in (if they were already a customer) or pay to view the content.
I would use some method of detecting bots versus browsers and show each the appropriate content. I have read about a few ways of doing this: detecting the IP address, JavaScript, cookies, etc.
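For illustration, a minimal Python sketch of the user-agent check being described (the bot tokens and function names here are hypothetical; note the User-Agent header is trivial to fake, so real detection usually also verifies crawler IPs):

```python
# Hypothetical sketch of user-agent-based bot detection.
# NOTE: the User-Agent header can be faked, so production setups
# usually also verify crawler IP addresses (e.g. via reverse DNS).

KNOWN_BOT_TOKENS = ("googlebot", "slurp", "bingbot", "msnbot")

def is_search_engine(user_agent: str) -> bool:
    """Naive check: does the User-Agent mention a known crawler?"""
    ua = (user_agent or "").lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

def page_for(user_agent: str, logged_in: bool) -> str:
    """Decide which version of the page to serve."""
    if is_search_engine(user_agent):
        return "full_article"        # let the crawler index everything
    if logged_in:
        return "full_article"        # paying customer
    return "login_or_pay_teaser"     # everyone else hits the paywall
```

Serving different content to the crawler and to an anonymous visitor at the same URL is exactly the behaviour that counts as cloaking.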

Other parts of my site are very successful on search engines and I don't want to risk being banned, but I have tons more content that is not indexed, because I have not tried cloaking yet.

Is this Ethical & the proper use of cloaking?

 

volatilegx




msg:678119
 7:01 pm on Aug 2, 2004 (gmt 0)

Is this Ethical & the proper use of cloaking?

I assume you mean according to Google's definition? Maybe. It's tough to know where they really stand on cloaking, because they seem to tolerate it on so many sites, yet completely decry it on their pages outlining what spam is. My feeling is that it would be acceptable to them.

However, this raises the question: can their cloaking-detection methods tell the difference between acceptable and unacceptable cloaking? Probably not.

tfanelli




msg:678120
 4:00 pm on Aug 8, 2004 (gmt 0)

Here is something to think about: I would personally be irritated if I was searching for something and was asked to pay for it before I even saw it. In that sense, I think Google would be against this, since it would yield unhappy searchers. Technically it is cloaking, although it seems borderline. Based on how much cloaking goes on that Google tolerates, I would think this would get by, and I doubt people would report it, because it doesn't seem like blatant cloaking.

Just some thoughts. Good luck.

Aman




msg:678121
 3:47 pm on Aug 9, 2004 (gmt 0)

I would, somehow, be upfront about letting people know that this is not free content; perhaps show something in the title/description.

Another idea would be to let SEs index some general content, let people see that same general content, and offer them the additional content for a fee.

My dilemma is that I would like to do this in a manner that is ethical toward both the SEs and our customers.

volatilegx




msg:678122
 4:06 pm on Aug 9, 2004 (gmt 0)

This may be (a little) off topic, but I notice that Google News carries stories that require a subscription to the news service to access.

It seems to me that Google usually displays a warning that a subscription is required, too. I wonder if there is a meta tag used to inform Google of that, or some other criterion that Google picks up on?

I think you ought to email Google and ask for an opinion on your situation.

Aman




msg:678123
 5:43 pm on Aug 9, 2004 (gmt 0)

Google News is where I got this idea from, so I don’t think it’s off topic.

If you search for something contained within WebmasterWorld's subscriber-only forums, you can usually find it on Google, but only paid customers can get to the content. WebmasterWorld does this by putting "blurbs" on the index pages.
That is one of the approaches I was considering.

Another would be detecting whether a bot or a user is visiting and showing each the appropriate content. The advantage of this approach is that the SEs would have all of our content; the disadvantage is that this IS cloaking and could get me banned.

volatilegx




msg:678124
 7:12 pm on Aug 9, 2004 (gmt 0)

I didn't mean your post was off topic. I meant my reply might be off topic :)

Lord Majestic




msg:678125
 3:55 pm on Aug 10, 2004 (gmt 0)

Google News is where I got this idea from, so I don’t think it’s off topic.

I was under the impression that Google News operates on the basis of subscriber-submitted news feeds. That is more secure, since anyone can fake a user agent.

IMHO the only proper use of "cloaking" is personalisation for automatically logged-in users; they will see somewhat different content at the same URLs.

Aman




msg:678126
 7:27 pm on Aug 10, 2004 (gmt 0)

"IMHO the only proper use of "cloaking" is personalisation for automatically logged-in users; they will see somewhat different content at the same URLs."

-- So, if SEs see one version of a page with less content, and a logged-in customer can see the full content, would that be acceptable?

Lord Majestic




msg:678127
 12:08 pm on Aug 11, 2004 (gmt 0)

So, if SEs see one version of a page with less content, and a logged-in customer can see the full content, would that be acceptable?

I think that would depend on whether the customer sees fuller content that is still relevant to what was shown to the SE. Say Newspaper X has short abstracts of articles on its front page, but getting the full article requires purchasing access and logging in. As long as the final content is relevant to what is advertised openly, it should be fair play. Either way, a spider can't log in and verify that the content is different. What should matter is that the content is not changed depending on the user agent.
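A minimal sketch of that login-based approach (the names and structure here are hypothetical): the same URL serves the abstract to everyone, bots included, and appends the full body only when the visitor's session shows them logged in. The user agent is never consulted.

```python
from dataclasses import dataclass

@dataclass
class Article:
    abstract: str   # openly advertised teaser, shown to everyone
    body: str       # subscriber-only content

def render(article: Article, logged_in: bool) -> str:
    """Same abstract for everyone; add the body only for subscribers."""
    if logged_in:
        return article.abstract + "\n" + article.body
    return article.abstract + "\n[Subscribe to read the full article]"
```

Because the decision depends only on login state, a spider and an anonymous visitor see identical pages, which is what keeps this on the safe side of the cloaking line.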

Aman




msg:678128
 2:00 pm on Aug 16, 2004 (gmt 0)

Thanks for sharing your thoughts on this. It will help us make a better decision and keep our customers & SEs happy.

Blue Gravity




msg:678129
 10:18 pm on Aug 18, 2004 (gmt 0)

I know some will not agree with me, but it is ethical as long as you think it is. There is no law against cloaking; you are not doing anything illegal, you are just exploiting flaws in the system without actually causing harm to anyone. If you think it is unethical and morally cannot stand to see it, or feel bad every time you look at it, remove it. When it all boils down to it, it's really entirely up to you.

webforumz




msg:678130
 6:35 pm on Aug 21, 2004 (gmt 0)

Have you considered offering most of the article for free (enough to get readers hooked :), with the remainder available to members only?

If you split it, say, half and half, the first half would let people get the gist of the article and would also give the spiders enough content to index.

Just a thought... at least it does not require cloaking methods.
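A rough sketch of that half-and-half split, assuming articles are stored as lists of paragraphs (a hypothetical structure; the ratio is tunable):

```python
def split_article(paragraphs: list[str], free_ratio: float = 0.5):
    """Return (free_part, members_part), split at roughly free_ratio.

    At least one paragraph is always free, so spiders and visitors
    always have something to index or read.
    """
    cut = max(1, int(len(paragraphs) * free_ratio))
    return paragraphs[:cut], paragraphs[cut:]
```

The free part is what every visitor and every spider receives; the members part is rendered only behind the login, so no user-agent sniffing is needed.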

new_shoes




msg:678131
 9:31 pm on Aug 21, 2004 (gmt 0)

Hi Aman,

I have done exactly what you are considering. It worked wonders from September 2002, but last week I got banned and had my entire site dropped from Google. It was a very profitable period, but it sure does suck being out of the game now. I had about 40,000 subscriber pages indexed, and they generated lots of traffic. The main keyword, which had been my main income stream before, suddenly became all but irrelevant: I got hits directly to my articles rather than to my main page. MUCH more potent, as users looking for a specific article are more likely to pay than users landing on the more generic main page.

I personally consider indexing subscriber pages completely ethical. If I hadn't done it, some of my users would never have found my pay-only pages. Since I have unique content which my users are willing to pay for... I don't see the harm. For me it's just like the subscription pages on Google News. If Google had an option to submit subscriber-only pages, I would have used it.

The technique I used is described here:
[webmasterworld.com...]

Sticky me if you have any comments. I wish I had used an SEO consultant the first time around. I presume he might have convinced me to move the cloaked pages to another domain/IP, which just might have prevented me from getting dropped.

XtendScott




msg:678132
 5:53 am on Sep 6, 2004 (gmt 0)

I have seen sites where, if the user is not logged in (which also covers bots), they are given only a portion of the page. This gives content for the SE, and information for the user if they want to join or log in.

Bear in mind that whatever an SE has access to could be cached, and users could read the cache instead. The above approach is not cloaking, since it only checks whether a user is logged in. It would seem a reasonable solution you might try.

