I have other parts of my site that are very successful on search engines, and I don't want to risk being banned, but I have tons more content that is not indexed because I have not tried cloaking yet.
Is this Ethical & the proper use of cloaking?
I assume you mean according to Google's definition? Maybe. It's tough to know how they really stand on cloaking, because they seem to tolerate it in so many sites, yet completely decry it on their pages outlining what spam is. My feeling is that it would be acceptable to them.
However, this raises the question: can their cloaking detection methods tell the difference between acceptable and unacceptable cloaking? Probably not.
Just some thoughts. Good luck.
Another idea would be to let SEs index some general content, let people see that same general content, and offer them the additional content for a fee.
My dilemma is that I would like to do this in an ethical manner for SEs and our customers.
It seems to me that Google usually displays a warning that a subscription is required, too. I wonder if there is a meta tag used to inform Google of that, or some other criterion that Google picks up on?
I think you ought to email Google and ask for an opinion on your situation.
If you search for something contained within Webmasterworld's subscriber-only forums, you can usually find it on Google, but only paid customers can get to the content. Webmasterworld does this by having "blurbs" on the index pages.
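The blurbs-on-index-pages idea can be sketched roughly as follows. This is a minimal illustration, assuming a hypothetical list of (title, url, full_text) articles; `make_blurb` and `render_index` are made-up helper names, not anything Webmasterworld actually uses. The crawlable index carries short excerpts, while the full text stays behind the paywall.

```python
def make_blurb(text: str, max_words: int = 30) -> str:
    """Return the first few words of an article, for a crawlable index page."""
    words = text.split()
    if len(words) <= max_words:
        return text
    return " ".join(words[:max_words]) + " ..."

def render_index(articles) -> str:
    """Render a simple HTML index from (title, url, full_text) tuples.

    Spiders index the titles and blurbs; the linked pages require a login.
    """
    items = []
    for title, url, full_text in articles:
        items.append(
            f'<li><a href="{url}">{title}</a><p>{make_blurb(full_text)}</p></li>'
        )
    return "<ul>\n" + "\n".join(items) + "\n</ul>"
```

Since every visitor (spider or human) sees the same index page, nothing is served conditionally, which is what keeps this approach out of cloaking territory.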
That is one of the approaches I was considering.
Another would be detecting if a bot or user is visiting and show them the appropriate content. The advantage to this approach would be that the SEs would have all of our content but the disadvantage would be that this IS cloaking and could get me banned.
Google News is where I got this idea from, so I don’t think it’s off topic.
I was under the impression that Google News operates on the basis of subscriber-submitted news feeds. This is more secure, as anyone can fake a user agent.
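To see why user-agent checks are weak, consider how easily the header is forged. A minimal sketch, assuming a placeholder URL: any client can present a crawler's User-Agent string, so a server that serves different content based on that header alone can be fooled.

```python
import urllib.request

def build_request(url: str, user_agent: str) -> urllib.request.Request:
    """Build a request presenting an arbitrary User-Agent string."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

# Anyone can send the same header a real crawler would:
googlebot_ua = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)
req = build_request("https://example.com/article", googlebot_ua)
# urllib.request.urlopen(req) would now fetch the page "as Googlebot".
```

This is why checking a verified login, rather than the user agent, is the safer way to decide what a visitor gets to see.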
IMHO the only proper use of "cloaking" is that achieved by personalisation for automatically logged-in users; they will see somewhat different content at the same URLs.
So, if SEs see 1 version of a page, with less content & a customer that is logged in, can see more/full content, would that be acceptable?
I think that would depend on whether the customer sees fuller but still relevant content compared to what is shown to the SE. Say Newspaper X has short abstracts of articles on its front page, but getting the full article requires purchasing access and logging in. As long as the final content is relevant to what is advertised openly, it should be fair play. Either way, a spider can't log in and verify that the content is different. What should matter is that the content is not changed depending on user agent.
You could give, say, half and half: the first half would let people get the gist of the article and also give spiders enough content to index.
Just a thought... at least it does not require cloaking methods.
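The half-and-half idea above can be sketched as follows. This is a rough illustration, assuming a hypothetical `is_subscriber` flag supplied by your own authentication layer; `visible_portion` is a made-up name. Everyone, spiders included, gets the same teaser at a given URL, so nothing varies by user agent; only a verified login unlocks the rest.

```python
def visible_portion(article_text: str, is_subscriber: bool) -> str:
    """Return the full article for subscribers, the first half otherwise."""
    if is_subscriber:
        return article_text
    words = article_text.split()
    teaser = " ".join(words[: len(words) // 2])
    return teaser + " ... [log in to read the full article]"
```

Because the decision hinges on login state rather than on who is asking, a spider and an anonymous human see identical pages, which is the property that keeps this out of cloaking territory.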
I have done exactly what you are considering. It worked wonders from September 2002 until last week, when I got banned and had my entire site dropped from Google. It was a very profitable period, but it sure does suck being out of the game now. I had about 40,000 subscriber pages indexed, and it generated lots of traffic. The main keyword, which had been my main income stream before, suddenly became all but irrelevant: I got hits directly to my articles rather than to my main page. MUCH more potent, as users looking for a specific article are more likely to pay than users going to the more generic main page.
I personally consider indexing subscriber pages completely ethical. If I hadn't done it, some of my users would never have found my pay-only pages. Since I have unique content which my users are willing to pay for... I don't see the harm. For me it's just like the subscription pages on Google News. If Google had an option to submit subscriber-only pages, I would have used it.
The technique I used is described here:
Sticky me if you have any comments. I wish I had used an SEO consultant the first time around. I presume one might have convinced me to move the cloaked pages to another domain/IP, which just might have prevented my getting dropped.
If an SE has access to your site, the pages could be cached, and users would read the cache. The approach above is not cloaking; it simply checks whether a user is logged in. That would seem to be a reasonable solution you might try.
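On the caching concern specifically: one widely supported safeguard is the `noarchive` robots meta tag, which asks engines not to keep a cached copy of the page (Google honors this), so a teaser page can be indexed without the cache exposing anything extra:

```html
<meta name="robots" content="noarchive">
```

This goes in the `<head>` of each page you want indexed but not cached; it does not affect whether the page ranks, only whether a cached copy is offered.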