
Forum Moderators: Robert Charlton & goodroi


Showing Different Content to Google Bot

     
3:33 am on Jun 20, 2005 (gmt 0)

New User

10+ Year Member

joined:Feb 20, 2005
posts:26
votes: 0


Hi,

I have a site with short movies on it. Guest users can only see the main index page, which links to individual movie pages. Each movie page contains basic information about that movie, plus a discussion forum where users can discuss it and use other features. Users need to register/sign up (registration is completely free) and log in to see the movie pages and use everything else.

So I would like Google to index these movie pages as well, and I was thinking about authenticating Googlebot by its IP address...

In theory, the page I show to users and to Google is the same; users just need to authenticate to see it, whereas Googlebot would be implicitly authenticated. Again, since registration is free, the content I show to registered users and to Google, which would effectively be just another registered user, would be identical...

I know that the New York Times and many others do this. But do you guys think this would create a problem and get my site banned? Would it be considered cloaking?
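If you go the bot-authentication route, matching raw IP addresses is brittle because Google's crawl ranges change over time. A more robust sketch (a hypothetical illustration, not something described in this thread) is a reverse-DNS lookup on the visiting IP, followed by a forward lookup to confirm the hostname really maps back to that IP:

```python
import socket

def is_verified_googlebot(ip, reverse=socket.gethostbyaddr, forward=socket.gethostbyname):
    """Hypothetical check: reverse-DNS the IP, require a Google hostname,
    then forward-resolve that hostname and confirm it maps back to the IP.
    The resolver functions are parameters so they can be swapped out in tests."""
    try:
        host = reverse(ip)[0]
    except OSError:
        return False
    # A spoofed User-Agent can't fake this: the attacker doesn't control
    # reverse DNS for Google's address space.
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward(host) == ip
    except OSError:
        return False
```

The forward-confirm step matters: anyone can point reverse DNS for their own IP at a string containing "googlebot.com", but only Google can make the forward lookup of that hostname resolve back to the original address.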

Thanks...

9:21 am on June 21, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:Feb 15, 2005
posts:380
votes: 0


I'd say this would indeed be cloaking, but not all cloaking is bad. Only cloaking for the purpose of deceiving search engines and boosting rankings is against the rules.

But if Google crawls your restricted content, people will be able to get at it. Surely you're going to use the 'noarchive' option in the robots meta tag, but some content may still leak out through the search-result snippet.
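For reference, the robots meta tag being discussed would look something like this (adding 'nosnippet' as well, since the concern above is content leaking through the snippet; treat this as a sketch, not markup endorsed by the thread):

```html
<!-- keep Google from caching the page or showing a text snippet -->
<meta name="robots" content="noarchive, nosnippet">
```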

But if the only thing you actually need to protect is the links to the movie files themselves, there's no need to cloak the whole page containing those links - just configure the server to serve the movie files to authorized users only.
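A minimal Apache sketch of that idea (the paths, extensions, and Basic-auth setup are assumptions for illustration; any auth scheme the site already uses would do):

```apache
# Require a login for the movie files themselves, while the pages
# that merely describe and link to them stay open to everyone.
<FilesMatch "\.(mpg|avi|mov|wmv)$">
    AuthType Basic
    AuthName "Registered members only"
    AuthUserFile /path/to/.htpasswd
    Require valid-user
</FilesMatch>
```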

3:50 pm on June 21, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:June 13, 2005
posts:135
votes: 0


What happens if you change IP addresses/DNS at the same time, and one Googlebot using the new IP/DNS hits the domain and gets the correct content, while another Googlebot still using the old IP/DNS hits the old server and gets different content (a blank page, or even a modified page)? In summary: Googlebot1 gets one set of content for the domain while Googlebot2 gets a different set. Would this be considered cloaking? And if the domain were already under a penalty for duplicate content (a misconfigured webserver), would this push it off the index completely and result in a permanent ban?
4:52 pm on June 21, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member jomaxx is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 6, 2002
posts:4768
votes: 0


I loathe this kind of thing in principle, but I think this case underscores a problem with the conceptualization of your website. If people can't see any of the content available to them, why would they bother to register anyway? How would they know whether it was worth the headache or not?

Better to make the information plainly visible and then require registration in order to view the movies or participate in the forums. (In fact I can't see a valid reason for you to require registration in order to watch the movies either, but that's up to you and at least that doesn't involve misleading Google users.)

5:01 pm on June 21, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member bigdave is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 19, 2002
posts:3454
votes: 0


I wouldn't do it. It will just annoy your potential users.

Think about it. You click on a search result and immediately get hit with a login screen instead of what you are looking for. You don't fill out the form, you hit the back button.

That is why the nytimes *doesn't* do what you suggest as far as the normal search goes. They do it for Google news, but they are allowed to there.

Do a search on [site:nytimes.com] and start clicking on the links while you are not logged in. I tried a couple dozen, and never got the login page.

5:45 pm on June 21, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:June 2, 2005
posts:147
votes: 0


I know of a very credible SEO firm with some large clients that does something more or less the same as this - essentially sniffing for Googlebot and "cloaking" the pages to make them more Google-friendly in a technical sense, as opposed to dubious techniques such as keyword stuffing. I know for sure they would never do anything that would risk getting their clients banned, so I suspect this is OK.
6:46 pm on June 21, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:Feb 15, 2005
posts:380
votes: 0


I wouldn't do it. It will just annoy your potential users.

Think about it. You click on a search result and immediately get hit with a login screen instead of what you are looking for. You don't fill out the form, you hit the back button.

I agree. When I see results like that, I'm so annoyed at wasting my time clicking through that I'm halfway to filing a Google spam report :))

I know of a very credible SEO firm with some large clients who does something which is more or less the same

SEO firms do many things we wouldn't consider safe - they have to take risks if they want to be the best. Perhaps we should ask GoogleGuy what he thinks about this kind of cloaking. I'm afraid Google will not like it.

Keyword stuffing increases keyword density, while serving more SE-optimized content perhaps increases the text/code ratio. Either way, Googlebot is served something different from what the user gets, and in both cases it's something more SEO-optimized.

It's only a matter of time before Google starts detecting this, but SEOs have to take the risk to earn as much as possible, because otherwise they would earn nothing.

7:41 pm on June 21, 2005 (gmt 0)

New User

10+ Year Member

joined:June 16, 2005
posts:18
votes: 0


Can one remove a javascript include, for instance?

I did it once, but for a specific bot I ran on my own server, not for Googlebot (I wouldn't risk it).
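As a sketch of what dropping a script include for a bot might look like (hypothetical names and user-agent tokens throughout; and per the warnings elsewhere in this thread, doing this for Googlebot is at your own risk):

```python
# Assumed user-agent substrings identifying crawlers; not an exhaustive list.
BOT_TOKENS = ("googlebot", "bingbot", "slurp")

def render_head(user_agent: str) -> str:
    """Build the page <head>, omitting the JS include for known bots."""
    is_bot = any(token in user_agent.lower() for token in BOT_TOKENS)
    script = "" if is_bot else '<script src="/js/widget.js"></script>'
    return f"<head><title>Movie page</title>{script}</head>"
```

Note this trusts the User-Agent header, which is trivially spoofed; it only changes presentation, so that's acceptable here in a way it wouldn't be for access control.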

8:26 pm on June 21, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member bigdave is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 19, 2002
posts:3454
votes: 0


Certain types of cloaking are *probably* okay, as long as you are serving google what is basically the same thing you are serving your users.

For example, at one point I removed 2 of my standard navigation links when serving a spider. I didn't want to switch to JS links, because I know several of my users surf with JS off.

I was not serving the spider anything extra, and anyone doing a manual check on the site would quickly see why I was doing it.

I certainly wouldn't tell anyone else that it's safe to do that. But it is certainly safer than what is usually meant by "cloaking".

8:30 pm on June 21, 2005 (gmt 0)

Senior Member from MY 

WebmasterWorld Senior Member vincevincevince is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 1, 2003
posts:4847
votes: 0


I wouldn't do it. It will just annoy your potential users.

Think about it. You click on a search result and immediately get hit with a login screen instead of what you are looking for. You don't fill out the form, you hit the back button.

I agree. When I see results like that, I'm so annoyed at wasting my time clicking through that I'm halfway to filing a Google spam report :))

I must admit to frequently filling in spam reports for such sites, especially scientific journals and newspapers. I know Google doesn't act on them, but somewhere, someone might notice lots of them and consider a policy change.

I agree that everyone has the right not to share content with everyone. I don't doubt that some content is worth money. But it's not right to make anything available to Google if it's not available with just as much ease to me.

9:06 pm on June 21, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member bigdave is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 19, 2002
posts:3454
votes: 0


I know Google doesn't act on them, but somewhere, someone might notice lots of them and consider a policy change.

I don't know whether someone read my reports or it was just chance, but I have reported several pages that require logins, and many of them disappeared from the SERPs within a couple of weeks.

 
