Forum Moderators: Robert Charlton & goodroi
I have a site with short movies on it. Guest users can only see the main index page, which links to individual movie pages; each movie page contains basic information about the movie and a discussion forum where users can discuss that movie and use other features... Users need to register/signup (registration is completely free) and have to login/authenticate to see the movie pages and use all the other features.
So, I would like Google to index these movie pages as well, and I was thinking about authenticating Googlebot by its IP...
In theory, the page I am showing to users and to Google is the same, but users need to authenticate to see that page, whereas Googlebot would be implicitly authenticated. Once again, since registration is free, the content I am showing to registered users and to Google, which would effectively be just another registered user, would be the same...
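For what it's worth, matching Googlebot by a fixed IP list is fragile, since Google's crawler ranges change. A safer sketch of the check (function names here are my own, purely illustrative) is the forward/reverse DNS verification Google itself describes: reverse-resolve the visiting IP, confirm the hostname is under googlebot.com or google.com, then forward-resolve that hostname and make sure it maps back to the same IP.

```python
import socket

# Hostname suffixes that genuine Google crawlers resolve to.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_google_hostname(hostname):
    """Check whether a resolved hostname belongs to Google's crawler domains."""
    return hostname.endswith(GOOGLE_SUFFIXES)

def is_googlebot(ip):
    """Verify a claimed Googlebot IP: reverse DNS, suffix check, then a
    forward lookup that must map back to the same IP (needs network access)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)              # reverse DNS
        if not is_google_hostname(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]      # forward-confirm
    except (socket.herror, socket.gaierror):
        return False
```

The suffix check alone is not enough (anyone can point their own reverse DNS at a hostname containing "googlebot"), which is why the forward-confirming lookup matters.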
I know that the New York Times and many others do this. But do you guys think this would create a problem and get my site banned? Would this be considered cloaking?
Thanks...
But if Google crawls your restricted content, people will be able to access it. Surely you're going to use the 'noarchive' option in the robots meta tag, but some content may still be exposed through the snippet.
But if the only thing you actually need to protect is the movie files themselves, there's no need to cloak the whole page that links to them; just configure the server to serve the files to authorized users only.
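A minimal sketch of that idea (all names hypothetical, and a real site would use its framework's session handling): the page with the movie description and links stays public and indexable, while the download/stream endpoint itself checks for a valid session before serving the file.

```python
# Illustrative only: session token -> username, standing in for real session storage.
ACTIVE_SESSIONS = {"abc123": "alice"}

def may_download(session_token):
    """Only logged-in users may fetch the movie file; the page stays public."""
    return session_token in ACTIVE_SESSIONS

def serve_movie(session_token, path):
    """Return an (HTTP status, body) pair for a movie-file request."""
    if not may_download(session_token):
        return (401, "Login required")        # deny anonymous download
    return (200, "streaming " + path)         # a real server would stream the file
```

With this split there is nothing to cloak: Googlebot and guests see the same public page, and only the file request is gated.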
Better to make the information plainly visible and then require registration in order to view the movies or participate in the forums. (In fact I can't see a valid reason for you to require registration in order to watch the movies either, but that's up to you and at least that doesn't involve misleading Google users.)
Think about it. You click on a search result and immediately get hit with a login screen instead of what you are looking for. You don't fill out the form, you hit the back button.
That is why the nytimes *doesn't* do what you suggest as far as the normal search goes. They do it for Google news, but they are allowed to there.
Do a search on [site:nytimes.com] and start clicking on the links while you are not logged in. I tried a couple dozen, and never got the login page.
I wouldn't do it. It will just annoy your potential users. Think about it. You click on a search result and immediately get hit with a login screen instead of what you are looking for. You don't fill out the form, you hit the back button.
I agree. When I see such results I'm so annoyed at having wasted my time clicking on them that I'm halfway to filing a Google spam report :))
I know of a very credible SEO firm with some large clients that does something more or less the same.
SEO firms do many things we wouldn't consider safe; they have to take risks if they want to be the best. Perhaps we should ask GoogleGuy what he thinks about this kind of cloaking. I'm afraid Google will not like it.
Keyword stuffing increases keyword density, while serving more SE-optimized content perhaps increases the text-to-code ratio. Both serve Googlebot something different from what the user gets, and both serve something more heavily SEO-optimized.
It's only a matter of time before Google starts detecting this, but SEOs have to take the risk to earn as much as possible, because otherwise they would earn nothing.
For example, at one point I removed 2 of my standard navigation links when serving a spider. I didn't want to switch to JS links, because I know several of my users surf with JS off.
I was not serving the spider anything extra, and anyone doing a manual check on the site would quickly see why I was doing it.
I certainly wouldn't tell anyone else that it's safe to do that, but it is far safer than what is usually meant by "cloaking".
I must admit to frequently filling in spam reports for such sites, especially scientific journals and newspapers. I know Google doesn't act on them, but somewhere, someone might notice lots of them and consider a policy change.
I agree that everyone has the right not to share content with everyone, and I don't doubt that some content is worth money. But it's not right to make content available to Google that isn't available just as easily to me.
I know Google doesn't act on them, but somewhere, someone might notice lots of them and consider a policy change.
I don't know whether someone read my reports, or it might have just been chance, but I have reported several pages that require logins and many of them have disappeared from the SERPs within a couple of weeks.