Should I add the noindex meta tag to the pages fed to real users, so that only the cloaked pages are indexed by the search engines? Otherwise I could end up with two copies of the same page indexed and get banned. Am I right on this?
So, my question is whether I should use the noindex tag on the pages served to real users.
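For reference, the tag I'm asking about would sit in the <head> of each user page, something like:

    <meta name="robots" content="noindex">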
Why? Because a user should never be given the search engine content, and the search engines should never be given the user content. Since each entity only ever sees one "version" of the content, duplicate pages shouldn't be an issue if your cloaking tech is up to the task.
I have created a static page with the cloaker. This static page is completely optimized for the engines, and the same text also appears in the page presented to the users. If the cloaked page gets indexed, which is what I want, I don't want the same text from the user page indexed too. So, here comes my question again: should the user page carry a noindex tag, or is that a bad idea, or simply unnecessary? I won't be creating any incoming links to the user page, so my guess is that it won't get indexed for lack of popularity, but I just want to make sure.
It sounds like you're planning to cloak with two public pages (one for users, one for crawlers) rather than just a single public page running a script. Am I close?
...ignoring the evils of client-side cloaking for the time being...
My advice would be to "noindex, nofollow" the "user" pages *and* exclude them via robots.txt. It's no guarantee they won't be crawled, but it makes it pretty unlikely they'd end up in the index, which should help you avoid possible duplicate content issues.
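Concretely, that means a meta tag in the <head> of each user page:

    <meta name="robots" content="noindex,nofollow">

plus a robots.txt rule (the /user/ path here is just an example - use whatever directory your user pages actually live in):

    User-agent: *
    Disallow: /user/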
Alternatively, if you just have one file backed by a script, then all you should really need to do is make sure it's only possible for the crawlers to get crawler content and for users to get user content - as mentioned in earlier posts.
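A rough sketch of what that check might look like server-side - a minimal CGI-style Python script, purely illustrative; the crawler user-agent list and the file names (crawler_page.html, user_page.html) are made up for the example:

    #!/usr/bin/env python3
    # Sketch: serve one of two local files depending on the requesting
    # user-agent. File names and the UA list are examples only.
    import os

    # Substrings that identify major crawlers (illustrative, not complete)
    CRAWLER_SIGNATURES = ("googlebot", "slurp", "bingbot", "msnbot")

    def is_crawler(user_agent):
        ua = (user_agent or "").lower()
        return any(sig in ua for sig in CRAWLER_SIGNATURES)

    def pick_page(user_agent):
        # Crawlers get the optimized page; everyone else gets the user page
        return "crawler_page.html" if is_crawler(user_agent) else "user_page.html"

    if __name__ == "__main__":
        # Under CGI, the requester's user-agent arrives in this env variable
        ua = os.environ.get("HTTP_USER_AGENT", "")
        with open(pick_page(ua)) as f:
            body = f.read()
        print("Content-Type: text/html\n")
        print(body)

The key point is that the two versions are never both reachable at separate public URLs: one script, one URL, two possible responses.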