
Pages fed to real users

add "noindex" tag?

     

rominosj

4:51 pm on May 17, 2004 (gmt 0)

10+ Year Member



Hi,

Should I add a noindex metatag to the pages served to real users, so that only the cloaked pages are indexed by the search engines? Otherwise, I could end up with two copies of the same page indexed and get banned. Am I right about this?

Romino

volatilegx

5:06 pm on May 17, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I could end up with two copies of the same page indexed and get banned, am I right on this?

I wouldn't add a noindex tag. How would you get two copies of the same page indexed? An engine is either going to index one or the other, not both.

Dreamquick

5:07 pm on May 17, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The core of cloaking is that you are confident in your ability to distinguish the SE's crawlers from real users. If you can do this it shouldn't really matter *what* you serve up to users...

Why? Because a user should never be given the search-engine content, and the search engines should never be given the user content. Since each party only ever sees one "version" of the page, duplicate pages shouldn't be an issue if your cloaking tech is up to the task.
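Tony's point, that the whole scheme rests on reliably telling crawlers apart from users, can be sketched roughly as below. This is a minimal illustration only: the crawler tokens and return strings are hypothetical, and real cloaking setups of this era also matched against lists of known crawler IP addresses rather than trusting the User-Agent header alone.

```python
# Hypothetical sketch of user-agent-based content selection.
# Token list is illustrative, not a complete or current crawler list.
CRAWLER_TOKENS = ("googlebot", "slurp", "msnbot")

def select_content(user_agent: str) -> str:
    """Return crawler content for known spiders, user content otherwise."""
    ua = user_agent.lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        return "crawler-optimized page"
    return "user-facing page"
```

The key property is the one Tony describes: each request is routed to exactly one version, so neither audience ever sees the other's copy.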

- Tony

volatilegx

5:12 pm on May 17, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



shouldn't really matter *what* you serve up to users...

Hehe, just don't put a noindex tag on the page you show spiders ;)

rominosj

9:55 pm on May 17, 2004 (gmt 0)

10+ Year Member



Maybe I was not clear enough on what it is I am doing and what I need.

I have created a static page with the cloaker. This static page is fully optimized for the engines, and the same text also appears on the page presented to users. So, if the cloaked page is indexed (which is what I want), I don't want the identical text on the user page indexed too. Then comes my question again: should the page presented to users carry a noindex tag, or is that a bad idea or simply unnecessary? I won't be creating incoming links to the user page, so my guess is it won't get indexed for lack of popularity, but I just want to make sure.
Thanks,
Romino

Dreamquick

11:08 pm on May 17, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Okay, it's becoming a little clearer now (I think)...

It sounds like you're planning to cloak with two public pages (one for users, one for crawlers) rather than just a single public page running a script. Am I close?

If you are running two public pages then to my mind it seems likely that you're going to get the crawlers to index "Page A", then when you get a browser arriving at "Page A" you're going to use something like javascript to push them to "Page B".

...ignoring the evils of client-side cloaking for the time being...

My advice would be to "noindex, nofollow" the "user" pages *and* exclude them via robots.txt - it's no guarantee they won't be crawled, but it's pretty unlikely they'd end up in the index, which should help you avoid possible duplicate-content issues.
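Tony's belt-and-braces approach would look roughly like this (the path shown is a placeholder for wherever the user-facing pages actually live):

```
# robots.txt, at the site root - ask all crawlers to skip the user pages
User-agent: *
Disallow: /user-page.html
```

In addition, each user-facing page would carry `<meta name="robots" content="noindex, nofollow">` in its `<head>`, so that even a crawler that reaches the page anyway is asked not to index it or follow its links.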

Alternatively, if you just have one file backed by a lot of scripting, then all you really need to do is make sure that crawlers can only get crawler content and users can only get user content - as mentioned in earlier posts.

- Tony
