Forum Moderators: open


How to make Google NOT index the same page with DIFFERENT urls

The "duplicate content" problem


Arnett

3:53 pm on Aug 22, 2003 (gmt 0)

10+ Year Member



I am having a problem: [wwWebmasterWorldebhost.com...] & [widgets.com...] are ALL appearing in Google's index, and since they are both the SAME FILE they could look to Google like duplicate content. How do I keep Google from indexing [wwWebmasterWorldebhost.com...] & [widgets.com...] and make it index only [widgets.com?...]? I don't want a penalty for duplicate content.

That also raises another question. If every page in Google appears under up to 3 different URLs, are they really "Searching over 3 billion pages"?

TheTruth

8:04 pm on Aug 22, 2003 (gmt 0)

10+ Year Member



I'm not sure if I understand your post correctly, but if you are wondering about G indexing www.widgets.com and widgets.com separately, I recently posted the very same question. The suggestions I got were informative, but they didn't really explain how G handles all these instances of the same page (in my case www.widgets.com and widgets.com are listed separately, but are different snapshots of my index.htm). The replies told me that www. should be considered a subdomain, and I got various suggestions on how to redirect surfers to either the www or the non-www version.
In my case I would prefer to have G keep the www version in the index and either update or delete the non-www version. I'm lost as to what will work best.
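A common way to do the redirect mentioned above is a server-side 301 from the non-www host to the www host. This is only a minimal sketch, assuming an Apache server with mod_rewrite enabled and .htaccess overrides allowed; "widgets.com" stands in for your own domain, and you can swap the hosts to prefer the non-www version instead.

```apache
# .htaccess sketch: 301-redirect all non-www requests to the www host.
# Assumes Apache with mod_rewrite enabled; widgets.com is a placeholder.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^widgets\.com$ [NC]
RewriteRule ^(.*)$ http://www.widgets.com/$1 [R=301,L]
```

Because the redirect is permanent (301), search engines should eventually consolidate the duplicate listings onto the www version.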

Gus_R

12:52 am on Aug 23, 2003 (gmt 0)

10+ Year Member



If you want to take out the duplicates, put a different page at each URL you don't want indexed and include the meta "noindex" on it.
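For reference, the tag being suggested is the robots meta tag; a minimal sketch of a page head carrying it:

```html
<!-- Placed in the <head> of any page that should stay out of the index -->
<head>
  <meta name="robots" content="noindex,follow">
  <title>Duplicate URL - do not index</title>
</head>
```

"noindex,follow" keeps the page out of the index while still letting the crawler follow its links.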

Searching over 3 billion pages

They have cached more than that; the message is hardcoded.

futureX

9:50 pm on Aug 23, 2003 (gmt 0)

10+ Year Member



It really depends on PR and backlinks. As long as everyone is pointing their links at [widgets.com,...] the other two will fall away.