Forum Moderators: goodroi


duplicate pages .aspx?id=

Not sure what to do next


schatzm

5:35 am on Dec 5, 2006 (gmt 0)

10+ Year Member



I am not well versed in ASP.NET, so forgive the crude description.

Our site currently has 300+ of the wrong pages indexed.

For instance, Google is indexing

[site.com...]

The correct page is listed below.

[site.com...]

From what I understand, the site is largely cookie-driven. Is there any way to use a robots.txt file to disallow that URL pattern, such as:

Disallow: /*?id/

Would this work? Does anyone have any recommendations?

Will Google index the pages from our sitemap.txt that are not linked from the homepage?

[please no specific urls]

[edited by: goodroi at 11:32 pm (utc) on Dec. 5, 2006]

goodroi

9:01 pm on Dec 6, 2006 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Welcome to WebmasterWorld schatzm!

You can use a wildcard in your robots.txt to block Google from indexing the wrong pages, but that does not resolve your main issue: how Google found those pages in the first place. You might have some people linking to your site with bad links, or some of your internal links may be bad. I'd dig deeper into your site and find out what is going on.
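As a sketch, a wildcard rule along these lines should work, assuming the unwanted URLs all contain `?id=` in the query string (note that wildcard matching is a Google extension, not part of the original robots.txt standard, so other crawlers may ignore it):

```
User-agent: Googlebot
Disallow: /*?id=
```

One detail: the pattern `/*?id/` proposed above ends in a slash, which would not match a query string like `?id=123`; matching on `?id=` (or anchoring with `$` if the URLs end there) is closer to what Google's wildcard syntax expects. Also remember that Disallow stops crawling, not necessarily indexing of URLs Google already knows about from links.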

cheers