
robots.txt disallow

How do I ensure re-direct pages are not duplicated?


robinantill

11:05 pm on Feb 18, 2006 (gmt 0)

10+ Year Member



I have a directory script written in PHP that generates links like this: http://www.example.com/the-directory/links/1476 (that is the URL you see if you right-click the link and copy it).

When you click this link, the script adds an extra hit to the count and then redirects you to the linked website. If you then click 'Cached snapshot of this page' you get the message
'cache:http://www.example.com/the-directory/links/1476 - did not match any documents',
but I would expect that in time this URL would be indexed and cached.
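
For reference, the script behind those /links/ URLs does something roughly like this. This is only a simplified sketch; the table, column and connection details here are placeholders, not the real code:

<?php
// Pull the link id out of the URL, e.g. 1476 from /the-directory/links/1476
$id = (int) basename($_SERVER['REQUEST_URI']);

$db = new mysqli('localhost', 'user', 'password', 'directory');

// Count the click (this is the "extra hit" mentioned above)...
$db->query('UPDATE links SET hits = hits + 1 WHERE id = ' . $id);

// ...then look up the destination and send the visitor there.
$result = $db->query('SELECT url FROM links WHERE id = ' . $id);
$row = $result->fetch_assoc();

header('Location: ' . $row['url']);
exit;
?>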

This would then create a duplicate-page problem in Google: you would have both http://www.example.com/the-directory/links/1476
and
http://example.com/ in the index, and they would be exactly the same page.

I want to make sure this does not happen, so if my robots.txt file were as follows:

User-agent: *
Disallow: /the-directory/links/

would this stop Google from indexing the re-direct pages?
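
In case robots.txt alone is not enough, I also wondered whether sending an explicit 301 would help, since I have read that PHP's header('Location: ...') defaults to a 302 status, and a 302 can leave the redirecting URL itself in the index. Again, just a sketch of the idea:

// Send a permanent (301) redirect instead of the default 302, so search
// engines should treat the destination as the real, canonical URL.
header('Location: ' . $row['url'], true, 301);
exit;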

Thanks for any help.

[edited by: engine at 12:23 pm (utc) on Feb. 24, 2006]
[edit reason] examplified [/edit]