Forum Moderators: Robert Charlton & goodroi
I have read:
[webmasterworld.com...]
But that post is over a year old...
From reading that thread I understood that using robots.txt to exclude pages with a currency=* parameter is a bad idea, because every inlink from other pages is lost, and with it any PageRank.
One idea I like is putting <meta name="robots" content="noindex" /> in the head of the dynamically generated pages whenever currency= is set.
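A minimal sketch of that idea, assuming the parameter is literally named "currency" (adjust to your own setup). Using "noindex, follow" rather than plain "noindex" keeps the duplicate page out of the index while still letting crawlers follow its links:

```php
<?php
// Hypothetical helper: decide which robots meta tag (if any) a page
// should emit, based on its query-string parameters.
function robotsMetaFor(array $query): string {
    if (isset($query['currency'])) {
        // Duplicate view of the page: keep it out of the index,
        // but still let link equity flow through its links.
        return '<meta name="robots" content="noindex, follow" />';
    }
    return '';
}

// In the page <head>:
echo robotsMetaFor($_GET);
```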
The other is sending a 301 redirect (header("HTTP/1.1 301 Moved Permanently")) from every page that is reached via a URL with currency= in it.
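A sketch of the 301 approach, assuming the parameter is named "currency" and the chosen currency is remembered some other way (cookie or session), so the redirect target can drop it:

```php
<?php
// Hypothetical helper: rebuild a URL with the currency parameter removed.
function urlWithoutCurrency(string $path, array $query): string {
    unset($query['currency']);
    return $query ? $path . '?' . http_build_query($query) : $path;
}

// At the top of the page, before any output is sent:
if (isset($_GET['currency'])) {
    // (Store $_GET['currency'] in a cookie/session here if needed.)
    $path = strtok($_SERVER['REQUEST_URI'], '?');
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: ' . urlWithoutCurrency($path, $_GET));
    exit;
}
```

Note the header() calls must run before any HTML is output, or PHP will complain that headers were already sent.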
Can someone give me more info on which is the best solution?
thanks a lot!