Could you please help me with blocking pages from being indexed via robots.txt?
This is for a blog where tags and their values are assigned automatically and shown on the home page. When a tag is clicked, it leads to a URL containing '?tagid='.
So, I want to prevent all of those tag URLs from being crawled and indexed by Google. I just want to exclude everything that contains '?tagid='. How do I do that? I see that I could block every URL that contains '?' via robots.txt, but I am concerned that this might also block other important pages.
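For reference, here is a sketch of what I mean, assuming the blog lives at the site root. The first rule is the broad one I'm worried about; the second is the narrower pattern targeting only '?tagid=' (Googlebot supports the `*` wildcard in `Disallow` paths, though it's not part of the original robots.txt standard):

```
User-agent: *
# Too broad: blocks every URL with a query string
# Disallow: /*?

# Narrower: blocks only URLs whose query string contains tagid=
Disallow: /*?tagid=
```

Is the narrower rule the right approach here, or is there a better way?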
Could you please help with this?
Thank you for any and all help :-)