What's the difference between setting a parameter in Google Webmaster Tools to 'No: Doesn't affect page content' (so URLs containing that parameter won't be crawled) and blocking the same query string through robots.txt? Are they the same?
What if I want the pages with this query string removed from Google's search results entirely? Blocking them through robots.txt still leaves the pages in Google's index, but with a snippet saying 'A description for this result is not available because of this site's robots.txt – learn more'.
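For reference, by 'blocking through robots.txt' I mean a wildcard rule like the one below (the parameter name `sessionid` is just a placeholder for my actual query string):

```
User-agent: *
Disallow: /*?sessionid=
```

As I understand it, Googlebot honors the `*` wildcard here, but this only stops crawling, not indexing, which is exactly the problem I'm describing.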
Adding a canonical tag or noindex isn't practical when there are n number of pages. How would you overcome such a situation? Would adding the parameter exclusion through Google Webmaster Tools help?
Thanks for the help!