Hello All,
We want to keep any URLs carrying the ?srsltid parameter from showing up in SERPs for our ecommerce site. I've noticed that rankings for our non-parameterized pages have dropped, and I want to get this set up correctly. I'm planning to add a Disallow rule to robots.txt, but is there also a way to use an X-Robots-Tag directive to stop crawlers from indexing any page that has the ?srsltid parameter? Any help is greatly appreciated.
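For context, something along these lines is what I had in mind — a rough .htaccess sketch for Apache 2.4 (assuming mod_headers is enabled; the regex is just my guess at matching srsltid anywhere in the query string):

```apache
# Untested sketch: add a noindex header to any response whose
# query string contains an srsltid parameter.
<If "%{QUERY_STRING} =~ /(^|&)srsltid=/">
    Header set X-Robots-Tag "noindex"
</If>
```

I'm not sure whether pairing this with a robots.txt Disallow makes sense, or if one of the two approaches is preferred.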