I have disallow tags and author links coming from SERP
sharelocalbusiness
6:17 am on Apr 15, 2015 (gmt 0)
I am using this format to block tag and author pages from my website's results, because a Google update said that "links coming from tags will be treated as duplicate pages". Am I doing it right?
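The actual rules aren't quoted in the post, but a typical robots.txt for blocking tag and author archives (assuming WordPress-style `/tag/` and `/author/` paths, which the poster hasn't confirmed) would look like:

```
User-agent: *
Disallow: /tag/
Disallow: /author/
```

Note that these paths are an assumption; the correct Disallow lines depend on how your CMS actually structures those URLs.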
It might have worked if you had done it from day 1, but Google already knows those pages exist and what's on them, so it's kinda too late now. All you've done is prevent them from crawling the pages, and hence from learning about any future cleaning-up you might happen to do.
Very near the top of my "things it took me years to wrap my brain around" list is this: Crawling and indexing are different things. Anything you put in robots.txt only affects crawling. You may, instead, need to look at indexing.
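To get already-indexed pages out of the index, the usual approach (not spelled out in the reply above, so treat this as a sketch) is a noindex directive on each tag and author page. Crucially, those pages must stay crawlable, i.e. NOT blocked in robots.txt, or the crawler can never see the directive:

```html
<!-- On each tag/author page. If robots.txt blocks the page,
     Google never fetches it and never sees this tag. -->
<meta name="robots" content="noindex, follow">
```

Once the pages have been recrawled and dropped from the index, you could then add the robots.txt Disallow if you still want to save crawl budget.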