If I'm trying to get Google to realize that a large number of pages no longer exist, would adding Disallow rules for those pages to robots.txt speed this up or slow it down?
I have two goals: 1. To stop Googlebot from visiting the pages. 2. To remove them from the index.
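
For reference, this is the kind of rule I'm considering adding (the path below is just a placeholder for the section containing the removed pages):

    # Hypothetical example of the Disallow rule in question
    User-agent: *
    Disallow: /old-pages/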