Forum Moderators: Robert Charlton & goodroi
- By putting a Disallow directive in robots.txt --- Would that stop Googlebot from picking up the pages and removing them? Would they hang in limbo forever because Googlebot can no longer reach them? Would the URLs remain in the index?
... Or ...
- Would Googlebot read the directive and immediately remove the pages, rather than leaving them hanging?
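For context, here's a minimal sketch of how a compliant crawler interprets a Disallow rule, using Python's standard `urllib.robotparser`. The domain and paths are hypothetical; the point is that Disallow blocks *crawling* of matching URLs, which is separate from whether they stay in the index.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking an old section of the site
robots_txt = """\
User-agent: Googlebot
Disallow: /old-section/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler checks before fetching. Disallow prevents the fetch;
# it does not by itself remove already-indexed URLs.
print(parser.can_fetch("Googlebot", "https://example.com/old-section/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/new-section/page.html"))  # True
```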
Appreciate all help.
Thanks.
Old URLs may hang around as Supplemental Results for many months (even years), in my experience. Even using the Google URL removal tool only gets rid of their visible presence; six months later, they can still come back. Even if the URLs return 404!