Forum Moderators: Robert Charlton & goodroi
In my experience, using Disallow via robots.txt will eventually remove your page as well, but it takes a long, long time. First the cache and snippet are removed, but the page is still listed in the index without a description, and it can still come up in the SERPs....
Therefore... use meta tags...
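For anyone unfamiliar, the meta tag in question goes in the <head> of each page you want dropped (whether to add nofollow as well is up to you):

```html
<head>
  <!-- tells compliant crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```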
The URL removal tool will also work, but I have heard of problems with it.
To remove the pages from their index, you need to submit the URL of the robots.txt file to the Google URL console. The pages will then be removed within days, and will stay out of the index for 6 months. After that, they will continue to stay out only if they are still blocked in the robots.txt file.
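For what it's worth, the robots.txt entries you'd submit would look something like this (the paths here are just placeholders for whatever you want removed):

```
User-agent: *
Disallow: /old-section/
Disallow: /retired-page.html
```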
Alternatively, the robots meta tag will see pages dropped from the index within a week or so.
A "410 Gone" response should cause it to be removed, though.
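If the site runs on Apache and you can edit .htaccess, one way to send a 410 is mod_alias's Redirect directive (the path is just an example):

```
# Return "410 Gone" for the retired page
Redirect gone /retired-page.html
```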
Thanks for all the input, guys - personally I would use the robots.txt submission method, but I was just wondering...
Alternatively, the robots meta tag will see pages dropped from the index within a matter of a week or so.
Or not...
I have a website I took down, leaving the pages up there with a META robots noindex.
The pages went supplemental within a week, and have stayed supplemental for 6 months now...
Since it's on a free ISP, I have no access to robots.txt or 404s on that particular site, so I've modified the pages to point to my new site, and I'm still waiting for them to go!
DerekH