Depends on how often your site is deep crawled. Typically not long for a site with lots of links.
You can speed it up by going into Google Webmaster Tools and using the URL removal tool to request that they be removed. Then the <meta name="robots" content="noindex"> element should prevent them from being re-indexed.
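For anyone unsure where that element goes, it belongs in the page's <head>. A minimal sketch (the page title and body here are just placeholders):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- keeps this page out of Google's index once it is recrawled -->
  <meta name="robots" content="noindex">
  <title>Old page</title>
</head>
<body>...</body>
</html>
```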
[edited by: Robert_Charlton at 5:32 pm (utc) on Sep 1, 2013]
[edit reason] fixed typo at poster's request [/edit]
I would recommend using Google Webmaster Tools as well.
I just went through ranking hell by not using it first. Google Webmaster Tools definitely speeds up the noindex process big time.
|You can speed it up by going into Google Webmaster Tools and using the URL removal tool to request that they be removed. |
I thought those pages would have to either send a 404 / 410 header or be blocked by robots.txt to have the URL removal tool work, no?
It has been 4 days since I marked the pages NOINDEX. Google does crawl the site daily, but the pages are still in Google's index.
I will update you all
|I thought those pages would have to either send a 404 / 410 header or be blocked by robots.txt to have the URL removal tool work, no? |
No, not as I understand it.
The removal tool only removes a url for 90 days. If the page no longer exists after you remove it with the removal tool, you need to make sure that a 404/410 is sent before the 90 days is up to keep Google from re-including the url. I don't believe a 404 or 410 is necessary for the tool to work in the first place, though.
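To make that concrete, here's a minimal sketch of serving 410 Gone for removed URLs, using only Python's standard library. The paths listed are hypothetical examples, not from any real site:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical set of paths that have been removed for good
GONE_PATHS = {"/old-page.html", "/retired-section/index.html"}

class GoneHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in GONE_PATHS:
            # 410 tells crawlers the page is gone permanently,
            # a stronger signal than 404 (not found)
            self.send_response(410)
            self.end_headers()
            self.wfile.write(b"410 Gone")
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"OK")

def run(port=8000):
    # Blocks until interrupted; call run() to serve locally
    HTTPServer(("localhost", port), GoneHandler).serve_forever()
```

In practice you'd usually do this at the web server level (e.g. an Apache or nginx rule) rather than in application code, but the status code is what matters to Google.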
And, if the page will still exist after the 90 days, you can use robots.txt to keep Google from spidering the page. robots.txt, though, will not necessarily keep all references to the page out of the index.
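The robots.txt approach looks something like this (the directory name is just an illustration):

```
User-agent: *
Disallow: /removed-directory/
```

As noted above, this only stops spidering - URLs in that directory can still show up in the index as bare, description-less references if other pages link to them.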
You can use meta robots noindex (without robots.txt) to keep references to an existing page out of the visible index once the 90 days is up. Again, noindex should not be necessary for the tool to work.
And here's an excellent discussion on the distinction between indexing and crawling... not necessarily relevant to a discussion about the removal tool, but since I've mentioned robots.txt and noindex in the same discussion, someone will surely leap in to clarify, and we'll be doomed to repeat this discussion again ;) ...
Pages are indexed even after blocking in robots.txt
|It has been 4 days I marked it as NOINDEX, google does crawl daily however pages are still there in google index. |
That is too short a time to check. You need to give Google some time to update its index.
Is it a good idea to use the URL removal tool if you want the page to be noindex follow rather than noindex nofollow?
Also, there can be a delay of weeks between the crawl and the changes being reflected in the SERPS. I have added noindex tags to some pages and changed the text on others and it can take weeks for them to show, on a site that is fully crawled every two days or so.
It took me about 6 weeks for a directory of 1000s of pages on a pretty popular site, last I checked.
Single pages take about 2 weeks.
Using a NoIndex meta tag, and the webmaster tools URL removal tool gets rid of them in less than 24 hours for me.
But the URL removal tool seems to be limited to a few hundred URLs per day.
You do need to take the URLs out one at a time, though. I tried taking whole directories out using robots.txt and the URL removal tool, and the removals expired after three months and the pages came back (albeit with no description, due to robots.txt).