Forum Moderators: goodroi

Message Too Old, No Replies

Removing pages from Google's cache

Is it best to allow a "robots" "noarchive" crawl first?


Phil_Payne

2:05 am on Jan 10, 2006 (gmt 0)

10+ Year Member



I read the following in an old thread:

"You may want to try to use Google's URL removal facility to remove those pages that they have already indexed - the robots.txt standard does not require search engines to remove already crawled pages from their index if such pages become disallowed at a later date."

So is it best to let Googlebot crawl the pages once with a robots meta tag set to "noarchive" in the head, so as to flush Google's cache, and only then set up the robots.txt disallow?
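For anyone following along, the two-step sequence would look something like this. The path `/old-section/` is just a placeholder for whatever you want removed:

```html
<!-- Step 1: while the pages are still crawlable, serve this in each page's <head> -->
<meta name="robots" content="noarchive">
```

```
# Step 2: after Googlebot has re-crawled the pages, block them in robots.txt
User-agent: Googlebot
Disallow: /old-section/
```
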

I realise I could delete using the form, but there are quite a few pages and I'd like to automate the process.
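If the pages are static HTML, one way to automate the first step is a small script that injects the meta tag into each file's head section. This is only a sketch, assuming straightforward HTML with a literal `<head>` tag; the `add_noarchive` function and the file-walking loop are illustrative, not any official tool:

```python
import re

NOARCHIVE_TAG = '<meta name="robots" content="noarchive">'

def add_noarchive(html: str) -> str:
    """Insert a noarchive robots meta tag just after <head>, if not already present."""
    if 'noarchive' in html.lower():
        return html  # already tagged, leave the page alone
    # Place the tag immediately after the opening <head> tag (first match only)
    return re.sub(r'(<head[^>]*>)', r'\1\n' + NOARCHIVE_TAG,
                  html, count=1, flags=re.IGNORECASE)

page = "<html><head><title>Old page</title></head><body>...</body></html>"
print(add_noarchive(page))
```

Run it over the affected files, wait for Googlebot to re-crawl them, and only then add the Disallow lines to robots.txt.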