
Forum Moderators: goodroi


Disallowing content previously indexed

What happens to indexed content subsequently blocked by robots.txt

8:28 am on May 9, 2007 (gmt 0)

5+ Year Member

If you have certain pages which have been indexed, and subsequently you decide to block those pages via robots.txt, will search engines eventually drop those pages from the index? (Specifically Google)

I know I can manually remove URLs with the Google URL removal tool, as long as they are blocked by robots.txt or return a 404.
Google's documentation says pages returning 404 will eventually be dropped anyway, but I couldn't find what happens over time to pages that are merely blocked in robots.txt if you don't manually remove the URLs.
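For reference, blocking a section of a site in robots.txt looks like this (the path is illustrative):

```
User-agent: *
Disallow: /old-content/
```

Note that Disallow prevents crawling, not indexing as such: Google can keep a blocked URL in the index (typically shown without a snippet) if other pages link to it, since it can no longer fetch the page to see any removal signal.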

10:08 am on May 9, 2007 (gmt 0)

WebmasterWorld Senior Member quadrille is a WebmasterWorld Top Contributor of All Time 10+ Year Member

I don't think it's reliable or rapid.

You may get better results with on-page instructions.
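The on-page instruction in question would be a robots meta noindex tag in the page's head. For this to work the page must stay crawlable (i.e. not blocked in robots.txt), or the crawler never sees the tag:

```html
<meta name="robots" content="noindex">
```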

May also be worth removing that page entirely, and placing the content at a new, easily blockable URL.

11:22 am on May 9, 2007 (gmt 0)

5+ Year Member

Ah, good idea.
I've changed the unwanted indexed items to return 404, and I've recreated them in a way that will be blocked by robots.txt.
Hopefully that'll sort it out eventually.
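As a quick sanity check on that setup, Python's standard urllib.robotparser can confirm which URLs a given robots.txt actually blocks (the domain and paths below are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Parse robots.txt rules directly from a string (no network fetch needed)
rules = """
User-agent: *
Disallow: /old-content/
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Blocked path: crawlers that honor robots.txt will not fetch it
print(rp.can_fetch("*", "http://example.com/old-content/page.html"))  # False

# Unblocked path
print(rp.can_fetch("*", "http://example.com/about.html"))  # True
```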
