


Disallowing content previously indexed

What happens to indexed content subsequently blocked by robots.txt

     
8:28 am on May 9, 2007 (gmt 0)

New User

5+ Year Member

joined:Apr 27, 2007
posts:11
votes: 0


If you have certain pages that have already been indexed, and you subsequently block those pages via robots.txt, will search engines (specifically Google) eventually drop them from the index?

I know I can manually remove URLs with the Google URL removal tool as long as they are blocked by robots.txt or return a 404.
It's stated that pages returning 404 will be dropped eventually anyway, but I couldn't see what happens over time to pages that are merely blocked in robots.txt if you don't manually remove the URLs.
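For clarity, the sort of rule I mean is just a plain Disallow in robots.txt; the /old-section/ path here is only an example stand-in for the pages I want blocked:

    User-agent: *
    Disallow: /old-section/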

10:08 am on May 9, 2007 (gmt 0)

Senior Member

Senior Member quadrille · WebmasterWorld Top Contributor of All Time · 10+ Year Member

joined:Feb 22, 2002
posts:3455
votes: 0


I don't think relying on robots.txt alone to get indexed pages dropped is reliable or rapid.

You may get better results with on-page instructions.

It may also be worth removing that page entirely and placing the content at a new, easily blockable URL.
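To expand on the on-page route: robots.txt only stops crawling, not indexing, so a blocked URL can linger in the index as a URL-only listing. A robots meta tag tells Google to drop the page, but the page has to remain crawlable (i.e. not blocked in robots.txt) for the tag to be seen. Something along these lines in the page head:

    <meta name="robots" content="noindex">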

11:22 am on May 9, 2007 (gmt 0)

New User

5+ Year Member

joined:Apr 27, 2007
posts:11
votes: 0


Ah good idea.
I've changed the unwanted indexed items to return 404, and I've recreated them in a way that will be blocked by robots.txt.
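In case it helps anyone else, roughly what I've set up (this assumes an Apache server, and the paths are just placeholders for my own directories). In .htaccess, the old URLs now return 404:

    RedirectMatch 404 ^/old-section/

And in robots.txt, the new location is blocked:

    User-agent: *
    Disallow: /blocked-section/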
Hopefully that'll sort it out eventually.
Cheers.
 
