Removing Pages - Does Robots.txt Help?

11:16 pm on Dec 15, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 1, 2004
votes: 0

If I'm trying to get Google to recognize that a large number of pages no longer exist, would adding Disallow rules for them to robots.txt speed that up or slow it down?

I have two goals: 1. To stop Googlebot from visiting the pages. 2. To remove them from the index.
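For reference, this is the kind of robots.txt entry I mean, with a placeholder path standing in for the real ones:

User-agent: *
Disallow: /removed-section/    # placeholder path covering the removed pages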

2:39 am on Dec 16, 2006 (gmt 0)


WebmasterWorld Administrator phranque, WebmasterWorld Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month

joined:Aug 10, 2004
votes: 8

1 - Use robots.txt to prevent googlebot from going "there".
2 - Use an HTTP 410 or 301 response code, as appropriate, to tell crawlers permanently what has happened to the currently indexed pages. They are probably getting a 404 now, which has indeterminate permanence.

4:17 am on Dec 16, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member jdmorgan, WebmasterWorld Top Contributor of All Time, 10+ Year Member

joined:Mar 31, 2002
votes: 0

In regard to both of the previous posts, do step 2 first, then wait a few months and do step 1.

Point being, if you Disallow the pages in robots.txt, the search engines will not request those pages and so will never see the 301 (Moved Permanently) or 410 (Gone) responses.
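Before starting that waiting period, it's worth confirming the pages really do return the intended codes, e.g. with a HEAD request (the URL is a placeholder):

curl -I http://www.example.com/old-section/defunct-page.html

The status line of the response should show 410 Gone or 301 Moved Permanently, not 404 Not Found.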


