Removing Pages - Does Robots.txt Help?

akreider

11:16 pm on Dec 15, 2006 (gmt 0)

If I'm trying to get Google to realize that a large number of pages no longer exist, would adding Disallow rules for them to robots.txt speed that up or slow it down?

I have two goals: 1. to stop Googlebot from visiting the pages, and 2. to remove them from the index.

phranque

2:39 am on Dec 16, 2006 (gmt 0)

1 - use robots.txt to prevent googlebot from going "there".
2 - use an HTTP 410 or 301 response code, as appropriate, to signal permanently that the currently indexed pages are gone or moved - they're probably returning a 404 now, which has indeterminate permanence (sketch below).
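
Something like this should work on Apache via .htaccess (a rough sketch, assuming mod_alias is available; the paths are made up, so substitute your own):

    # the whole removed section answers 410-Gone
    RedirectMatch gone ^/old-section/

    # a page with a direct replacement answers 301-Moved Permanently
    Redirect 301 /old-page.html http://www.example.com/new-page.html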

jdMorgan

4:17 am on Dec 16, 2006 (gmt 0)

In regard to both of the previous posts, do step 2 first, then wait a few months and do step 1.

Point being, if you Disallow the pages in robots.txt, then the search engines will not request those pages and so will never see the 301-Moved Permanently or 410-Gone responses.
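
Once they have seen those responses, the robots.txt addition for step 1 is just something like this (the directory is hypothetical; use your own path):

    User-agent: Googlebot
    Disallow: /old-section/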

Jim