Removing Pages - Does Robots.txt Help?

akreider msg:3190074 11:16 pm on Dec 15, 2006 (gmt 0)

If I'm trying to get Google to realize that a large number of pages do not exist, would adding them with Disallow to robots.txt speed this up or slow it down?
I have two goals:

1. To stop Googlebot visiting the pages.
2. To remove them from the index.
phranque msg:3190199 2:39 am on Dec 16, 2006 (gmt 0)
1 - Use robots.txt to prevent Googlebot from going "there".

2 - Use an HTTP 410 or 301 response code as appropriate to permanently prevent future access to currently indexed pages - they're probably getting a 404 now, which has indeterminate permanence.

jdMorgan msg:3190243 4:17 am on Dec 16, 2006 (gmt 0)
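The 410/301 step above can be sketched with Apache's mod_alias directives. This is a minimal illustration only - it assumes an Apache server, and the paths (/old-section/deleted-page.html, /moved-page.html, /new-page.html) are hypothetical placeholders, not anything from this thread:

```apache
# .htaccess - illustrative sketch; all paths below are hypothetical

# Return 410 Gone for pages that were removed outright
Redirect gone /old-section/deleted-page.html

# Return 301 Moved Permanently for pages that have a new home
Redirect permanent /moved-page.html /new-page.html
```

On servers without mod_alias (or on non-Apache servers), the equivalent would be done with mod_rewrite rules or the server's own redirect configuration.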
In regard to both of the previous posts, do step 2 first, then wait a few months and do step 1.
Point being, if you Disallow the pages in robots.txt, then the search engines will not request those pages, and so will never see the 301 Moved Permanently or 410 Gone responses.
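Once the engines have had a few months to fetch the 410/301 responses, the robots.txt Disallow rules for step 1 might look like this - again with hypothetical paths for illustration:

```
# robots.txt - add these rules only AFTER the 410/301 responses
# have been crawled, or the engines will never see those responses
User-agent: *
Disallow: /old-section/
Disallow: /moved-page.html
```

Note that Disallow only blocks crawling; it does not by itself remove already-indexed URLs, which is why the response-code step has to come first.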