Home / Forums Index / Search Engines / Sitemaps, Meta Data, and robots.txt
Forum Library, Charter, Moderators: goodroi

Sitemaps, Meta Data, and robots.txt Forum

    
Removing Pages - Does Robots.txt Help?
akreider (msg:3190074) - 11:16 pm on Dec 15, 2006 (gmt 0)

If I'm trying to get Google to realize that a large number of pages no longer exist, would adding Disallow rules for them to robots.txt speed this up or slow it down?

I have two goals: 1. to stop Googlebot from visiting the pages, and 2. to remove them from the index.

 

phranque (msg:3190199) - 2:39 am on Dec 16, 2006 (gmt 0)

1 - use robots.txt to prevent Googlebot from crawling those pages.
2 - use an HTTP 410 or 301 response code, as appropriate, to permanently signal that the currently indexed pages are gone - they're probably returning 404 now, which is of indeterminate permanence.
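On an Apache server, step 2 might look like the following .htaccess sketch (the paths and target URL here are hypothetical examples, not from the thread):

```apache
# Page is permanently gone - serve 410 so crawlers drop it from the index:
Redirect 410 /old-section/defunct-page.html

# Page has a replacement - redirect permanently so the old URL is replaced:
Redirect 301 /old-section/moved-page.html https://www.example.com/new-page.html
```

Both directives come from Apache's mod_alias; `Redirect 410` takes no target URL because there is nowhere to send the visitor.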

jdMorgan (msg:3190243) - 4:17 am on Dec 16, 2006 (gmt 0)

Regarding both of the previous posts: do step 2 first, then wait a few months and do step 1.

Point being, if you Disallow the pages in robots.txt, then the search engines will not request those pages and so will never see the 301-Moved Permanently or 410-Gone responses.
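This ordering matters because a compliant crawler checks robots.txt before every fetch. A quick sketch with Python's standard-library robots.txt parser (the `/old-section/` path is a hypothetical example) shows why a Disallow rule hides the 410/301 response entirely:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that disallows the removed section.
rules = """User-agent: *
Disallow: /old-section/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Once the path is disallowed, a compliant crawler never requests the
# URL at all - so it never sees the 410 or 301 you configured there.
print(rp.can_fetch("Googlebot", "https://example.com/old-section/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/kept-page.html"))         # True
```

So serve the 410/301 responses first, let the search engines re-crawl and drop the URLs, and only then add the Disallow rules to save crawl budget.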

Jim


© Webmaster World 1996-2014 all rights reserved