
Home / Forums Index / Search Engines / Sitemaps, Meta Data, and robots.txt
Forum Library, Charter, Moderators: goodroi

Sitemaps, Meta Data, and robots.txt Forum

Removing Pages - Does Robots.txt Help?

 11:16 pm on Dec 15, 2006 (gmt 0)

If I'm trying to get Google to recognize that a large number of pages no longer exist, would adding Disallow rules for them to robots.txt speed this up or slow it down?

I have two goals. 1. To stop Googlebot from visiting the pages. 2. To remove them from the index.



 2:39 am on Dec 16, 2006 (gmt 0)

1 - Use robots.txt to prevent Googlebot from going "there".
2 - Use an HTTP 410 or 301 response code, as appropriate, to permanently signal that the currently indexed pages are gone - they're probably returning a 404 now, which has indeterminate permanence.
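A rough sketch of what those two pieces might look like, assuming an Apache server with mod_alias enabled; the paths (/old-section/, /new-page.html) are hypothetical placeholders, not anything from the thread:

```
# robots.txt - step 1: keep compliant crawlers out of the removed area
User-agent: *
Disallow: /old-section/

# .htaccess (Apache mod_alias) - step 2: permanent responses
# 410 Gone for a removed page with no replacement:
Redirect gone /old-section/discontinued.html
# 301 Moved Permanently for a page that has a new home:
Redirect permanent /old-section/moved.html /new-page.html
```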


 4:17 am on Dec 16, 2006 (gmt 0)

Regarding the two steps in the previous post: do step 2 first, then wait a few months before doing step 1.

Point being, if you Disallow the pages in robots.txt, the search engines will never request those pages and so will never see the 301 Moved Permanently or 410 Gone responses.
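That interaction can be demonstrated with Python's standard-library robots.txt parser, which models how a compliant crawler decides whether to request a URL at all; the domain and paths here are made up for illustration:

```python
from urllib import robotparser

# Hypothetical robots.txt rules: /old-section/ is the area being removed.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /old-section/",
])

# A compliant crawler checks these rules before fetching a URL, so a
# disallowed page is never requested - and its 410/301 response is
# never observed by the search engine.
print(rp.can_fetch("Googlebot", "http://example.com/old-section/page.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/kept-page.html"))         # True
```

This is why the ordering matters: serve the 410/301 first so it can be seen, and only add the Disallow once the pages have dropped out of the index.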

