


Best/fastest way to remove a website from Google's index

     
11:18 am on Nov 27, 2009 (gmt 0)

Full Member

10+ Year Member

joined:June 17, 2003
posts:208
votes: 0


What would be the best & fastest way to remove a website (+/- 50 pages) from Google's index? We took the site down but left a blank index page with nofollow, nocache & noarchive tags on it, but it seems to be taking a while for Google to completely remove all the pages. Would it be better to use a noindex in the robots.txt file, or to leave all the pages up (blank, of course) with a noindex tag on them?
1:18 pm on Nov 27, 2009 (gmt 0)

Senior Member

joined:Jan 27, 2003
posts:2534
votes: 0


The quickest method depends on how frequently Google revisits these pages. If you have a fast crawl/indexing cycle, then I would return 410 Gone for all the pages.

If it's going to be a long time before the pages are re-spidered, then I would exclude them in robots.txt and remove the pages using the URL removal tool in Webmaster Tools.

Finally, a blank page is not a good idea IMO - it may be interpreted as a temporary error and cause Google to keep the cached content for even longer.
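
On a real host a single rule in the web server configuration would usually handle this, but as a rough sketch of the idea (not something from this thread), a throwaway Python process that answers every request with 410 Gone could look like the following; the port and message are placeholders.

    # Minimal sketch: answer every request with "410 Gone" so crawlers
    # treat the pages as permanently removed. Standard library only;
    # the port is arbitrary, adapt to your actual server setup.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class GoneHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = b"This page has been permanently removed."
            self.send_response(410)                        # 410 Gone
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def do_HEAD(self):                                 # crawlers often send HEAD too
            self.send_response(410)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), GoneHandler).serve_forever()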

6:27 pm on Nov 27, 2009 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member

joined:Nov 11, 2007
posts:769
votes: 1


You need a <meta name="robots" content="noindex"> in the <head> of each page if you want to prevent Google from showing it in the SERPs.

Using nofollow is practically worthless as a means of preventing them from indexing the page. All it takes is for one other site to link to the page with a followed link and BAM... it's back in the index.

Using a robots.txt disallow won't prevent Google from showing your page in the SERPs either if enough other sites link to it. While Google won't be able to crawl the page, they can still show a link to the page. Typically, it will appear with only a <title> which they infer from the various link texts pointing to the page and the URL. There won't be a snippet.
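
To make that distinction concrete, here is a minimal sketch (Python standard library only; the domain and page below are placeholders, not anything from this thread) that checks two separate things for a URL: whether robots.txt disallows crawling it, and whether the page itself serves a robots noindex meta tag. If robots.txt blocks the URL, Googlebot never fetches the page, so it never sees the noindex tag.

    # Minimal sketch: distinguish "blocked by robots.txt" from
    # "served with a noindex meta tag". Placeholder URLs only.
    from html.parser import HTMLParser
    from urllib import robotparser
    from urllib.request import urlopen

    SITE = "http://www.example.com"            # placeholder domain
    PAGE = SITE + "/old-page.html"             # placeholder page

    # 1. Does robots.txt disallow crawling of this URL?
    rp = robotparser.RobotFileParser(SITE + "/robots.txt")
    rp.read()
    blocked = not rp.can_fetch("Googlebot", PAGE)

    # 2. Does the page carry <meta name="robots" content="noindex">?
    class NoindexFinder(HTMLParser):
        noindex = False
        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            name = (a.get("name") or "").lower()
            content = (a.get("content") or "").lower()
            if tag == "meta" and name == "robots" and "noindex" in content:
                self.noindex = True

    finder = NoindexFinder()
    finder.feed(urlopen(PAGE).read().decode("utf-8", "replace"))

    print("robots.txt blocks crawling:", blocked)
    print("page serves meta noindex:  ", finder.noindex)
    if blocked and not finder.noindex:
        print("Note: a crawler that honours robots.txt never fetches this "
              "page, so a noindex tag on it cannot take effect.")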

Once you have the <meta name="robots" content="noindex"> in place for each of the URLs, you can sign into Google's Webmaster Tools and request a URL removal. I think it generally takes a couple of business days. If it is an absolute emergency, you can request an emergency URL removal. You can request individual pages, folders, or the complete site (not applicable).

Just Google "google url removal request", "google immediate url removal request", "google emergency url removal request", etc.

9:02 pm on Nov 27, 2009 (gmt 0)

Moderator from AU 

WebmasterWorld Administrator anallawalla is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 3, 2003
posts:3728
votes: 9


Using a combination of robots.txt and the WMT URL removal tool, I had Google remove over 2 million URLs within 3-4 hours (I wasn't checking every minute, but it could have been quicker). Quite impressive.
8:05 pm on Nov 28, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Sept 26, 2006
posts: 145
votes: 0


This video from Matt Cutts may help you, too.

How can I remove old content from Google's index?:
[youtube.com...]