Getting a site removed from Google index

robots.txt and noindex don't work!


percentages

3:35 am on May 13, 2003 (gmt 0)




I am trying to get a domain name removed from the Google index. In early January I set robots.txt to ban Googlebot and set the meta tags for NOINDEX,NOFOLLOW. I validated the robots.txt file to make sure it was correct.

On www Google removed the snippet and cache copy, but the site and all pages are still in the index.

allinurl: still shows all pages, even ones which were physically deleted 12 months ago?

A search for the company name still returns the domain name in the search results, probably due to backlinks, maybe because the domain name is the company name.

on www-fi Google is now showing cached copies for some pages again? I guess this is just very old data they dug up from somewhere?

Is there any way to get Google to delete all the information it holds about a domain name without me attempting to get all the backlinks removed? That would be a next-to-impossible task.

jdMorgan

3:45 am on May 13, 2003 (gmt 0)




percentages,

Yes, use the meta robots noindex tag, and do not disallow Googlebot from fetching the pages which carry that tag. Googlebot has to be able to crawl a page in order to see the noindex on it; if robots.txt blocks the fetch, the URL just sits in the index untouched.
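In other words, robots.txt should let Googlebot through, and each page you want dropped should carry the noindex tag itself. A minimal sketch of the two pieces (the empty Disallow line means "nothing is disallowed"):

```
# robots.txt -- do NOT block Googlebot here,
# or it will never see the noindex tags on the pages
User-agent: Googlebot
Disallow:
```

```html
<!-- in the <head> of each page to be removed -->
<meta name="robots" content="noindex,nofollow">
```

Once Googlebot has recrawled a page and seen the tag, the URL should drop out on the next index update.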

More details [webmasterworld.com] (see post #4).

HTH,
Jim

percentages

3:59 am on May 13, 2003 (gmt 0)




Thanks jdMorgan,

That is one interesting set of logic they are using. I always thought Google would fetch the robots.txt file first, go no further if everything was disallowed, and then drop all the currently indexed pages.

I guess this means I have to put back the pages I deleted long ago, but which are still indexed, just to get them removed? What fun!
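The restored pages don't need any real content, though. A hypothetical bare-bones placeholder like this at each old URL should be enough for Googlebot to see the noindex and drop the URL after a recrawl:

```html
<html>
<head>
<title>Removed</title>
<!-- tells Google to drop this URL from the index -->
<meta name="robots" content="noindex,nofollow">
</head>
<body></body>
</html>
```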