Forum Moderators: phranque

How to de-index a site?

         

leonwool

1:00 pm on Jan 20, 2026 (gmt 0)



Google SEO - Does anyone know a way to get de-listed from Google? Our site is not too highly rated, but I'd prefer it were completely inconspicuous. We want to operate outside of big tech, more web-ring style. We add #tagged links to anyone's site and try to provide a real search, not a glob of "if you pay more, you go up".

[edited by: not2easy at 2:04 pm (utc) on Jan 20, 2026]
[edit reason] Moved from another thread [/edit]

not2easy

2:14 pm on Jan 20, 2026 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Hello leonwool and welcome to WebmasterWorld [webmasterworld.com]

I moved your thread to this forum because it was off-topic where it had been posted.

To de-index your site, simply add a noindex meta tag to each page. If you have a GSC account and have submitted a sitemap, you can remove it and simply use robots.txt to tell Google not to crawl. You can disallow some other bots there as well, but rest assured your site will still be crawled by all kinds of bots, as many ignore robots.txt.
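For anyone unsure what the noindex meta tag looks like, here is a minimal sketch (the tag name and content value are the standard robots meta tag Google documents; everything else is just boilerplate HTML):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Place this inside <head> on every page you want removed from Google's index -->
  <meta name="robots" content="noindex">
  <title>Example page</title>
</head>
<body>
  <p>Page content…</p>
</body>
</html>
```

Google has to be able to crawl the page to see this tag, so don't block crawling until after the pages have actually dropped out of the index.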

For your benefit, that welcome link above offers tips on using the forums' features and settings. To avoid future edits, it is helpful to take a few minutes to read through our ToS and Posting Guidelines. The staff here are unpaid volunteers and we prefer to avoid editing posts.

We add #tagged links to anyone's site and try to provide a real search, not a glob of "if you pay more, you go up"
Please keep in mind that we are a community of webmasters and marketers. We discuss marketing but we are not here to actually do business with one another.

lucy24

6:17 pm on Jan 20, 2026 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



To de-index your site, simply add a noindex meta tag to each page. If you have a GSC account and have submitted a sitemap, you can remove it and simply use robots.txt to tell Google not to crawl.
Pretty sure G### recognizes the
Header set X-Robots-Tag "noindex"
directive, which can be set once and for all in the server config or .htaccess. (I don't know how to say it in IIS, but someone can help out if needed.)
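For the Apache side, a minimal .htaccess sketch might look like this (the directive itself is standard mod_headers syntax; the IfModule guard is just a precaution in case mod_headers isn't loaded):

```apache
# Requires mod_headers. Sends "X-Robots-Tag: noindex" on every
# response site-wide, covering non-HTML files (PDFs, images) that
# a meta tag can't reach.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex"
</IfModule>
```

One advantage of the header over the meta tag: it applies to non-HTML resources too, which a meta tag cannot.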

The best approach is probably to start with the "noindex" and wait a few months before switching to a comprehensive robots.txt Disallow. If they can't crawl the site, they can't see the noindex. (This is an awfully common blunder.) If you proceed directly to robots.txt, there will be search results saying something along the lines of "the site's robots.txt prevents us from indexing this page". This may or may not matter to you.
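Once the pages have dropped out of the index, the blanket robots.txt for that second step is just:

```text
# robots.txt at the site root.
# Blocks all compliant crawlers from the entire site.
# Only deploy this AFTER the noindex has taken effect,
# or crawlers will never see the noindex at all.
User-agent: *
Disallow: /
```

Remember this only stops well-behaved bots from crawling; it does not, by itself, remove already-indexed URLs.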

blend27

1:04 pm on Apr 9, 2026 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>>> (I don't know how to say it in IIS, but someone can help out if needed.)
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <add name="X-Robots-Tag" value="noindex,noarchive" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>