What is the best way to close a subdomain?


Peter_S

11:03 am on Apr 22, 2017 (gmt 0)

5+ Year Member Top Contributors Of The Month



Hi,

I would like to permanently close one subdomain of my site, and I am not sure of the best way to do it.

1- to 404 all requests, and add a Disallow rule in robots.txt. The 404 page will be customized to inform "human" visitors that the site (subdomain) no longer exists, with a link to the main site's home page and other sections.

2- to redirect (301) all requests to the front page of the subdomain, with a message that the site no longer exists and links to the main domain and its other sections.

3- to redirect (301) all requests to the front page of the main domain.

4- something else.

Thank you,

not2easy

1:19 pm on Apr 22, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



4- something else: 410 all requests.

Definitely don't 301 to an unrelated page. That's a soft 404.
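On Apache that's typically a one-liner (`Redirect gone /`, or a `RewriteRule` with the `[G]` flag). As a server-neutral sketch, this is all "410 everything" means at the handler level -- the notice text and main-site URL are placeholders, not your actual content:

```python
from http import HTTPStatus

# Placeholder notice for human visitors; the link points at a hypothetical
# main domain, since only the subdomain is being closed.
GONE_NOTICE = (
    "<h1>This site has closed</h1>"
    '<p>Please visit <a href="https://www.example.com/">our main site</a>.</p>'
)

def handle_request(path: str) -> tuple[int, str]:
    """Answer every request on the closed subdomain with 410 Gone."""
    # 410, not 404: the resource existed and was removed on purpose,
    # which lets crawlers drop the URL faster than a plain 404 would.
    return HTTPStatus.GONE, GONE_NOTICE
```

Whatever the stack, the point is the same: one status code (410) for every path, plus a friendly body for the humans.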

lucy24

5:08 pm on Apr 22, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I agree with not2easy: that's exactly what the 410 response is for. “This page used to exist, but I have intentionally removed it.” (And, as a bonus, it will make the Googlebot stop requesting the pages much faster than a 404 will.)

I'm especially puzzled over
to redirect (301) all requests to the front page of the subdomain
since you say you're shutting down the subdomain. If so, any response involving the same subdomain--the one you're shutting down--would seem counterproductive to say the least.

This question was posted in the HTML subforum. It might get more attention if it were moved to something like Webmaster General.

Peter_S

10:33 am on Apr 24, 2017 (gmt 0)

5+ Year Member Top Contributors Of The Month



Thank you for your answers.

phranque

11:46 pm on Apr 24, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



and add disallow robots in the robots.txt

don't do this if you are expecting crawlers to see the 410 (or 404) response.
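to see why, here's a sketch with python's stdlib robots.txt parser (the hostname and rules are made up) - with "Disallow: /" in place, a compliant bot never requests the url at all, so it never gets to see the 410:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the whole subdomain.
blocked = RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])

# Hypothetical robots.txt that leaves crawling open (empty Disallow = allow).
open_rules = RobotFileParser()
open_rules.parse(["User-agent: *", "Disallow:"])

url = "https://old.example.com/some-page"
# Disallowed: the bot may not fetch the page, so the 410 goes unseen.
print(blocked.can_fetch("Googlebot", url))      # False
# Allowed: the bot fetches the page and receives the 410.
print(open_rules.can_fetch("Googlebot", url))   # True
```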

Peter_S

8:40 am on Apr 25, 2017 (gmt 0)

5+ Year Member Top Contributors Of The Month



@phranque, yes, it sounds obvious now that you mentioned it :)

phranque

1:18 am on Apr 26, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



when you consider the search engine functions of crawling vs indexing it makes sense - robots.txt controls whether a url gets crawled, while the 410 (or a noindex) can only be seen, and acted on, if the url is actually crawled.

Peter_S

8:18 pm on Apr 27, 2017 (gmt 0)

5+ Year Member Top Contributors Of The Month



I have another question.

On my main domain, I would like to delete about 20% of the pages. I thought of returning 404/410 for these pages. However, is there a risk that it penalizes my "user friendliness" from Google's point of view? I have read several times that having too many 404 pages can be seen by Google as a negative, because it means people can reach pages that no longer exist. So is there a smooth way to delete these pages?

There are no longer any internal links to these pages, and they are no longer in my sitemap file.

But Googlebot keeps crawling them. So I was thinking of continuing to serve these pages (200), but with a message saying the page no longer exists (in case humans still visit from time to time), and with noindex, nofollow. Will this make Google progressively remove them from its index? And then after some months I would return 404. I don't know if my explanation makes sense :)
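To be concrete, the interim page I have in mind would be served with a 200 status and carry something like this in its head (the same signal can also be sent as an `X-Robots-Tag: noindex, nofollow` HTTP response header):

```html
<!-- Interim "this page is gone" notice, still served with a 200 status -->
<meta name="robots" content="noindex, nofollow">
```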

lucy24

3:47 am on Apr 28, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Return a 410 right now and keep it in place forever. Yes, make a nice 410 page intended just for humans.

After a while the Googlebot will stop requesting the pages frequently--but the requests never entirely stop, so you have to keep the 410 in place. At least until requests drop back to one or two a year.

phranque

4:13 am on Apr 28, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



However, is there a risk that it penalizes my "user friendliness" from a Google point of view? I read several times that having too many 404 pages can be seen by Google as negative, because it means people can reach pages which are no longer existing.

every site has (or should have) a potentially infinite number of urls that would return a 404.

the biggest worry i would have about googlebot getting 404 responses is if googlebot discovered those urls internally to your site or from a link on a relevant/authoritative site.

Peter_S

8:45 am on Apr 28, 2017 (gmt 0)

5+ Year Member Top Contributors Of The Month



Thank you lucy24 and phranque.

the biggest worry i would have about googlebot getting 404 responses is if googlebot discovered those urls internally to your site or from a link on a relevant/authoritative site.

Yes, that was my concern. The reason I mentioned serving a 200 response + "noindex" for a while is that I removed these pages from my internal links last week, but I can't tell if Google has already recorded this change. So Google might still have pages of my site with links to these soon-to-be-deleted pages, and might consider those broken links.

lucy24

6:19 pm on Apr 28, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The main thing you have to do now is steel yourself to ignore all those non-error reports in GSC (Google Search Console). “Yes, I know it’s missing, I removed it myself, and you should know it too since you saw the 410 response.”

Sure, if you’ve got links from people you’re in direct contact with you can ask them to remove the link. But it’s not a big issue; broken links are on them, not on you. And of course you’ll fine-tooth-comb your own site to make sure you have no remaining links to deleted content. (A text editor with multi-file search is your very, very good friend in this regard.)
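If you're comfortable on the command line, a recursive grep does the same job as a multi-file editor search -- the slug and directory here are placeholders for your own:

```shell
# List every HTML file under the site root that still links to a removed page.
# "old-page" and "./site" are placeholders; use your own slug and doc root.
mkdir -p ./site    # placeholder doc root so the sketch runs anywhere
grep -rln "old-page" ./site || echo "no remaining links"
```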

Peter_S

11:32 am on Apr 29, 2017 (gmt 0)

5+ Year Member Top Contributors Of The Month



Thank you all for your useful answers.