Forum Moderators: Robert Charlton & goodroi


410 pages still indexed in Google. How to eliminate them?

         

SEO_Lizard

6:13 am on Oct 18, 2019 (gmt 0)

5+ Year Member



It's been 2-3 months since I implemented the 410 HTTP status code for some of my pages, but they are still in Google's index, when ideally they should have been dropped by now.

Some other webmasters have suggested implementing noindex,noarchive in the X-Robots-Tag. I feel this will not work, because the pages already return 410 and bots will not be able to read a noindex,noarchive tag.

Please share your suggestions on whether this will actually work.

Thanks in advance.
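For anyone debugging a case like this, it helps to separate the two signals a response can send: the status code itself and any X-Robots-Tag header. A small helper like the following (my own illustrative sketch, not anything Google-specific) makes the distinction explicit:

```python
def deindex_signal(status, headers):
    """Classify the de-indexing signal a response sends to crawlers.

    status  -- HTTP status code (int)
    headers -- dict of response headers, keys assumed lowercased
    """
    robots = headers.get("x-robots-tag", "").lower()
    if status in (404, 410):
        # Gone / Not Found: Google drops the URL after recrawling it enough
        # times; 410 is treated as a slightly stronger "gone" hint than 404.
        return "gone"
    if "noindex" in robots:
        # Page is reachable, but the header tells crawlers not to index it.
        return "noindex"
    return "indexable"

# A 410 response is already a removal signal on its own:
print(deindex_signal(410, {}))                                     # gone
print(deindex_signal(200, {"x-robots-tag": "noindex,noarchive"}))  # noindex
print(deindex_signal(200, {}))                                     # indexable
```

Point whatever fetch tool you use at the affected URLs and feed the status and headers through a check like this to confirm which signal is actually being served.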

phranque

7:39 am on Oct 18, 2019 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



does GSC show anything for those URLs?
have you tried the URL Inspection tool?
or have you checked your server logs to verify that googlebot has crawled those URLs and seen the 410?
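To make the log check concrete, here is a small sketch (assuming the common Apache/nginx combined log format; the sample line is invented) that pulls out Googlebot's hits on specific paths along with the status codes it received:

```python
import re

# Combined log format: IP - - [date] "METHOD path HTTP/x" status size "ref" "UA"
LOG_RE = re.compile(
    r'"(?P<method>\w+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*"(?P<ua>[^"]*)"$'
)

def googlebot_hits(lines, paths):
    """Yield (path, status) for Googlebot requests to the given paths."""
    wanted = set(paths)
    for line in lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group("ua") and m.group("path") in wanted:
            yield m.group("path"), int(m.group("status"))

# Invented sample line for illustration:
sample = ('66.249.66.1 - - [18/Oct/2019:06:00:00 +0000] '
          '"GET /gone-page HTTP/1.1" 410 233 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')
print(list(googlebot_hits([sample], ["/gone-page"])))  # [('/gone-page', 410)]
```

If the statuses that come back are all 410, the server side is doing its job and the delay is on Google's end.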

phranque

7:40 am on Oct 18, 2019 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



welcome to WebmasterWorld [webmasterworld.com], SEO_Lizard!

SEO_Lizard

7:50 am on Oct 18, 2019 (gmt 0)

5+ Year Member



URL Inspection says 404 on live testing, and I also checked the server log: Googlebot is hitting these 410 URLs. I even double-checked the internal linking; they are not linked from anywhere on the site.

What's your suggestion? Would it be beneficial to implement an X-Robots-Tag noindex,noarchive header in the HTTP response?
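For what it's worth, X-Robots-Tag is an HTTP response header, so it can be sent alongside a 410 status in the same response; a crawler does not need a page body to see it. A minimal WSGI sketch (the retired paths are hypothetical):

```python
# Minimal WSGI app: answer 410 Gone for retired paths and attach an
# X-Robots-Tag header to the same response. Paths here are hypothetical.
GONE_PATHS = {"/old-category/", "/discontinued-product"}

def app(environ, start_response):
    if environ.get("PATH_INFO") in GONE_PATHS:
        start_response("410 Gone", [
            ("Content-Type", "text/plain"),
            # The header rides on the HTTP response itself, so it is
            # visible to crawlers regardless of the status code.
            ("X-Robots-Tag", "noindex, noarchive"),
        ])
        return [b"Gone"]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]
```

You can serve it locally with `wsgiref.simple_server.make_server('', 8000, app).serve_forever()` and inspect the headers with curl.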

Nitt_Hyman

8:20 am on Oct 18, 2019 (gmt 0)

5+ Year Member



All the pages have already been crawled and indexed. If you put all the URLs in robots.txt, then after some time (once crawlers can no longer crawl them) the pages will be gone one day. The only problem is you will never know when that "one day" will be. That's how Google works.
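Before relying on robots.txt for this, it is easy to verify locally which URLs a given set of rules would actually block, using Python's standard-library parser (the Disallow rule below is made up for the example):

```python
from urllib.robotparser import RobotFileParser

# Evaluate robots.txt rules locally; these rules are illustrative only.
rules = [
    "User-agent: *",
    "Disallow: /old-structure/",
]
rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "http://www.example.com/old-structure/page"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/new/page"))            # True
```

Keep in mind that a robots.txt disallow stops crawling, so Googlebot would no longer see the 410 (or any noindex header) on the blocked URLs.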

RedBar

10:38 am on Oct 18, 2019 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Welcome to WebmasterWorld Nitt_Hyman

pages will be gone one day. But the only problem is you will never know when that "one day" will be.

Correct. I have sites that have been closed for 3 years; their domain names are freely available for anyone to register, yet Google still ranks some pages and images from those sites. I actually re-bought one of those domain names and 301'd it to my main site, yet Google still ranks the 3-year-old pages and images!

I gave up worrying about this sort of thing years ago. If G cannot understand that a site no longer exists, what chance is there of them getting anything else correct?

lucy24

5:58 pm on Oct 18, 2019 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



URL inspection says 404 on live testing and also checked server-log. Google-bot is hitting these 410 url.

Well, which is it? 404 or 410?

Nitt_Hyman

4:34 am on Oct 21, 2019 (gmt 0)

5+ Year Member



I have been working on my own project and facing the same kind of issue. It's an e-commerce website whose structure has been changed 2-3 times, so lots of URLs were indexed before the URL structure changed. Many URLs under www.example.com's old structure are still present in Google in bulk.

I'm also applying the same method to get rid of all the old URLs.



[edited by: not2easy at 4:45 am (utc) on Oct 21, 2019]
[edit reason] anonymized domain/ToS [/edit]
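For an old URL structure like that, one common pattern is to 301-redirect each old path to its new equivalent and answer 410 for anything with no replacement. A minimal sketch; the patterns and paths are made up for illustration:

```python
import re

# Hypothetical mapping from the old URL structure to the new one.
REWRITES = [
    # e.g. old /shop/123/widget -> new /products/widget
    (re.compile(r"^/shop/(\d+)/(.+)$"), r"/products/\2"),
]

def route_old_url(path):
    """Return (status, location) for a path from the old structure."""
    for pattern, template in REWRITES:
        if pattern.match(path):
            # A new home exists: redirect permanently.
            return 301, pattern.sub(template, path)
    # No new home for this URL: tell crawlers it is gone for good.
    return 410, None

print(route_old_url("/shop/123/widget"))  # (301, '/products/widget')
print(route_old_url("/shop-old/misc"))    # (410, None)
```

The same mapping is usually easier to maintain as rewrite rules in the web server config, but the logic is the same either way.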