
Forum Moderators: Robert Charlton & goodroi

410 pages still indexed in Google. How to eliminate them?

     
6:13 am on Oct 18, 2019 (gmt 0)

New User from IN 

joined:May 24, 2017
posts: 2
votes: 0


It's been 2-3 months since we implemented the 410 HTTP status code for some of our pages, but they are still in Google's index; ideally they should have dropped out by now.

Some other webmasters have suggested implementing noindex,noarchive in the X-Robots-Tag. I feel this will not work, because the pages already return 410 and bots will not be able to read the noindex,noarchive tag.

Please share your suggestions on whether this will logically work or not.
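For reference, here is roughly what that suggestion would amount to: sending noindex,noarchive as an X-Robots-Tag response header together with the 410 status, so both travel in the same HTTP response. A minimal sketch, assuming a Flask app (the GONE_PATHS set and the route are purely illustrative):

    # minimal sketch, assuming Flask; GONE_PATHS is a hypothetical set of retired URLs
    from flask import Flask, Response

    app = Flask(__name__)

    GONE_PATHS = {"/old-category/widget-1", "/old-category/widget-2"}  # illustrative

    @app.route("/<path:page>")
    def serve(page):
        path = "/" + page
        if path in GONE_PATHS:
            # 410 Gone plus robots directives in the response headers
            return Response(
                "This page has been permanently removed.",
                status=410,
                headers={"X-Robots-Tag": "noindex, noarchive"},
            )
        return Response("Normal page content would be served here.", status=200)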

Thanks in advance.
7:39 am on Oct 18, 2019 (gmt 0)

Administrator

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 10, 2004
posts:11872
votes: 245


does GSC show anything for those urls?
have you tried the URL Inspection?
or have you checked your server logs to verify that googlebot has crawled those urls and seen the 410?
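If it helps, here is a rough way to pull Googlebot's requests for specific URLs, and the status codes they received, out of a combined-format access log. A small sketch; the log path and the watched URLs are placeholders:

    # small sketch: scan a combined-format access log for Googlebot hits on watched URLs
    import re

    WATCHED = {"/old-category/widget-1", "/old-category/widget-2"}  # placeholder paths

    # combined log format: ip - - [time] "METHOD path HTTP/x.x" status size "referer" "user-agent"
    line_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" (\d{3}) .*"([^"]*)"$')

    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            match = line_re.search(line)
            if not match:
                continue
            path, status, agent = match.groups()
            if path in WATCHED and "Googlebot" in agent:
                print(path, status)

That at least confirms whether Googlebot is actually receiving the 410 and how often it comes back for those URLs.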
7:40 am on Oct 18, 2019 (gmt 0)

Administrator

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 10, 2004
posts:11872
votes: 245


welcome to WebmasterWorld [webmasterworld.com], SEO_Lizard!
7:50 am on Oct 18, 2019 (gmt 0)

New User from IN 

joined:May 24, 2017
posts: 2
votes: 0


URL Inspection says 404 on live testing, and I also checked the server log: Googlebot is hitting these 410 URLs. I even double-checked the internal linking; these URLs are not linked from anywhere on the site.

What's your suggestion? Would it be beneficial to implement an X-Robots-Tag with noindex,noarchive in the HTTP response?
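For anyone wanting to double-check, here is a quick way to see the exact status code and headers a URL serves right now, since URL Inspection reports 404 while the server is meant to return 410. A small sketch using Python's standard library; the example URL is a placeholder:

    # small sketch: fetch a URL and print the actual status code and X-Robots-Tag header
    import urllib.error
    import urllib.request

    url = "https://www.example.com/old-category/widget-1"  # placeholder URL

    try:
        resp = urllib.request.urlopen(url)
        status, headers = resp.status, resp.headers
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses raise HTTPError; the error object still carries status and headers
        status, headers = err.code, err.headers

    print("Status:", status)
    print("X-Robots-Tag:", headers.get("X-Robots-Tag", "(not set)"))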
8:20 am on Oct 18, 2019 (gmt 0)

New User from IN 

joined:Oct 18, 2019
posts:2
votes: 0


All the pages have already been crawled and indexed. If you put all the URLs in robots.txt, then after some time (once crawlers can no longer crawl them) the pages will be gone one day. The only problem is you will never know when that "one day" will be. That's how Google works.
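If you do go the robots.txt route, Python's built-in urllib.robotparser is one way to sanity-check whether a given URL is actually disallowed for Googlebot (a small sketch; the domain and path are placeholders):

    # small sketch: check whether a URL is blocked by robots.txt for Googlebot
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # placeholder domain
    rp.read()

    url = "https://www.example.com/old-category/widget-1"  # placeholder URL
    print("Googlebot allowed to crawl?", rp.can_fetch("Googlebot", url))

Keep in mind that once a URL is disallowed there, Googlebot can no longer fetch it, so it will not see the 410 response for that URL either.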
10:38 am on Oct 18, 2019 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member redbar is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:3371
votes: 564


Welcome to WebmasterWorld, Nitt_Hyman!

the pages will be gone one day. The only problem is you will never know when that "one day" will be.

Correct. I have sites that have been closed for 3 years; their domain names are freely available for anyone to register, yet Google still ranks some pages and images from those sites. I actually re-bought one of those domain names and 301'd it to my main site, yet Google still ranks the 3-year-old pages and images!

I gave up worrying about this sort of thing years ago. If G cannot understand that a site no longer exists, what chance is there of them getting anything else correct?
5:58 pm on Oct 18, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15934
votes: 889


URL Inspection says 404 on live testing, and I also checked the server log: Googlebot is hitting these 410 URLs.

Well, which is it? 404 or 410?
4:34 am on Oct 21, 2019 (gmt 0)

New User from IN 

joined:Oct 18, 2019
posts:2
votes: 0


I was also working on my project and facing the same kind of issue. It's an e-commerce website and the site structure has been changed 2-3 times, so lots of URLs that were indexed under the old structure no longer exist after the URL structure changed. Many of those old URLs on www.example.com are still present in Google in bulk.
I'm also applying the same method to get rid of all the old URLs.



[edited by: not2easy at 4:45 am (utc) on Oct 21, 2019]
[edit reason] anonymized domain/ToS [/edit]