Forum Moderators: goodroi


Coverage: Submitted URL marked ‘noindex’


seemasharma

12:24 pm on Dec 23, 2021 (gmt 0)



Hello,
I have a client website. The website is new, but all pages have the "index, follow" tag. However, when I send a manual indexing request through Search Console, it shows the error "Coverage: Submitted URL marked 'noindex'". What is the reason for this error?
(No Links Please)


[edited by: not2easy at 1:27 pm (utc) on Dec 23, 2021]
[edit reason] Please see ToS [webmasterworld.com] [/edit]

not2easy

2:06 pm on Dec 23, 2021 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Hello seemasharma and Welcome to WebmasterWorld [webmasterworld.com]

Because "index, follow" is the default for Google, that would be their understanding of the page even if there is no robots meta tag in the page header. You may need to look at other things that might signal 'noindex' on the site.

If it is a WordPress site, there is a sitewide noindex setting (actually a robots.txt directive preventing crawl) that might have been switched on during development of the site. That happens frequently enough that some plugins will alert you to the problem.

Of course, we don't know that it is a WP site so you might have the same problem caused by a mistake in your robots.txt file. I would check there first if you are certain about the meta tag syntax.
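For reference, the accidental sitewide block that sometimes survives from development looks like this in robots.txt (a generic illustration, not taken from the poster's site):

```text
# Blocks all crawlers from the entire site - fine during
# development, fatal for indexing once the site goes live.
User-agent: *
Disallow: /
```

A stray `<meta name="robots" content="noindex">` left in the page template would have a similar effect, so it is worth double-checking both.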

lucy24

4:16 pm on Dec 23, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If it is a WordPress site, there is a sitewide noindex (actually a robots.txt directive preventing crawl) that might have been checked off during development of the site.
I think we can exclude this possibility, because if the page had not been crawled, the search engine could not have seen the “noindex” directive.

Another possibility is that there is a directive somewhere in config/htaccess
Header set X-Robots-Tag "noindex"
that applies to the page in question.

If this is Google we are talking about--as suggested by the reference to “search console”--another very real possibility is that GSC is delusional.

phranque

9:10 pm on Dec 23, 2021 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



try crawling your site with a tool such as Screaming Frog SEO Spider and see if it indicates what the issue is.

NickMNS

9:54 pm on Dec 23, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




Another possibility is that there is a directive somewhere in config/htaccess
Header set X-Robots-Tag "noindex"
that applies to the page in question.

Open the page with dev-tools in your browser and go to the "Network" tab, then check the response headers. That view shows all the headers returned with the response; if X-Robots-Tag is set, it will show up there.
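A command-line equivalent of that dev-tools check, if you have curl available (the URL here is a placeholder for the affected page):

```shell
# Fetch only the response headers (-I) quietly (-s) and
# filter for an X-Robots-Tag header, case-insensitively.
curl -sI "https://www.example.com/affected-page/" | grep -i "x-robots-tag"
```

If the command prints nothing, the server is not sending that header for the page.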

phranque

3:41 am on Dec 24, 2021 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



if X-Robots-Tag is used it will show it there

you can also get this information from a Screaming Frog crawl - for all the pages crawled.