Regarding HTTPS handling URL Variants and Sitemaps in Search Console

     
4:17 pm on Oct 5, 2017 (gmt 0)

New User

joined:Oct 5, 2017
posts: 2
votes: 0


I'm looking for enlightenment. But to feel less confused would be good too :-)

OK, I just switched a website to HTTPS.

Now, in Google Search Console, I already had the original website URL listed there, confirmed, and working nicely.

So, right after changing to HTTPS, I added the other 3 possible variations of the website URL, as several guides suggested:

http:// (the original version of the site)
https:// (the new, and preferred main domain moving forward)
http://www (not wanted, not used)
https://www (not wanted, not used)

Aside: I think it odd that Google would want all types added?!

Right.

The next step was to submit my sitemap.xml to the HTTPS entry. Well, I did just that. And then I chose that one as my preferred main domain.
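For what it's worth, I assume the entries inside that sitemap.xml should now all point at the HTTPS addresses - something roughly like this (the page paths here are just placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mysite.com/</loc>
  </url>
  <url>
    <loc>https://mysite.com/some-page/</loc>
  </url>
</urlset>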

Here's where I get super confused..

I've read in some guides that I should also submit sitemaps for the other domains?

Perhaps with the old URLs listed?

This is either nonsense, or smart.

I don't know which though!

Secondly, if it is indeed SMART to do, I don't see how I CAN add sitemaps for other variants of my domain?

The server only hosts one version of my website, after all?..

If I link to http://mysite.com/sitemap.xml then it will just 301 redirect to the new https location... surely?
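For context, I assume the redirect doing that is the usual Apache canonicalization rule in my .htaccess - something roughly like this (mysite.com is just a placeholder):

# send every non-canonical request (plain HTTP and/or www) to the HTTPS non-www host
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^ https://mysite.com%{REQUEST_URI} [R=301,L]

As far as I can tell, that covers /sitemap.xml just like every other URL.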

And I can't physically add a sitemap.xml file to a different URL version on my server..

So I clearly need help understanding this part.

Lastly, and maybe related to the above..

In Search Console, when I view the new HTTPS version of the site, I notice under "Sitemaps" it tells me:

29 URLs submitted
5 URLs indexed

Huh? Why only 5 out of 29?

It's been like this for 2 or 3 days now. Is this something that will resolve itself, or do I need to do something?

Any help on the above would be gratefully received :-)

Cheers,

Jordan

PS. I appear to have also been hit by whatever Google algo change just happened, right before I did the HTTPS change... ugh

[edited by: goodroi at 5:11 pm (utc) on Oct 5, 2017]
[edit reason] Delinked examples [/edit]

9:29 pm on Oct 5, 2017 (gmt 0)

Full Member

10+ Year Member

joined:Feb 1, 2006
posts:271
votes: 2


Good questions.
I have similar experiences.
I'm looking forward to the answers.
11:12 pm on Oct 5, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15932
votes: 887


I think it odd that Google would want all types added?!
Paradoxically, you have to register all forms in order to tell Google which ones you don't want to use. This is easier to understand in the case of with/without www, since it is physically possible for example.com and www.example.com to be controlled by different people, to live on different servers and to have different content. (Can anyone point to a legitimate site that actually does this? Probably not, but the possibility is there.)

There is really no point to having multiple sitemaps, since your ordinary domain-name-canonicalization redirect means that all requests will end up receiving the same file at the same URL. Sure, it is possible to exempt sitemap.xml from the universal redirect. But why would you? It's not like robots.txt, where you want to avoid all possible excuses for the visitor to say “I tried to read it ::whine:: but I just couldn’t get in”.
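If you really wanted to, the exemption would just be one extra condition in front of the usual Apache rule - roughly this, with example.com standing in for the real domain:

# skip the canonicalization redirect for sitemap.xml only (possible, but rarely worth doing)
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/sitemap\.xml$
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^ https://example.com%{REQUEST_URI} [R=301,L]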
12:14 am on Oct 6, 2017 (gmt 0)

Administrator

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 10, 2004
posts:11869
votes: 244


I think it odd that Google would want all types added?!

the purpose of creating all 4 properties is to look for the unexpected.
in most cases 3 of the 4 properties should be essentially devoid of data.
if unintended exceptions show up in one or more of the 3 non-canonical properties, you might have some clues to a new or trending problem.
10:27 am on Oct 6, 2017 (gmt 0)

New User

joined:Oct 5, 2017
posts: 2
votes: 0


OK..

So I'll ignore the advice about adding sitemaps to the properties that I'm not using (which is good, because I don't even know how I could).

MORE important right now is this:

In Search Console, when I view the new HTTPS version of the site, I notice under "Sitemaps" it tells me:

29 URLs submitted
5 URLs indexed

Huh? Why only 5 out of 29?

It's been like this for 2 or 3 days now. Is this something that will resolve itself, or do I need to do something?

Cheers!
10:42 am on Oct 6, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:12913
votes: 893


I've read in some guides that I should also submit sitemaps for the other domains?
No, that is not correct.

Have only the one sitemap.xml, listing the HTTPS pages. Remove any old sitemap.xml file still using HTTP paths from your server.

However, there is no need to remove the old sitemap from the old HTTP property in GSC. All that data will eventually disappear as the new data populates the new HTTPS property report.
2:58 pm on Oct 6, 2017 (gmt 0)

New User from EE 

joined:Dec 28, 2016
posts:19
votes: 3


You can certainly try adding the sitemap.xml to the other properties in GSC, but if your server is set up correctly, all of those requests should just redirect to the sitemap.xml of the preferred HTTPS version, if I'm not mistaken.
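An easy way to check is a plain HEAD request against the old address, for example:

curl -I http://mysite.com/sitemap.xml

If the redirect is set up as expected, the response should be a 301 with a Location header pointing at https://mysite.com/sitemap.xml (mysite.com being a placeholder for the real domain).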