
Forum Moderators: Robert Charlton & aakk9999 & andy langton & goodroi

Changed to HTTPS. Should I change WMT profile also?

     
9:44 pm on Feb 2, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 31, 2008
posts: 114
votes: 0


Hello,

I changed my entire site over to HTTPS from HTTP. I have my HTTP site set up in Google Webmaster Tools, and I have just fed that profile a new sitemap with all my https URLs; it seems to be taking them.
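
For reference, the new sitemap just lists the https version of every URL, something along these lines (the paths here are made-up examples):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/widgets/blue-widget.html</loc>
  </url>
</urlset>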

Should I "add a new site" to my webmaster tool profiles specifying https?

Currently it is: http://www.example.com/sitemap

Is it worth deleting this site from my Webmaster Tools profile and adding a new site with the proper HTTPS?

Thanks

[edited by: Robert_Charlton at 10:58 pm (utc) on Feb. 2, 2009]
[edit reason] changed to example.com - it can never be owned [/edit]

1:25 am on Feb 3, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3496
votes: 6


Yes, and I would 301 all the old URLs to the new URLs, as http and https are in effect two different URLs, the same way http://example.com and http://www.example.com are.
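
If the redirect isn't already in place, the usual shape of the rule on an Apache server with mod_rewrite is something like this in .htaccess (www.example.com standing in for the real host; on IIS, ISAPI_Rewrite takes an equivalent rule):

# send every http request to its https equivalent with a permanent redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
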
1:59 am on Feb 3, 2009 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:11440
votes: 202


I'm wondering why you changed your entire site from http to https.

4:06 pm on Feb 5, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 31, 2008
posts:114
votes: 0


I have already 301'd all the "old" URLs to the "new" URLs.

I switched my site to HTTPS because I was having issues with my cart dropping items, so why not give my customers security the whole time?

Once I add this new site to Google Webmaster Tools, should I delete the old site (non-https version) from my profile?

Thanks

4:19 pm on Feb 5, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3496
votes: 6


Yes.

And I would also have the robots.txt file set up so that if Googlebot calls the http version, it gets a disallow-all rule, while the https version allows all or blocks just the folders you don't want spidered.
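
In other words, the two versions of robots.txt would look something like this (with /cgi-bin/ and /cart/ as placeholder folders for whatever you actually want blocked):

# what the http version should return - keep the whole http copy out of the index
User-agent: *
Disallow: /

# what the https version should return - crawl everything except what you choose to block
User-agent: *
Disallow: /cgi-bin/
Disallow: /cart/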

4:50 pm on Feb 5, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 31, 2008
posts:114
votes: 0


What do I set in my robots.txt file so Google will not crawl HTTP and will instead crawl HTTPS? I have the robots.txt pointing to my https sitemap, and Google seems to be indexing my new https URLs.

I also have a 301 in place for my old http URLs.
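
For reference, pointing robots.txt at the https sitemap is just the one-line Sitemap directive, something like this (assuming the sitemap lives at /sitemap.xml):

User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml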

5:00 pm on Feb 5, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3496
votes: 6


Yes, but both the http and https versions can and will be indexed.
Is your site a static site or a dynamic site?

5:48 pm on Feb 5, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 31, 2008
posts:114
votes: 0


I have about 10 static pages and about 15,000 dynamic pages. However, I use a rewrite (ISAPI_Rewrite) so the URLs appear static; it removes all the dynamic characters and makes the links look static.
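
The rule is along these lines in httpd.ini (the /widgets/ path, the product.aspx script and the id parameter here are made up, and the exact escaping and flags depend on the ISAPI_Rewrite version installed):

# make /widgets/1234.html serve the dynamic page /product.aspx?id=1234 - illustrative only
RewriteRule ^/widgets/(\d+)\.html$ /product\.aspx\?id=$1 [I,L]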

Do I need to create this new profile for my HTTPS website or can I keep my existing profile for HTTP?

Thanks

6:26 pm on Feb 5, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3496
votes: 6


No, just make a robots.aspx. I'm not sure what version of ISAPI_Rewrite you are using, but you should be able to fit this to your code. The code below allows the http side and disallows all of https, so all you need to do is reverse it to match your needs.
This is for a Microsoft server.

Add this to httpd.ini:
RewriteRule ^/robots.txt$ /robots.aspx [NC,O]

Make two robots files, one robots.txt and one robots.aspx. Add the code below after you have changed the http side to disallow all and the https side to allow all, or to disallow just the folders you don't want spidered.

<%If Request.ServerVariables("HTTPS") = "off" Then 'if not secure (plain http)%>User-agent: *
Disallow: /cgi-bin/
Disallow: /cart/
<%
Else 'secure https request
%>User-agent: *
Disallow: /
<%
End If
%>

Be sure to test it as well:
http://www.example.com/robots.txt should show disallow all,
and [example.com...] should show allow all, or just the folders you don't want spidered.
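
For your case (block the http copy, leave https open), the reversed version of that robots.aspx code would look roughly like this, with /cgi-bin/ and /cart/ carried over as placeholder folders:

<%If Request.ServerVariables("HTTPS") = "off" Then 'plain http request - keep it out of the index%>User-agent: *
Disallow: /
<%
Else 'secure https request - normal crawling
%>User-agent: *
Disallow: /cgi-bin/
Disallow: /cart/
<%
End If
%>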

7:06 pm on Feb 5, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


"so why not give my customers security the whole time?"

One reason is that https is a lot slower - it can be quite noticeable, depending on the visitor's web connection, and can result in lost sales.

7:34 pm on Feb 5, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3496
votes: 6


tedster, I agree, but he has already gone and gotten it indexed. I believe trying to switch back now would do more harm than good.

9:16 pm on Feb 5, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 31, 2008
posts:114
votes: 0


My site is not that friendly for dialup customers anyway. I have a lot of info per item, and my customers always want pictures and lots of specs. Hopefully the switch to HTTPS will not show a decrease in web sales; so far it has not.

I had a customer once call and ask me why my whole site wasn't secure, because when he goes to his banking site everything is secure. Makes sense for a bank.

9:51 pm on Feb 5, 2009 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:11440
votes: 202


Security certificates are hostname specific. So canonicalization issues and subdomains will most likely trigger a dialogue box asking users whether they want to proceed. This can be inhibiting to users.

9:52 pm on Feb 5, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3496
votes: 6


Lightguy1,
No problem, just a couple of points that you may not have known. You know your site and customers better than anybody. Just try to keep the http version from getting indexed and you should be fine.

2:54 pm on Feb 6, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 31, 2008
posts:114
votes: 0


Is there a specific command I can place in my robots.txt to make it so http won't get indexed and https will?

I already have robots.txt pointing to my https sitemap, and it seems Google is also indexing my https URLs. I have a 301 redirect in place for all my old http URLs.

9:27 pm on Feb 6, 2009 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:11440
votes: 202


"Is there a specific command I can place in my robots.txt to make it so http won't get indexed and https will?"

In my experience, robots.txt is useless in fixing sitewide http/https canonical issues. The only fix I've seen work is mod_rewrite.

9:36 pm on Feb 6, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 31, 2008
posts:114
votes: 0


I have mod_rewrite in place. If ANY request comes into my site over HTTP, it gets a 301 redirect to HTTPS.

Do you guys think I need to add a new profile in Webmaster Tools for HTTPS, or can I just continue using the HTTP profile I have been using?

Thanks

2:31 am on Feb 7, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


I agree with bwnbwn in the first reply above - you should do it.