Changed to HTTPS. Should I change WMT profile also?

     

Lightguy1

9:44 pm on Feb 2, 2009 (gmt 0)

5+ Year Member



Hello,

I changed my entire site over to HTTPS from HTTP. I have my HTTP site set up in Google Webmaster Tools, and I have just fed it a new sitemap with all my https URLs; it seems to be taking them.
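The new sitemap just swaps the scheme to https in each <loc> entry, something like this (the page name is a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://www.example.com/</loc>
</url>
<url>
<loc>https://www.example.com/products/some-item.htm</loc>
</url>
</urlset>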

Should I "add a new site" to my webmaster tool profiles specifying https?

Currently it is: http://www.example.com/sitemap

Is it worth deleting this site from my webmaster tools profile and adding a new site with the proper HTTPS ?

Thanks

[edited by: Robert_Charlton at 10:58 pm (utc) on Feb. 2, 2009]
[edit reason] changed to example.com - it can never be owned [/edit]

bwnbwn

1:25 am on Feb 3, 2009 (gmt 0)

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 5+ Year Member



Yes, and I would 301 all the old URLs to the new URLs, as http and https are in effect two different URLs, the same as http://example.com and http://www.example.com.
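If you are on Apache, the redirect (plus the www canonicalization) is a couple of mod_rewrite rules along these lines; ISAPI_Rewrite 3 on IIS accepts the same Apache-style syntax. The host name is example.com for illustration only:

RewriteEngine On
# Send any plain-http request to the https version of the same URL (301)
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
# Fold the bare domain into the www host as well
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]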

Robert Charlton

1:59 am on Feb 3, 2009 (gmt 0)

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



I'm wondering why you changed your entire site from http to https.

Lightguy1

4:06 pm on Feb 5, 2009 (gmt 0)

5+ Year Member



I have already 301'd all the "old" URLs to the "new" URLs.

I switched my site to HTTPS because I was having issues with my cart dropping items, so why not give my customers security the whole time?

Once I add this new site to Google Webmaster Tools, should I delete the old (non-https) site from my profile?

Thanks

bwnbwn

4:19 pm on Feb 5, 2009 (gmt 0)

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 5+ Year Member



Yes.

I would also set up your robots.txt handling so that if Googlebot requests the http version, it is served a disallow-all rule, while the https version allows everything (or blocks just the folders you don't want spidered).

Lightguy1

4:50 pm on Feb 5, 2009 (gmt 0)

5+ Year Member



What do I set in my robots.txt file so Google will not crawl HTTP and will instead crawl HTTPS? I have the robots.txt pointing to my https sitemap, and Google seems to be indexing my new https URLs.

I also have a 301 in place for my old http URLs.
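For reference, what I have now looks roughly like this (the sitemap path is just a placeholder):

User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml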

bwnbwn

5:00 pm on Feb 5, 2009 (gmt 0)

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 5+ Year Member



Yes, but both the http and https versions can and will be indexed.
Is your site a static site or a dynamic site?

Lightguy1

5:48 pm on Feb 5, 2009 (gmt 0)

5+ Year Member



I have about 10 static pages and about 15,000 dynamic pages. However, I use ISAPI_Rewrite so the URLs appear static (it removes all the dynamic characters and makes the links look static).
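The rules just map static-looking URLs onto the real dynamic pages, roughly like this (ISAPI_Rewrite 3 accepts Apache mod_rewrite syntax; product.aspx and the id parameter are placeholder names):

RewriteEngine On
# Serve the static-looking URL from the underlying dynamic page
RewriteRule ^products/([0-9]+)\.htm$ /product.aspx?id=$1 [L]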

Do I need to create this new profile for my HTTPS website, or can I keep my existing profile for HTTP?

Thanks

bwnbwn

6:26 pm on Feb 5, 2009 (gmt 0)

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 5+ Year Member



No, just make a robots.aspx. I'm not sure what version of ISAPI_Rewrite you are using, but you should be able to adapt this to your code. The code below serves the http rules and disallows all of https, so all you need to do is reverse it to match your needs.
This is for a Microsoft (IIS) server.

Add to the httpd.ini:
RewriteRule ^/robots.txt$ /robots.aspx [NC,O]

Make two robots files, one robots.txt and one robots.aspx, and put the code below in robots.aspx once you have changed it so that http disallows all and https allows all (or disallows just the folders you don't want spidered).

<%If Request.ServerVariables("HTTPS") = "off" Then 'plain http request%>User-agent: *
Disallow: /cgi-bin/
Disallow: /cart/
<%
Else 'https request
%>User-agent: *
Disallow: /
<%
End If
%>
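Reversed to match your situation (http blocked entirely, https crawlable except whatever folders you pick; /cgi-bin/ and /cart/ are just the example folders from above), it would look something like this:

<%If Request.ServerVariables("HTTPS") = "off" Then 'plain http request: keep it out of the index%>User-agent: *
Disallow: /
<%
Else 'https request: crawl everything except these folders
%>User-agent: *
Disallow: /cgi-bin/
Disallow: /cart/
<%
End If
%>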

Be sure to test it as well:
http://www.example.com/robots.txt should show disallow all.
The https version of robots.txt should show allow all, or just the folders you don't want spidered.

tedster

7:06 pm on Feb 5, 2009 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



"so why not give my customers security the whole time?"

One reason is that https is a lot slower. That can be quite noticeable, depending on the visitor's web connection, and can result in lost sales.

bwnbwn

7:34 pm on Feb 5, 2009 (gmt 0)

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 5+ Year Member



tedster, I agree, but he has already gone and gotten it indexed. I believe trying to switch back now would do more harm than good.

Lightguy1

9:16 pm on Feb 5, 2009 (gmt 0)

5+ Year Member



My site is not that friendly for dial-up customers anyway. I have a lot of info per item, and my customers always want pictures and lots of specs. Hopefully the switch to HTTPS will not show a decrease in web sales. So far it has not.

I had a customer once call and ask me why my whole site wasn't secure, because when he goes to his banking site everything is secure. Makes sense for a bank.

Robert Charlton

9:51 pm on Feb 5, 2009 (gmt 0)

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Security certificates are hostname specific. So canonicalization issues and subdomains will most likely trigger a dialogue box asking users whether they want to proceed. This can be inhibiting to users.

bwnbwn

9:52 pm on Feb 5, 2009 (gmt 0)

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 5+ Year Member



Lightguy1,
No problem, just a couple of points you may not have known. You know your site and your customers better than anybody. Just try to keep the http version from getting indexed and you should be fine.

Lightguy1

2:54 pm on Feb 6, 2009 (gmt 0)

5+ Year Member



Is there a specific command I can place in my robots.txt to make it so that http won't get indexed and https will?

I already have robots.txt pointing to my https sitemap, and it seems Google is indexing my https URLs. I have a 301 redirect in place for all the old http URLs.

Robert Charlton

9:27 pm on Feb 6, 2009 (gmt 0)

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Is there a specific command I can place in my robots.txt to make it so that http won't get indexed and https will?

In my experience, robots.txt is useless in fixing sitewide http/https canonical issues. The only fix I've seen work is mod_rewrite.

Lightguy1

9:36 pm on Feb 6, 2009 (gmt 0)

5+ Year Member



I have mod_rewrite in place. If ANY request comes into my site over HTTP, it gets a 301 redirect to HTTPS.

Do you guys think I need to add a new profile in Webmaster Tools for HTTPS, or can I just continue to use the HTTP profile I have been using?

Thanks

tedster

2:31 am on Feb 7, 2009 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I agree with bwnbwn in the first reply above: you should do it.
 
