| 1:25 am on Feb 3, 2009 (gmt 0)|
Yes, and I would 301 all the old URLs to the new URLs. HTTP and HTTPS are in effect two different URLs, the same way http://example.com and http://www.example.com are.
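As a sketch of that blanket redirect (assuming an Apache server with mod_rewrite enabled; the exact rule depends on your setup), an .htaccess fragment might look like:

```apache
# Send every HTTP request to the same URL on HTTPS with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```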
| 1:59 am on Feb 3, 2009 (gmt 0)|
I'm wondering why you changed your entire site from http to https.
| 4:06 pm on Feb 5, 2009 (gmt 0)|
I have already 301'd all the "old" URLs to the "new" URLs.
I switched my site to HTTPS because I was having issues with my cart dropping items, so why not give my customers security the whole time?
Once I add this new site to G webmaster tools, should I delete the old (non-HTTPS) site from my profile?
| 4:19 pm on Feb 5, 2009 (gmt 0)|
And I would also set up your robots.txt so that if Googlebot requests the HTTP version, it gets a disallow-all rule, while the HTTPS version allows all (or blocks whatever folders you don't want spidered).
| 4:50 pm on Feb 5, 2009 (gmt 0)|
What do I set in my robots.txt file so Google will not crawl HTTP and will instead crawl HTTPS? I have the robots.txt pointing to my HTTPS sitemap, and Google seems to be indexing my new HTTPS URLs.
I also have a 301 in place for my old HTTP URLs.
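For reference, pointing robots.txt at an HTTPS sitemap is just a Sitemap line; the sitemap filename here is an assumption:

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```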
| 5:00 pm on Feb 5, 2009 (gmt 0)|
Yes, but both the HTTP and HTTPS versions can and will be indexed.
Is your site a static site or a dynamic site?
| 5:48 pm on Feb 5, 2009 (gmt 0)|
I have about 10 static pages and about 15,000 dynamic pages. However, I use ISAPI rewrite so the URLs appear static (it removes all the dynamic characters and makes the links look static).
Do I need to create a new profile for my HTTPS website, or can I keep my existing HTTP profile?
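For anyone curious, a static-looking rewrite of that kind in an ISAPI_Rewrite httpd.ini might look like the following; the flags vary between ISAPI_Rewrite versions, and the script name and parameter are made up for illustration:

```ini
# httpd.ini fragment (ISAPI_Rewrite; flag syntax varies by version)
# Map a static-looking URL like /item/12345.html to the real
# dynamic page -- product.aspx and the id parameter are hypothetical
RewriteRule ^/item/(\d+)\.html$ /product.aspx?id=$1 [NC,L]
```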
| 6:26 pm on Feb 5, 2009 (gmt 0)|
No, just make a robots.aspx. I'm not sure what version of ISAPI you are using, but you should be able to adapt this to your code. The code below branches on HTTP vs. HTTPS, so all you need to do is adjust it to match your needs.
This is for a Microsoft server.
Add this to httpd.ini:
RewriteRule ^/robots.txt$ /robots.aspx [NC,O]
Make two robots files, one robots.txt and one robots.aspx, and add the code below, with HTTP set to disallow all and HTTPS set to allow all (or to disallow whatever you don't want spidered).
<%If Request.ServerVariables("HTTPS") = "off" Then 'if not secure, block everything%>User-agent: *
Disallow: /
<%Else 'secure, allow all (or list folders to disallow)%>User-agent: *
Disallow:
<%End If%>
Be sure to test it as well:
http://www.example.com/robots.txt should show disallow all
https://www.example.com/robots.txt should show allow all, or just the folders you don't want spidered.
| 7:06 pm on Feb 5, 2009 (gmt 0)|
|so why not give my customers security the whole time? |
One reason is that HTTPS is a lot slower; it can be quite noticeable depending on the visitor's web connection and can result in lost sales.
| 7:34 pm on Feb 5, 2009 (gmt 0)|
tedster, I agree, but he has already gone and gotten it indexed. I believe trying to switch back now would do more harm than good.
| 9:16 pm on Feb 5, 2009 (gmt 0)|
My site is not that friendly for dial-up customers anyway. I have a lot of info per item, and my customers always want pictures and lots of specs. Hopefully the switch to HTTPS will not cause a decrease in web sales. So far it has not.
I had a customer call and ask me why my whole site wasn't secure, because when he goes to his banking site everything is secure. Makes sense for a bank.
| 9:51 pm on Feb 5, 2009 (gmt 0)|
Security certificates are hostname-specific, so canonicalization issues and subdomains will most likely trigger a dialog box asking users whether they want to proceed. This can be off-putting to users.
| 9:52 pm on Feb 5, 2009 (gmt 0)|
No problem, just a couple of points that you may not have known. You know your site and customers better than anybody. Just try to keep the HTTP version from getting indexed and you should be fine.
| 2:54 pm on Feb 6, 2009 (gmt 0)|
Is there a specific directive I can place in my robots.txt so that HTTP won't get indexed and HTTPS will?
I already have robots.txt pointing to my HTTPS sitemap, and it seems Google is indexing my HTTPS URLs. I have a 301 redirect in place for all the old HTTP URLs.
| 9:27 pm on Feb 6, 2009 (gmt 0)|
|Is there a specific directive I can place in my robots.txt so that HTTP won't get indexed and HTTPS will? |
In my experience, robots.txt is useless for fixing sitewide HTTP/HTTPS canonical issues. The only fix I've seen work is mod_rewrite.
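For completeness, mod_rewrite can also serve a different robots file per scheme, similar to the ISAPI approach earlier in the thread; robots_http.txt and robots_https.txt are hypothetical filenames:

```apache
# Serve a blocking robots file over HTTP and the normal one over HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^robots\.txt$ /robots_http.txt [L]
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ /robots_https.txt [L]
```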
| 9:36 pm on Feb 6, 2009 (gmt 0)|
I have mod_rewrite in place. If ANY request comes into my site over HTTP, it gets a 301 redirect to HTTPS.
Do you guys think I need to add a new profile in Webmaster Tools for HTTPS, or can I just continue to use the HTTP profile I have been using?
| 2:31 am on Feb 7, 2009 (gmt 0)|
I agree with bwnbwn in the first reply above - you should do it.