Msg#: 3840672 posted 4:19 pm on Feb 5, 2009 (gmt 0)
I would also set up my robots.txt handling so that if Googlebot requests the HTTP version, your robots.txt sends it a disallow-all rule, while the HTTPS version allows everything (or blocks just the folders you don't want spidered).
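As a sketch of the two policies described above (the folder names are just examples, not from the original poster's site), the bodies served on each scheme might look like:

```text
# Served on http:// requests - block everything
User-agent: *
Disallow: /

# Served on https:// requests - allow all except private folders
User-agent: *
Disallow: /cgi-bin/
Disallow: /cart/
```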
Msg#: 3840672 posted 6:26 pm on Feb 5, 2009 (gmt 0)
No, just make a robots.aspx. I'm not sure what version of ISAPI Rewrite you are using, but you should be able to adapt this to your setup. The code below serves the rules on HTTP and disallows all of HTTPS, so all you need to do is reverse it to match your needs. This is for a Microsoft (IIS) server.
Add this to httpd.ini: RewriteRule ^/robots.txt$ /robots.aspx [NC,O]
Make two robots files, one robots.txt and one robots.aspx, and add the code below after you have changed the HTTP branch to disallow all and the HTTPS branch to allow all (or to disallow whatever you don't want spidered).
<%
If Request.ServerVariables("HTTPS") = "off" Then 'not secure (plain HTTP)
%>
User-agent: *
Disallow: /cgi-bin/
Disallow: /cart/
<% Else %>
User-agent: *
Disallow: /
<% End If %>
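The same branch can be sketched in Python for anyone not on IIS. This is a hypothetical illustration, not the poster's ASP code; it implements the reversed version the original poster asked for (disallow all on HTTP, selective disallow on HTTPS), and the folder names are the examples from the thread.

```python
def robots_body(https_flag: str) -> str:
    """Return the robots.txt text to serve for a request.

    https_flag mimics IIS's HTTPS server variable: "off" for plain
    HTTP, "on" for HTTPS. The paths /cgi-bin/ and /cart/ are the
    example folders used elsewhere in this thread.
    """
    if https_flag == "off":
        # Plain HTTP: block the whole site so only HTTPS gets indexed
        return "User-agent: *\nDisallow: /\n"
    # HTTPS: allow crawling except the private folders
    return "User-agent: *\nDisallow: /cgi-bin/\nDisallow: /cart/\n"
```

Your web framework would call this with the actual scheme of the incoming request and return the string as text/plain.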
Be sure to test it as well: http://www.example.com/robots.txt should show disallow all, and https://www.example.com/robots.txt should show allow all (or just the folders you don't want spidered).
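For that test step, a small helper like the following (a hypothetical sketch, not part of the thread's setup) can classify a fetched robots.txt body, so a quick script can confirm the HTTP response is disallow-all and the HTTPS response is not:

```python
def is_disallow_all(robots_text: str) -> bool:
    """Return True if the robots.txt body blocks the whole site.

    Looks for a bare "Disallow: /" rule; in robots.txt a single
    slash means every path on the host.
    """
    for line in robots_text.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "disallow" and value.strip() == "/":
            return True
    return False
```

Fetch each URL (e.g. with urllib), pass the body through this function, and check that the HTTP version returns True and the HTTPS version returns False.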
Msg#: 3840672 posted 9:16 pm on Feb 5, 2009 (gmt 0)
My site is not that friendly for dial-up customers anyway. I have a lot of info per item, and my customers always want pictures and lots of specs. Hopefully the switch to HTTPS will not show a decrease in web sales; so far it has not.
I had a customer call and ask me why my whole site wasn't secure, since when he goes to his banking site everything is secure. Makes sense for a bank.
Msg#: 3840672 posted 9:51 pm on Feb 5, 2009 (gmt 0)
Security certificates are hostname-specific, so canonicalization issues and subdomains will most likely trigger a dialog box asking users whether they want to proceed. That can put users off.