Forum Moderators: goodroi
Google is following those links and then going back to my other pages, so when I do site:www.mypage.com I am getting duplicate pages, one http and one https version.
Can I fix this using robots.txt and tell Google to just not go to that order page, or should I just hardcode http:// into all the links from the order page to the other pages?
If I use robots.txt to disallow, is this the correct syntax?
User-agent: *
Disallow: /orderpage.com
thanks in advance for your help
What you may have to do is serve a dynamic robots.txt (i.e. have your web server run robots.txt through your scripting language, or use a URL rewrite), so that when robots.txt is requested over https:// it returns:
User-agent: *
Disallow: /
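If you go the scripting route, here is a minimal sketch of the idea, assuming Python behind a WSGI-capable server; the function name is illustrative, and it relies on the server populating the standard `wsgi.url_scheme` key:

```python
# Minimal WSGI sketch: serve a blocking robots.txt on https://
# and a permissive one on http://. Mount this app at /robots.txt.
def robots_app(environ, start_response):
    if environ.get("wsgi.url_scheme") == "https":
        # Block all crawling of the https:// duplicate site
        body = b"User-agent: *\nDisallow: /\n"
    else:
        # Empty Disallow = allow everything on the http:// site
        body = b"User-agent: *\nDisallow:\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```

The same check could be done in PHP, ASP, etc.; the only requirement is that the script can see which scheme the request came in on.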
The other option is to reconfigure your web server so that the https:// site points to a separate folder, and to put your order page in there with its own robots.txt.
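For the separate-folder approach, the server config might look something like this (an Apache sketch; the hostnames and paths are placeholders, not taken from the original post):

```apache
# http:// site - normal document root
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/http-root
</VirtualHost>

# https:// site - its own folder with its own robots.txt
<VirtualHost *:443>
    ServerName www.example.com
    SSLEngine on
    DocumentRoot /var/www/https-root
</VirtualHost>
```

The robots.txt placed in /var/www/https-root would then contain the "Disallow: /" rule, while the one in /var/www/http-root stays permissive.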
But yes, do change the links on your order page as well to use the fully qualified URL with [....]