A few of my sites are having duplicate content issues with Bing. bingdude told me my robots.txt is not formatted correctly and that I need to set up rules establishing http://www.example.com/ as the canonical version, so that the https and non-www versions, as well as any URLs with added parameters, are ignored for the entire site. I think it is important to note that I have lots of folders. I did some digging on here and online and can't find a specific answer/example for my issue. Can someone give me some input and/or point me in the right direction?
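For reference, from what I've read robots.txt by itself can't declare a canonical, so I'm guessing the real fix involves 301 redirects. Here's a rough .htaccess sketch I pieced together (assuming Apache with mod_rewrite enabled; example.com is a placeholder) — not sure if this is what he means or how it would handle the extra parameters:

```apache
# Hypothetical sketch: force http://www.example.com/ as the canonical host
RewriteEngine On

# Redirect non-www to www
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Redirect https to http, since http://www is supposed to be canonical here
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```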
Thanks...