
Please help with http/https problem

     
6:35 pm on Sep 27, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 14, 2003
posts:1783
votes: 0


Within the last couple of weeks, some non-secure pages of mine that rank very well on Google have been indexed by Google under the https protocol. Where before the address www.example.com/widgets.html appeared in Google's results, I now see [example.com...]

A couple of very well-respected members of this forum have said that I'm facing a potential duplicate content penalty from Google. In fact, I think I'm already seeing something to that effect.

I'm on an IIS shared server. The hosting company does not allow ISAPI to be installed for rewrites, because of past experiences with the rewrites affecting other sites.

One option is for me to switch to a VPS server where I can use the ISAPI rewrites, and do a 301 redirect on the pages in question so that they don't lose their ranking.
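
From what I've read, a page-level version of that 301 would look something like the snippet below in an .aspx code-behind. This is just a sketch I've pieced together, so please correct me if it's wrong:

protected void Page_Load(object sender, EventArgs e)
{
    // If the request came in over SSL, send a permanent (301) redirect
    // to the same page on plain http
    if (Request.IsSecureConnection)
    {
        string httpUrl = "http://" + Request.Url.Host + Request.Url.PathAndQuery;
        Response.Status = "301 Moved Permanently";
        Response.AddHeader("Location", httpUrl);
        Response.End();
    }
}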

One tech person at the hosting company suggested I try the code below on my current server by creating a separate .aspx page in my directory:

// Check whether the current request came in over SSL
if (Request.IsSecureConnection)
{
    // request is secure, so write "HTTPS" into the page
    Response.Write("HTTPS");
}
else
{
    // otherwise write "HTTP"
    Response.Write("HTTP");
}

I know so little about servers that I don't know exactly what the above code would do. Would all requests for https pages be redirected to http pages, or would it only redirect visitors whose browsers were not asking for a page over SSL?

Assuming the latter, there's still the problem of the pages that Google now has ranking well under the https address. My fear is that a redirect like that will result in the pages disappearing from the search results.

Any suggestions or advice is very much appreciated.

8:00 pm on Sept 29, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3603
votes: 54


Do you have a robots.txt file? Do you have a rewrite rule added to the server? If you have them, it is an easy fix.
Make a robots.txt file, then make another robots file but name it robots.aspx, and add the code below to the top of the page:

<%If Request.ServerVariables("HTTPS") = "off" Then 'request is plain http, serve the normal rules%>User-agent: *
Disallow: /make it what u want/
Disallow: /add anything here/
<%
else 'request came in over https, so block crawling entirely
%>User-agent: *
Disallow: /
<%
end if
%>

Add whatever you want the non-secure rules to be, and the https side will disallow (noindex) all the https stuff.

This has worked for me for years. You can check after by typing in http://example.com/robots.txt
and
[example.com...]

One should allow what you want; the other should disallow all.
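
For this to work, the server has to hand out robots.aspx whenever robots.txt is requested. If your host runs Helicon ISAPI_Rewrite 2.x (just my assumption; your rewrite tool may differ), the rule in httpd.ini would be roughly like this:

[ISAPI_Rewrite]
# Serve the dynamic robots.aspx whenever /robots.txt is requested
RewriteRule /robots\.txt /robots.aspx [I,L]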

PS: As fast as you can, move the https out of your domain name to a subdomain, i.e. [secure.example.com...]. This will keep your site from ever getting indexed under https.

2:43 pm on Oct 6, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 14, 2003
posts:1783
votes: 0


Thanks for the reply, bwnbwn. I had my host install ISAPI on my server. I also went to one of the websites where you can ask for bids from a pro to do a particular job. I asked for bids to do exactly as you described. So far I have two bids, $250 and $350. One bidder said it would take 1-2 days.

Correct me if I'm wrong, but this sounds like it should be very straightforward. If I just write the robots.aspx page as you describe and upload it to the server, is that all that needs to be done? Or is there something in ISAPI that needs to be done as well?

I don't want to bleed you for free advice, but if you can point me to a how-to article, that would be much appreciated.

5:53 pm on Oct 6, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3603
votes: 54


dickbaker
You are not bleeding me at all. Glad to help you save 250 bucks on something that only takes a minute to do.
If you have the ASP rewrite rule added to the server, then just make the robots.aspx page and put in the code.

Upload it to the server via FTP, and then:

Check to make sure the robots.txt page is displaying what you want, and then check by going to
[example.com...], which should show a disallow all.

<%If Request.ServerVariables("HTTPS") = "off" Then 'if not secure%>User-agent: *
Disallow: /make it what u want/
Disallow: /add anything here/
<%
else
%>User-agent: *
Disallow: /
<%
end if
%>

Add to the file whatever, if anything, you want disallowed in the non-secure robots.txt, and https will be disallow all.

Be sure that anywhere you have a link to the cart that is in https, you make that link a "nofollow" as well.
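
For example, a cart link would look something like this (the URL is just a placeholder for your own cart page):

<!-- rel="nofollow" tells crawlers not to follow the link into the https URL -->
<a href="https://www.example.com/cart.aspx" rel="nofollow">View cart</a>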

Holler back if you have any issues, or send me an email and I will contact you if need be.

11:01 pm on Oct 8, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 14, 2003
posts:1783
votes: 0


bwnbwn, thanks for not considering this "bleeding."

I created an .aspx file without any LANGUAGE="VBSCRIPT" declaration or such at the beginning of the code (I don't know if that was the right thing to do or not).

The file reads as follows:

<%If Request.ServerVariables("HTTPS") = "off" Then 'if not secure%>User-agent: *
Disallow:
<%
else
%>User-agent: *
Disallow: /
<%
end if
%>

I don't know if there should be anything else, but when going to [example.com...] I see User-agent:* disallow:/ on the page.

One of the tech support guys tried a robots simulator, but couldn't tell if the script was disallowing the bots from reading https files. He thought that perhaps there should be something written in the ISAPI file.

Is there a resource you're aware of for what's supposed to be set in ISAPI?

Thanks for indulging my ignorance.

1:13 pm on Oct 9, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3603
votes: 54


Don't test with [example.com...], but with [myexample.com...], as this is what you need to make sure is working. Do both the non-secure and the https versions of robots.txt and make sure they are correct.
Then, after checking, put the URLs in Google's robots.txt checker in the Webmaster Tools area to test, and you will see the bot is seeing the correct robots.txt file.

10:33 pm on Oct 9, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 14, 2003
posts:1783
votes: 0


Thanks, bwnbwn. I did as you suggested, and Webmaster Tools shows "Allow" for all http pages and "not in domain" for the https URLs.

Now it's time to redirect the non-www URL.

12:45 pm on Oct 10, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3603
votes: 54


Good. Here is a good thread on doing the 301 from non-www to www.

Note! Be aware that if the secure certificate is also on [mydomain.com...], it will cause an issue. This is one reason I got my SSL out from under the domain name and put it under a subdomain.

[webmasterworld.com...]
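
The short version, if your host has Helicon ISAPI_Rewrite 2.x installed (again, just my assumption about your setup), is a couple of lines like these in httpd.ini; example.com is only a placeholder for your own domain:

[ISAPI_Rewrite]
# 301 anything requested on example.com over to www.example.com
RewriteCond Host: ^example\.com$
RewriteRule (.*) http\://www.example.com$1 [I,RP]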

You going to Pubcon?

10:56 pm on Oct 10, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 14, 2003
posts:1783
votes: 0


Thanks again, bwnbwn. I hadn't thought about the https issue. My secure links are [example.com....] I'm going to have to ask the hosting company about a separate section for secure.example.com.

I'd love to go to Pubcon (or anywhere, for that matter), but the problems with the financial markets are affecting my sales, so that's a no-go for this year.