Forum Moderators: coopster
My competitors all have a regular SSL cert (not a wildcard with a subdomain), but they are not getting https pages indexed in Yahoo or Google, even though you can pull those pages up by typing an https:// prefix. They also do not have PHP code that disallows robots for https. How are they able to achieve that?
I am using PHP code.
I had heard that creating a subdomain for your secure pages is the preferred method, but a wildcard SSL is 200 bones/yr vs 20 bones for a regular SSL. Can I just as effectively put in robots disallows on the regular SSL pages and get the same ranking potential?
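For reference, the "PHP code that disallows robots for https" mentioned above usually looks something like this: a check of `$_SERVER['HTTPS']` that emits a noindex header on secure requests. This is only a sketch; the variable behavior assumes a typical Apache + mod_php setup.

```php
<?php
// Sketch: tell crawlers not to index the https:// duplicate
// of a page. $_SERVER['HTTPS'] is set (usually to "on") when
// the request came in over SSL on most Apache setups.
if (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') {
    header('X-Robots-Tag: noindex, nofollow');
}
?>
```

Include that at the top of your pages (or in a common header file) and the plain http:// versions keep their normal, indexable headers.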
PHP can be used to do some redirection, but it is much easier at the HTTP server level, especially if you are using Apache.
You could use a separate subdomain just for SSL, but in many cases that is overkill. Another common strategy is a separate directory on the same domain, served on port 443.
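At the server level, the Apache approach could be sketched like this: in the port-443 virtual host, mark responses noindex and 301-redirect anything outside the secure directory back to plain http. This is illustrative only; the hostname, paths, and `/secure/` directory are made-up, the SSL directives are omitted for brevity, and it assumes mod_rewrite and mod_headers are enabled.

```apache
# Sketch of an SSL vhost (hypothetical names; SSLEngine/cert
# directives omitted). Requires mod_rewrite and mod_headers.
<VirtualHost *:443>
    ServerName www.example.com
    DocumentRoot /var/www/html

    # Keep the https:// duplicates out of the search indexes
    Header set X-Robots-Tag "noindex"

    # Anything outside the secure area goes back to plain http
    RewriteEngine On
    RewriteCond %{REQUEST_URI} !^/secure/
    RewriteRule ^(.*)$ http://www.example.com$1 [R=301,L]
</VirtualHost>
```

Doing it here instead of in PHP means static files (images, CSS, PDFs) get the same treatment without touching any application code.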