Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Share Your Opinion on Https

         

smilie

10:55 pm on May 3, 2017 (gmt 0)



This is especially true for small businesses, personal sites, and blogs.

Have you wondered if https will be good for your site?

The answer is NO.

Remember all those bad links you were disavowing? Well, guess what: with https these are NEW, newly discovered "bad" links that Google will show as recently found in GWT. Except, of course, you don't know which ones are good and which aren't, and have to guess, and they may change their mind tomorrow or 5 years from now. But it still feeds Penguin.

The time has come to discuss this and shame Google into no longer adding negative weight to links.

cabbie

5:24 am on May 6, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The 301 redirect code
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

does not redirect from non-www to
https://www.

I think.


[edited by: not2easy at 6:05 am (utc) on May 6, 2017]
[edit reason] readability [/edit]

not2easy

6:37 am on May 6, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



You're right, cabbie. %{HTTP_HOST} carries over whatever host was requested, so that rule alone won't add the www. If you want to use code similar to the rule keyplyr posted above that also redirects to www, it is:
RewriteEngine On
# Redirect ANY HTTP request to https and www
RewriteCond %{SERVER_PORT} =80 [OR]
RewriteCond %{HTTP_HOST} !^(www\.example\.com)?$
RewriteRule (.*) https://www.example.com/$1 [R=301,L]

Note that you should always test pages in subdirectories, as rewrite rules are commonly not inherited: an htaccess file in a subdirectory can override the parent's rules. You may need to add (or edit) an htaccess file in each folder where you serve pages. Try visiting example.com/folder/page.html (or whatever format your URLs have) to be sure they all redirect correctly.
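If you'd rather script that check than click through pages, here's a small sketch (not from this thread; the helper name and canonical host are made up) that computes the Location you'd expect the rule above to produce for any given URL, so you can compare it against what `curl -sI` actually reports:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical helper mirroring the rule above: anything that is not
# already https on the canonical www host should 301 there, keeping
# the path and query string intact.
def expected_target(url, canonical_host="www.example.com"):
    parts = urlsplit(url)
    if parts.scheme == "https" and parts.netloc == canonical_host:
        return None  # already canonical; no redirect expected
    return urlunsplit(("https", canonical_host, parts.path, parts.query, ""))

# Spot-check URL shapes, including pages in subdirectories:
print(expected_target("http://example.com/folder/page.html"))
# prints https://www.example.com/folder/page.html
```

Compare each result against the Location header the live site returns; a mismatch points at a folder whose htaccess is overriding the parent's rules.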


devGirl

8:58 pm on May 6, 2017 (gmt 0)

10+ Year Member



So if you've 301 redirected all http:// to https:// do you still have to register http:// in Google Search Console?

not2easy

10:32 pm on May 6, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Yes, you should have both versions. If you are changing an existing domain that has been listed in your GSC, this can help you check to be sure that all pages are now being indexed as https. You can check the old "Indexed" graph, then look at the https version. You can see the old site showing fewer and fewer indexed pages and the new https account showing a corresponding increase. If you remove the old http: version, you can miss seeing signals that there may be issues to take care of.

It isn't a strict requirement, but it makes sense to have both.

iamlost

4:23 pm on May 7, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



IF your site has a form of any sort requiring user input/verification, or offers a download of any type, it is not only a best practice to serve at least all such pages via HTTPS, it may well be a legal requirement: not doing so can open you to various liabilities and give insurance providers grounds to walk. Remember that in our interconnected world it is not just the jurisdiction(s) where you live but where your server(s) reside, etc. Competent, qualified legal advice should be sought for a site's specific situation; one size does not fit all.

IF your site is basic web served pages without other interaction you still may want to consider:
* browsers are increasingly highlighting variously whether a connection is HTTPS or not.
--- increasingly, visitors see the 'padlock' as a symbol of trust.

* the benefits of serving via HTTP/2 instead of HTTP/1.1.

Regardless, it is a business decision.
Note: switching merely for an SE ranking boost is not only the absolute last reason to do so, it isn't a reason at all.

smilie

8:51 pm on May 8, 2017 (gmt 0)



>>@keyplyr, That is not accurate. If you have properly installed the 301 redirect

You are not following me. Not YOUR site links. External links, which is what Penguin is about.

Doesn't matter how correct your redirects are.

>>@tangor: The PROTOCOL has nothing to do with link juice or keywords.

It's great when Google lovers come out and make statements like this.

So. Tell me, are these identical or different links, from these two pages:
http://www.example.com/page1.html
https://www.example.com/page1.html

Since https is a completely different set of pages vs. http, and since these are two completely different URLs per Google, these are two different links.

So. If your page was penalized for a "blue widget" keyword because the Google god found a few bad EXTERNAL pages linking to it that they did not like, say there are 10 "bad" (from G's standpoint) pages linking to it. Now, with all these sites switching to https, those doubled to 20.

Because
http://www.example.com/page1.html
https://www.example.com/page1.html
are two different URLs.

tangor

8:56 pm on May 8, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



So. Tell me, are these identical or different links, from these two pages:

If the redirect is set up correctly there is only ONE url that works. If you are hoping to have cake and eat it, too, please let me know how to do that.

The OBJECT of going to https is to get away from HTTP.

And I'm pretty sure my posting history does not indicate a love of g. :)

I think this is mountains out of molehills, trying to find evil where it does not exist. There's not much you can do about external links except disavow them, and that has nothing to do with the PROTOCOL.

smilie

9:01 pm on May 8, 2017 (gmt 0)



But that's not how it works in real life.

In real life, Google already recorded once, since Penguin launched, that there's a bad link pointing from http://www.example.com/page1.html to your site. That's -1.

So now that everyone's switching to https, Google Penguin is recording, a second time, that there's a bad link from https://www.example.com/page1.html. That's -2.

The result is messing up a lot of smaller sites even more.

Now, if G admitted as much and TURNED NEGATIVE LINK WEIGHTS OFF,
that would make sense as far as this goes: "The OBJECT of going to https is to get away from HTTP."

But the objective is not that; the actual result is to shake up small sites even more.

tangor

9:06 pm on May 8, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



By your own reckoning these are two different pages. Each starts at zero until run through the algo. They would, therefore, be independent and score independently, not additively. You'd be no worse off than before, other than the fact that the site page is now given a plus factor for being HTTPS.

smilie

9:10 pm on May 8, 2017 (gmt 0)



Well, not from 0.

The Penguin has been running for years.

So, let's say your site is mysite.com/bluewidgets1.html , with main KW being "blue widgets".

Google Penguin already recorded once since Penguin launch that there's a bad link pointing to it from
http://www.example.com/page1.html to your site.
That's -1.

So, as soon as THAT SITE (not yours) switched to https, Penguin is recording a second negative link, from
https://www.example.com/page1.html to your site.
That's -2.

As other sites switch to https, this creates massive doubling of negative links, and an additional roller-coaster for sites with weak link profiles.

jmccormac

9:19 pm on May 8, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Currently working on measuring the use of HTTPS in approximately 1100 TLDs. It is not as common as people think and not every webmaster has drunk the Google Koolaid.

Regards...jmcc

NickMNS

9:41 pm on May 8, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



not every webmaster has drunk the Google Koolaid.

I did today and it was delicious. (I just converted my site to https:)

Now let's see how bad the hangover is going to be.

jmccormac

10:03 pm on May 8, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I did today and it was delicious. (I just converted my site to https:)

Now let's see how bad the hangover is going to be.
Hope your site doesn't get Jim Jonesed. :)

One long established ccTLD shows approx 4.13% HTTPS redirects.

Regards...jmcc

keyplyr

11:01 pm on May 8, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@smilie
You are not following me. Not YOUR site links. External links, which is what Penguin is about. Doesn't matter how correct your redirects are.
I understood what you were saying and it isn't accurate. Switching to HTTPS does not affect your backlinks. It does not affect your external backlinks. It does not affect those backlinks that may have switched to HTTPS either if you properly compiled your disavow links file without using protocols.

Penguin has nothing to do with switching to HTTPS.

You seem intent on finding something wrong with HTTPS and there just isn't any reason not to switch... and *every* reason to do so.
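The point about compiling the disavow file "without using protocols" is worth making concrete. Google's disavow format accepts either full URLs or domain: entries; the latter match the host regardless of scheme or subdomain, so they survive an http-to-https switch unchanged. The hostnames below are made up:

```text
# Lines starting with # are comments.
# domain: entries match every URL on the host, http or https, www or not:
domain:spammy-directory.example
domain:link-farm.example

# A full-URL entry, by contrast, is protocol-specific; this one would
# not cover the https version of the same page:
http://bad-neighborhood.example/links/page1.html
```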

tangor

11:30 pm on May 8, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



So, as soon as THAT SITE (not yours) switched to https, Penguin is recording a second negative link, from
https://www.example.com/page1.html to your site.
That's -2.


You're still making an assumption that has no basis. Once a site goes HTTPS g moves to that and does not use the HTTP side any longer. It is still "one" bad link no matter how you count it.

If that wasn't the case then all the horror stories of trying to go back to HTTP after switching to HTTPS and losing ranking wouldn't be out there. G might REMEMBER old urls but it does not USE them once a site changes the underlying protocol. Nothing else makes sense.

smilie

7:19 pm on May 23, 2017 (gmt 0)



@tangor, could it possibly be two different databases?

Or maybe 10.

I bet main algo, Penguin, Panda, and maybe even their magic AI are separate databases.

I built my first DB back in, oh, maybe 1996? I'll tell ya, there's no way, no how, that these are 1) updated simultaneously, or even 2) all updated anyway.

(Some are kept old for historical reasons: you need to know how many links your pages had before, in case you just got 10K extra links, to penalize you.)

keyplyr

8:47 pm on May 23, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I bet main algo, Penguin, Panda, and maybe even their magic AI are separate databases.
These are algorithm updates.

HTTPS is a protocol, not a database. The browser negotiates a connection with the server using a protocol (a language of communication). Google indexes the protocol that was supported in that negotiation and returns it in the SERP in the form of a link, or path, to your site.

Once your site has switched to HTTPS, since the old path is 301 (Moved Permanently) it will no longer exist in Google's index. There are no "different databases."

Read how the big sites are all moving to HTTPS: [webmasterworld.com...]

lucy24

12:18 am on May 24, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



:: waiting for head to stop spinning ::

I think smilie's position is this:

If evilsite links to http://example.com, that's one count against you.

If, at some time in the future, example.com changes to https and redirects all http requests, then that original toxic link counts twice: once because all the existing information for http://example.com is transferred straight across to https://example.com, and once more because the toxic link, redirected, counts against the new https://example.com in its own right.

Uhm ... That's postulating a pretty dimwitted algorithm, isn't it? One that simultaneously knows and doesn't know that http://example.com and https://example.com are the same site. You can say a lot of things about G###, but they are not often accused of systemic stupidity.
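A toy sketch of the point lucy24 is making (purely illustrative; nobody outside Google knows the real normalization): if the link graph keys backlinks on a protocol-agnostic canonical form, the http and https variants of the same toxic link collapse into one entry instead of counting twice.

```python
from urllib.parse import urlsplit

def canonical_key(url):
    # Strip the scheme and a leading "www." so that http/https and
    # www/non-www variants of one page share a single key.
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return host + parts.path

backlinks = [
    "http://www.example.com/page1.html",
    "https://www.example.com/page1.html",  # same source, post-switch
]
print(len({canonical_key(u) for u in backlinks}))  # prints 1: counted once
```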

tangor

3:09 am on May 24, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Nailed. It. Precisely.

tangor

3:11 am on May 24, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Not for the faint of heart.

Get rid of all toxic links (and any others, too): when you switch to HTTPS, don't redirect anything, just make the HTTP side disappear (410).

Brand new start and no down checks anywhere!

keyplyr

3:21 am on May 24, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



when you switch to HTTPS don't redirect anything, just make the HTTP side disappear (410)
Bad advice... you'd lose traffic from all existing backlinks and a lot more.

Just follow what Google suggests and use the 301 Moved Permanently.

Again...

- Generic Steps to Switch from HTTP to HTTPS -


• Read all info at your host concerning certificates & switching to HTTPS and when applicable, follow those instructions.

• Install security certificate.

• Have your host enable HTTPS (if needed). This will enable access from both HTTP & HTTPS.

• Go through the site, page by page, & make sure all file paths are relative (no protocol). Test by accessing the site using HTTPS and look for any browser alerts.

• Install 301 code in .htaccess file
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
Note: your server may require a different code

• Go through the site again, page by page, and test. Any remote absolute links will need to be HTTPS, including those found in scripts & plugins. If you publish AdSense or other advertising, links in these scripts need to be HTTPS also (or just remove the protocol altogether).

• Update sitemap.xml (if applicable) and submit to the appropriate services (Google, Bing, Yandex, etc.)

• In Google Search Console, create a new property using HTTPS (do not use the Change of Address form). It will take a few days to start populating information. This is normal & traffic to the old (HTTP) property will drop off accordingly.

• Bing Webmaster Tools, Yandex & others should update automatically once they crawl your new pages. Updating/re-submitting sitemap.xml should speed up this process.
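Part of the checklist above can be automated. Here's a sketch (the function name and sample rows are made up; in practice you'd feed it the statuses and Location headers you collected with curl or a crawler) that flags any old HTTP URL not doing a clean 301 straight to HTTPS:

```python
# Flag http URLs that don't 301 straight to an https Location.
def audit(rows):
    problems = []
    for url, status, location in rows:
        if url.startswith("https://"):
            continue  # already secure, nothing to check
        if status != 301:
            problems.append((url, "expected 301, got %s" % status))
        elif not (location or "").startswith("https://"):
            problems.append((url, "redirects to non-https: %s" % location))
    return problems

rows = [
    ("http://example.com/", 301, "https://example.com/"),
    ("http://example.com/old.html", 302, "https://example.com/old.html"),
    ("http://example.com/img/", 301, "http://example.com/img/"),
]
for url, why in audit(rows):
    print(url, "->", why)
```

The first row passes; the other two get flagged (a 302 instead of a 301, and a redirect that stays on http).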

tangor

4:04 am on May 24, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



keyplyr, apparently humor and satire are not in your vocabulary.

Everything I've said is: go HTTPS, redirect, and make sure all internal links are pristine in that regard. Even Wikipedia agrees:

The 301 redirect is considered a best practice for upgrading users from HTTP to HTTPS.[1] RFC 2616 states that:

If a client has link-editing capabilities, it should update all references to the Request URL.
The response is cacheable.[2]
Unless the request method was HEAD, the entity should contain a small hypertext note with a hyperlink to the new URL(s).
If the 301 status code is received in response to a request of any type other than GET or HEAD, the client must ask the user before redirecting.

[en.wikipedia.org...]
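For reference, this is roughly what that 301 looks like on the wire, per the RFC excerpt above: a Location header plus the small hypertext note (the URL is illustrative):

```text
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/page1.html
Content-Type: text/html

<html><body>
This page has moved to
<a href="https://www.example.com/page1.html">https://www.example.com/page1.html</a>.
</body></html>
```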

The world is not black and white, except for links and goofy fears and desperate link juice and lions and tigers and bears, oh, my!

Changing protocol? Use the 301 redirect, get it done, and carry on with whatever baggage was on the HTTP site, too. That means any bad links (you won't get dinged twice, you just keep the old baggage) and all the other lovely stuff (good links are old baggage, too!)

robzilla

11:06 am on May 24, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



keyplyr, apparently humor and satire are not in your vocabulary.

Could it be the execution? You have a diverse audience here, and not everyone's going to catch your drift. You don't want people taking advice you meant in jest, so perhaps make it a little more obvious.

That's postulating a pretty dimwitted algorithm, isn't it?

Unfortunately, many people are stuck in a mindset where search engines still work like they did in the 90s.

not2easy

2:44 pm on May 24, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



We all make booboos. At least I do. When 3 popular pages tanked after the change to https, I suddenly remembered that those were the oldest pages on the site, which had originated as .html static pages. To carry my 301s through a hosting change in 2009, I'd stuck in PHP header() 301s, years ago. I was reminded by their continuing clicks on the old HTTP version in GSC. Within 48 hours it was remedied.

Test, test, test is not enough. Test again. Then test some more, or keep better notes. The point being: if things happen after the switch that shouldn't happen, look for the reason. It's not duplicated bad links.

smilie

8:29 pm on May 24, 2017 (gmt 0)



I am now trying to speak very slowly. Guys, try to keep up, ok?

@keyplyr and @lucy24.

>>HTTPS is a protocol not a database.
HTTPS is a protocol.

Google is a list of DATABASES. Google returns results from a LIST of databases.

>> That's postulating a pretty dimwitted algorithm, isn't it? One that simultaneously knows and doesn't know that http://example.com and https://example.com are the same site.

Google is a LIST of databases. Each is updated at different times and with DIFFERENT results.

You are going to have the MAIN database, which the MAIN algorithm runs against. That collects MAIN pages and links.

Then you are going to have the Panda database,
and the Penguin database,
and the AI database,
and whatever.

Some of these are going to be immediately updated when the Google crawler hits your new https site (or a spammer's new https site where a link to you is negatively weighted).

Some won't be IMMEDIATELY updated.

Some won't be EVER updated (for instance historical link counts and maybe KW density).

keyplyr

8:48 pm on May 24, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



No smilie, that's not how indexing works.

robzilla

10:44 pm on May 24, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google is a list of DATABASES

And the internet is a series of tubes.

smilie

1:32 pm on May 25, 2017 (gmt 0)



And how does "indexing" work?

Ok, let's define terms. I want you guys to write your own definitions here, how's that? Step up to the plate.

Which part of the SE group of processes, the ones creating a keyword-based index (database or flat file, which is also a basic db) AND a LIST (ok, how about a Group?) of supporting DATABASES, do you refer to when you say "indexing"?

Are you including just the URL in "indexing", or a full copy of the content, or a processed copy of the content, or a keyword index based on the URL, or the list of backlinks for the URL, or anything else, or all of the above? And if all of the above, where is this magic "all of the above" magically stored, then? Don't tell me flat files or "cache" or "Big Table" or Hadoop, because I'm sorry to tell you, that's a basic Database. And, in fact, more than one.

And just because there are several algos rearranging keys in the database based on keywords and 200 other factors, and just because it is split across 1000 machines, doesn't change the fact that it is a database.

jmccormac

2:49 pm on May 25, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google is not a "LIST" of databases. You sound like someone with limited experience, based on toy databases in Microsoft Access, trying to bluff about federated databases and Search. Most SEOs and webdevs have never built a search engine, never designed any search algorithms, and have a limited understanding of running search engines. This is a thread about HTTPS.

Regards...jmcc

smilie

2:54 pm on May 25, 2017 (gmt 0)



Edit.
@jmccormac, I do run a few. My live db has several tables with over a million rows, live on a website, gets hit 6 times per page (with the page returning in a quarter of a second, live), and has been running like that since the early 2000s. It's SQL Server. I don't let too much pile up in it; I truncate data.

I also was on a team developing a few data warehouses back in the day, one for a company you'd know if I named it. Not billions of rows, but a good one. And I watch my wife work with Hadoop daily, with billions upon billions of rows (the largest DB in her industry, which is huuuuge). So I have a slight idea. Just because there's new terminology every 5 years, and there are people "in the know" who learned it, doesn't change the underlying hardware and the basic principles of software development on top of it.

A DB key is still a key, and if you don't hit this type of DB via a key you are never going to get results; that's the essence of the difference.

This is a good representation of how Google works. What is this yellowish thingy that's drawn just like all the other databases?
[media.licdn.com...]
"no sql key value store" = DB. << Maybe it's news to some that a "key value store" is a database?

But instead of discussing my qualifications, let's discuss either yours, or the topic of HTTPS. Actually, I dislike how the topic is named, because I did not originally want to discuss the https protocol, which is what you guys are trying to swing the discussion toward, but rather the fact that ****https links are going to be different from http links and will, for a time, create another negative SEO pressure on a lot of websites****.

<< This was the original topic of discussion. Not the https protocol. Instead, there are ad hominem and strawman attacks and questioning of "my" qualifications, instead of sticking to the topic.
This 87 message thread spans 3 pages.