
Forum Moderators: Robert Charlton & goodroi


https to http

     
9:48 am on May 27, 2016 (gmt 0)

New User

5+ Year Member

joined:Nov 6, 2013
posts:29
votes: 0


I asked my client to get an SSL certificate, and so he did. But he has 301 redirected https://www.example.com to http://www.example.com.

1. Could this redirect have any negative SEO impact, considering that Google prefers HTTPS?

2. Should I ask him to reverse the redirect so it points from http to https instead? Note: the 301 redirect was implemented a couple of weeks ago.

3. I added and verified the https version in Webmaster Tools, and it is not showing any data. Can I add the other version of the website without re-verifying the property?

[edited by: Andy_Langton at 9:52 am (utc) on May 27, 2016]
[edit reason] Please use example.com [/edit]

10:25 am on May 27, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


1. Google may prefer HTTPS over HTTP, but the impact on rankings is probably quite small at this time.
2. Mainly, you'll want to avoid going back and forth between the two. Pick one and stick to it, and HTTPS makes most sense for the future.
3. I believe so, yes. It's domain verification, so a change of protocol shouldn't require re-verification.
10:59 am on May 27, 2016 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:May 27, 2016
posts:68
votes: 22


https:// does jack sh1t for ranking. Confirmed
12:32 pm on May 27, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


That's opining, not confirming. I see no reason to doubt that it can make a difference in edge cases, and perhaps more so in certain categories of sites. But ultimately you should switch to HTTPS to secure your users' data, not for search engine rankings.
12:46 pm on May 27, 2016 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:May 27, 2016
posts: 68
votes: 22


https on static pages? WTF... oh well. It's not an opinion, it's fact. Opining? WTF.
2:09 pm on May 27, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


Facts are supported by evidence. Maybe if you threw in a couple more WTFs I'd be convinced.

Static pages can have forms, they can benefit from HTTP/2, and visitors can benefit from privacy. Pretty good reasons.

Back on topic, moving to HTTPS shouldn't impact your rankings negatively, so long as the technical implementation is correct.
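
If you want to sanity-check that implementation, a minimal sketch like this can confirm each HTTP URL answers with a single 301 straight to its HTTPS twin. This assumes TypeScript on Node 18+, whose fetch (unlike a browser's) exposes the redirect response when redirect is set to "manual"; the hostnames are placeholders for your own:

    // Check that HTTP URLs 301 directly to their HTTPS equivalents.
    // Hostnames are placeholders; substitute your own site.
    const urls = ["http://www.example.com/", "http://example.com/"];

    for (const url of urls) {
      // redirect: "manual" stops fetch from following the redirect,
      // so the first response can be inspected directly.
      const res = await fetch(url, { redirect: "manual" });
      console.log(url, "->", res.status, res.headers.get("location"));
      // Ideally one 301 per URL, pointing straight at HTTPS; chained
      // redirects (http -> https -> www) cost an extra round trip each.
    }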
6:11 pm on May 30, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 13, 2004
posts:833
votes: 12


https can almost double page load time. This is true even for Google's homepage. So maybe you get a boost in rankings for https but get dinged for a slower page load! Please actually measure your page load time with and without https. Don't listen to those who say there is only a small degradation. It may be fixed in the future, but for now it's slow.
Google is really slowing the web down by needlessly encouraging https. This change to https actually benefits Google the most, more so than its users.

For an analysis look at this thread:
[webmasterworld.com...]
Look at the results for Google's own home page!
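
For anyone who wants to run that measurement themselves, here is a rough sketch (TypeScript on Node; the hostname is a stand-in for your own site) that splits one request into DNS, TCP, and TLS phases, so you can see exactly what the handshake costs:

    import https from "node:https";

    // Time the phases of a single HTTPS request. Run it twice:
    // a "cold" first hit and a "warm" repeat behave very differently.
    const start = process.hrtime.bigint();
    const mark = (label: string) =>
      console.log(label, Number(process.hrtime.bigint() - start) / 1e6, "ms");

    const req = https.get("https://www.example.com/", (res) => {
      res.on("data", () => {});           // drain the body
      res.on("end", () => mark("response complete"));
    });

    req.on("socket", (socket) => {
      socket.on("lookup", () => mark("DNS lookup"));
      socket.on("connect", () => mark("TCP connect"));
      socket.on("secureConnect", () => mark("TLS handshake done"));
    });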
7:44 pm on May 30, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 13, 2004
posts:833
votes: 12


Corrected link to last post:
[webmasterworld.com...]
I hope!
8:26 pm on May 30, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


It really depends on your implementation; if it's doubling your page load times, something isn't properly configured. TLS can be optimized (session tickets, OCSP stapling, etc.) and is only going to get faster. Yes, there's the added handshake (for now), but things like HTTP/2 multiplexing can really speed up page load times (except for pages with very few resources). But absolutely: measure the difference. Then optimize and measure again. You'll need to drop some of the old tricks like domain sharding, of course.
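
To see whether session resumption is actually working on a given server, something like the following sketch (TypeScript using Node's tls module; the hostname is a placeholder) connects twice and reports whether the second handshake was resumed:

    import tls from "node:tls";

    // Connect twice; the second connection presents the session
    // ticket from the first and should resume if tickets are enabled.
    const host = "www.example.com";

    function handshake(session?: Buffer): Promise<Buffer> {
      return new Promise((resolve) => {
        const socket = tls.connect({ host, port: 443, servername: host, session });
        // TLS 1.3 delivers tickets after the handshake completes,
        // so wait for the 'session' event instead of calling getSession() early.
        socket.once("session", (ticket) => {
          console.log("resumed:", socket.isSessionReused());
          socket.end();
          resolve(ticket);
        });
      });
    }

    const ticket = await handshake();  // full handshake: resumed false
    await handshake(ticket);           // should log resumed: true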

I think encryption benefits users most of all; I don't see it primarily benefiting Google, as you suggest.
8:31 pm on May 30, 2016 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12399
votes: 409


Yes, HTTP/2 should be the big motivator here.
8:34 pm on May 30, 2016 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15928
votes: 884


if it's doubling your page load times, something isn't properly configured

Isn't this another of those things like third-party fonts-- or even, to a lesser degree, external stylesheets-- where it varies enormously between the first page and subsequent pages? If the user has never been there before, the browser has to receive and validate the certificate before it can do anything. After that first visit, things should fire right up.
1:54 pm on June 3, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 13, 2004
posts:833
votes: 12


if it's doubling your page load times, something isn't properly configured
Optimization can be done, but it's not getting done! Also, it turns out that gzipping encrypted pages makes the encryption easier to break! So the solution: turn off gzip. I think it was an IEEE article about this.

If you take a look at the link I provided in my previous post, encryption is increasing the page load time of Google's home page, www.google.com (2.9 sec versus 1.4); surely Google has optimized servers. Boosts in encryption performance are basically still in the future.
And sorry, apparently links embedded in quotes don't work at WW.
[webmasterworld.com...]
The links below are navigable. Look at the "SSL negotiation" violet bands.

The first link: https://www.google.com (encrypted)
[webpagetest.org...]

The second link: a comparison between the above and http://www.google.com/?nord=1
For this comparison, use the opacity slider provided to compare the differences.
[webpagetest.org...]
Please note that the unencrypted test still suffers the penalty of an encrypted transaction to apis.google.com. Google's home page would have loaded in 1.8 seconds without SSL (HTTPS). With SSL, Google's home page loads in 2.9 seconds.
Of course Google benefits tremendously from web and browser caching which hides some of these inefficiencies.

The third link, to be complete, is the results with as little encryption as possible, similar to the results used in the comparison above.
[webpagetest.org...]
This shows a result that would have completed in 1.4 seconds without any encryption.

The most rudimentary test:
Two links, http versus https, from Google for a PNG file of size 21.4K: the Google logo.
[webpagetest.org...] HTTPS
Notice "Bytes In" at 30K, but in the Request Details 21.6K
[webpagetest.org...] HTTP
Notice "Bytes In" at 22K, but in the Request Details the size remains 21.6K

The final test link:
[webpagetest.org...]
Shows what happens to users who type the shortest link possible to google, google.com
A 301 redirect to www.google.com, and then a 302 redirect to https://www.google.com. I must admit I was surprised by that one!
Actually, a simple solution (for many) is to provide both https AND http for all pages, and let the visitor decide how slow they want their web experience to be. Several years back I used link rel=canonical pointing from https to http for all pages that don't need encryption. Without this, Google would index all pages twice; hopefully they've fixed that by now.

P.S. For non-encrypted Google home page, save some time and use:
www.google.com/?nord=1
[google.com...]

Finally Google jumped on encryption because it prevents all intermediaries (routers/traceroute) from parsing the keywords of the search requests. And then of course they stopped passing keywords on to webmasters as well; webmasters, the providers of the answers to the questions being asked!
2:06 pm on June 3, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 13, 2004
posts:833
votes: 12


After that first visit, things should fire right up.

Don't they say the first impression counts the most?

For an information site that fully answers the visitor's question there will be only one pageview, even if it's twenty minutes long. And oh, by the way, Google Analytics calls this a 100% bounce rate! Will they ever fix this?
3:34 pm on June 3, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


Here are my load times for Google.com over HTTPS and HTTP (your link):

HTTP: 251ms (101KB, 5 requests)
HTTPS: 243ms (100KB, 5 requests)

The HTTPS version is served over QUIC, though. The other via TCP. That's a complex optimization that's unavailable to us, unfortunately, but it works quite well obviously. The "problem" with Google.com is that it's a very simple webpage with few HTTP requests, so the initial TLS handshake has a relatively large impact. So if your point is that HTTPS is slower on the first request because there's the extra handshake, well... that's a given, and there's no way around that just yet. The more important part is what happens after that handshake is completed, and lots of optimizations are already available for that, like session tickets, for example.

Google Analytics calls this a 100% bounce rate! Will they ever fix this?

Everybody calls it a bounce rate, because that's what it is. It's just a statistic, and time on site is another.

then of course they stopped passing keywords on to webmasters as well

If that was the goal, they wouldn't provide us with extensive keyword data in the Search Console. They didn't have to do that. Losing keywords (and anything else) passed via GET parameters was just a byproduct of the switch to HTTPS. The benefits for users far outweigh the disadvantage to webmasters.

Also it seems to turn out the GZIPPing encrypted pages makes the encryption easier to break

This is referred to as the BREACH attack. Certainly something to think about when you're handling sensitive data, but for most sites the benefits of compressed content will outweigh that risk.
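
If you're unsure whether your pages are even being compressed over TLS, a one-off check is trivial. A sketch (TypeScript, Node 18+ fetch, which advertises gzip by default; the URL is a placeholder):

    // Is the server compressing responses over HTTPS?
    const res = await fetch("https://www.example.com/");
    console.log("content-encoding:", res.headers.get("content-encoding"));
    // BREACH only bites where a response reflects attacker-influenced
    // input next to a secret (e.g. a CSRF token); for those specific
    // pages consider disabling compression or masking the token,
    // rather than turning gzip off site-wide.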
5:08 pm on June 3, 2016 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15928
votes: 884


For an information site that fully answers the visitor's question there will be only one pageview, even if it's twenty minutes long. And oh, by the way, Google Analytics calls this a 100% bounce rate!

I was thinking from the other side: Since the Googlebot has obviously been to your site many times, it may not give an accurate sense of how long it takes for first-time visitors. (Analogously, do they even realize how much time is added when a site uses GA, or a font or script that lives at Google? I know some sites where the browser's "loading up" icon never stops spinning, though fortunately it doesn't affect page display.)

<topic drift>
If that was the goal, they wouldn't provide us with extensive keyword data in the Search Console. They didn't have to do that.

Apples and oranges. It's one thing to look at raw logs-- or your own analytics-- and see which specific visits are attributable to which specific query, which in turn tells you which of those visitors went on to other pages. It's a far different thing to look in a search engine's Webmaster Tools-- a third-party venue that you have to join explicitly-- and see aggregated percentages. OK, so 1/3 of query A results in page clicks while 2/3 of query B does. But are those resultant page clicks all bounces (for better or for worse), or the beginning of lengthy, multi-page visits?
</td>

:: idly wondering, not for the first time, what "d" stands for, and why they couldn't say <tc> ::
7:33 pm on June 3, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 13, 2004
posts:833
votes: 12


robz

Here are my load times .....
In the other thread I've linked to, I spent a good bit of time using a "3rd party" test facility (webpagetest.org) for all my evaluations. These are not MY results. Obviously google.com changes from day to day. You appear to be reporting the number of bytes transferred after decryption. Your speeds are awesome, far from typical of the majority of internet users. But I have to ask, did you clear your cache? Please try an independent, respected, 3rd party, performance measuring authority.

Everybody calls it a bounce rate, because that's what it is. It's just a statistic, and time on site is another.

Regarding Bounce Rate, see Google's definition:
[support.google.com...]
Google's definition
Bounce Rate is the percentage of single-page sessions (i.e. sessions in which the person left your site from the entrance page without interacting with the page).
Note the word "interacting"; very important!
Google's specification does not match their implementation. I've fixed this by adding a scrolling monitor event to my Analytics code, and I can guarantee you users ARE interacting with pages that Analytics would have reported, incorrectly, as bounces by Google's own definition! And as a result, Analytics "time on page" is grossly inaccurate as well! If anyone is using these numbers for SEO, they are being misled, especially those with information sites that answer a visitor's question, COMPLETELY, with one pageview!

Losing keywords (and anything else) passed via GET parameters was just a byproduct of the switch to HTTPS. The benefits for users far outweigh the disadvantage to webmasters.
Actually, I wasn't precise: Google stopped reporting keywords to webmasters in their logs long before they implemented https. The change to https just stopped any competitors from recording the keywords in requests to Google. And again, the requests to Google, the "questions" asked of Google, are actually answered by webmasters like you and me. Surely you appreciate this. Google does not answer questions.

This is referred to as the BREACH attack. Certainly something to think about when you're handling sensitive data, but for most sites the benefits of compressed content will outweigh that risk.
Good research (2012), and I agree, but I also think you are conflating compressed data with encrypted data. Compression is great; encryption is not needed in so many cases and degrades the performance of the web for the world. Compressing data before encrypting it endangers the encryption. So again: needless encryption will degrade web performance, and compression over encryption will degrade web security. Don't use it if you think it's not needed.

If that was the goal, they wouldn't provide us with extensive keyword data in the Search Console.
The keyword data reported in the Search Console has nothing to do with visitor search requests; it is simply Google's evaluation of a website's keywords.
Quoting Google:
"To protect user privacy, Search Analytics doesn't show all data. For example, we might not track some queries that are made a very small number of times or those that contain personal or sensitive information."
For many websites this number "(not set)" approaches 60% of keyword data, a simple example being the word "doctor". Search analytics only shows some keywords.

The more important part is what happens after that handshake is completed, and lots of optimizations are already available for that, like session tickets, for example.
Sorry, the most important part is the first impression.

The web just keeps getting slower and slower for me (DSL), and many others, and unnecessary https just makes it worse.
8:27 pm on June 3, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


Apples and oranges

I hope I never suggested they were one and the same fruit. No, bumpski seemed to be suggesting Google had some sort of ulterior motive in taking away "our" keyword data, when that was just a byproduct -- collateral damage, if you like -- of their move to HTTPS-by-default for the benefit of their users. In fact, they went out of their way to give us back at least some of that data. Again, they didn't have to do that; we have no right to the data.

Since the Googlebot has obviously been to your site many times, it may not give an accurate sense of how long it takes for first-time visitors

Googlebot doesn't load pages in full, as far as I'm aware. It still crawls URLs individually. It can only measure the response times of the server for the requests it makes, which is what you see in the Search Console. I don't think it "benefits" from previous requests the way users do. Load time measurements probably come from real users around the world.
11:43 pm on June 3, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


But I have to ask, did you clear your cache? Please try an independent, respected, 3rd party, performance measuring authority.

I used Incognito Mode in Chrome and disabled the browser cache. I love Webpagetest, even used to host one of their test locations, but find that the results are not always accurate or representative of real user experiences, so I prefer to gauge performance based on Real User Monitoring (RUM), i.e. data provided by users' browsers.

Note the word "interacting"; very important!
Google's specification does not match their implementation. I've fixed this by adding a scrolling monitor event to my Analytics code; and I can guarantee you users ARE interacting with pages that Analytics would have reported, incorrectly, as bounces by Google's own definition!

It says "interacting with the page". Scrolling is an interaction with the web browser, not with the web page. You haven't "fixed" the Bounce Rate statistic, you've redefined it and adjusted the data accordingly. The "percentage of single-page sessions" is the traditional definition; the only interaction with the page that can be measured without fault is when another page with tracking is visited, so that's the default behavior. If your site works differently (or you disagree with the definition), they give you the tools to track events and indicate whether or not they represent an important engagement signal. What's not to like?

Google stopped reporting keywords to webmasters in their logs long before they implemented https

It's always had to do with HTTPS. They phased it in by first enabling HTTPS for all logged-in users, which meant we lost some keyword data, but it only affected a small percentage of searches, until they enabled it for everyone.

The keyword data reported in the Search Console has nothing to do with visitor search requests, this is simply Google's evaluation of a website's keywords.

No, it's real-world data (unless you're looking at the Content Keywords report), but some potentially sensitive searches are filtered. I've never seen "(not set)" in the Search Analytics reports, only in the Search Queries report in Google Analytics.

Sorry the most important part is the first impression.

Indeed, and I can tell you most of my pages load faster over HTTPS thanks to HTTP/2 (and SPDY previously). There's the extra TLS handshake at the start I was referring to, but then all of the page's resources are downloaded simultaneously on a single connection, rather than 1-8 resources at a time spread across multiple connections with TCP overhead. And subsequent requests are even faster thanks to session tickets and other optimizations that minimize the overhead of HTTPS while retaining all the advantages of HTTP/2.
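
Whether a given server actually negotiates HTTP/2 is easy to verify with an ALPN probe. A sketch (TypeScript using Node's tls module; the hostname is a stand-in):

    import tls from "node:tls";

    // Offer h2 and http/1.1 via ALPN and print what the server picks.
    const host = "www.example.com";
    const socket = tls.connect(
      { host, port: 443, servername: host, ALPNProtocols: ["h2", "http/1.1"] },
      () => {
        console.log("negotiated:", socket.alpnProtocol);  // "h2" means HTTP/2
        socket.end();
      }
    );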
8:42 pm on July 19, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 13, 2004
posts:833
votes: 12


It says "interacting with the page". Scrolling is an interaction with the web browser, not with the web page.
Sorry, this is just nonsense!
11:55 pm on July 19, 2016 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:June 26, 2004
posts:379
votes: 33


We just went all-https this month. It may not really matter for security, but some customers love seeing that padlock / security symbol.
7:24 am on July 20, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


Sorry, this is just nonsense!

How so? What part of your page am I interacting with when I use the scrollbar?
7:40 pm on Aug 1, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 13, 2004
posts:833
votes: 12


It's called reading!
8:12 pm on Aug 1, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


Scrolling != Reading. Yes, some users who scroll will be reading your content. Others will scroll idly or use it to judge your page before leaving. Bounce rate is by definition a flawed statistic, but excluding everyone who scrolls from your bounce rate makes it even less reliable.

If you don't believe me, try a user session recording service.
2:15 pm on Aug 4, 2016 (gmt 0)

New User from GB 

Top Contributors Of The Month

joined:Aug 4, 2016
posts: 6
votes: 0


Whilst on the topic of http to https, can anyone shed light on the correct process involved with Webmaster Tools / Search Console? Do I create a new property in Search Console? Do I delete the old http Search Console property? If I keep the old one, what do I do with the sitemaps section - do I upload the https sitemaps?
2:24 pm on Aug 4, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


Do i create a new property in search console? Do i delete the old http search console property?

That's what I always do. If you redirect from HTTP to HTTPS, your HTTP backlinks will also show (as redirects) in the backlinks report for HTTPS. If you don't redirect for some reason, then the old property might be worth hanging on to.
8:25 pm on Aug 14, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 13, 2004
posts:833
votes: 12


Scrolling != Reading. Yes, some users who scroll will be reading your content. Others will scroll idly or use it to judge your page before leaving. Bounce rate is by definition a flawed statistic, but excluding everyone who scrolls from your bounce rate makes it even less reliable.

If you don't believe me, try a user session recording service.
Rob

You have no idea how poor a job Google Analytics does reporting "time on page" with its default code. I've documented it in the Analytics thread (and received a few thank-yous, directly, once people TAKE the TIME to experiment!). One thousand visitors can visit the same page on a site, each spending 5, 10, 15 minutes on that single page, and the time on page will be reported as ZERO until the webmaster gets lucky and some soul (#1001) wanders to a second page! That is how misleading Google Analytics is to information sites that provide the highest quality pages, pages that answer the user's question completely with a single page visit! The typical webmaster is making poor decisions based on these terrible statistics. And yes, Analytics is Google's premier website analysis tool.
AND https as used by the typical webmaster is very slow; even Google can't get it right.
Now I'm sending messages to jeff@amazon.com, complaining how slow their site has become and how difficult it is to shop there, especially now that they have migrated to https. So far, placation in response from Jeff's minions.

SCROLLING through a page is certainly interacting with it. Come ON!

Here are my load times for Google.com over HTTPS and HTTP (your link):
HTTP: 251ms (101KB, 5 requests)
HTTPS: 243ms (100KB, 5 requests)
By the way, using webpagetest.org set to fiber mode, the only way the times you have reported seem reasonable is if you are only reporting the time to load the source of the page, NOT the entire page. AND BY THE WAY, this test can no longer be done because Google has eliminated access to http altogether. If you truly got these times for Google's home page loading, you must have had fiber between their cheeks!
Please stop encouraging the average webmaster to convert to https exclusively; you're slowing the web! Support both and give your visitor a voice!