Forum Moderators: Robert Charlton & goodroi
[edited by: Andy_Langton at 9:52 am (utc) on May 27, 2016]
[edit reason] Please use example.com [/edit]
if it's doubling your page load times, something isn't properly configured
Optimization can be done, but it's not getting done! Also, it turns out that gzipping encrypted pages makes the encryption easier to break, so one solution is to turn off gzip. I think it was an IEEE article about this.
Actually, a simple solution (for many) is to provide both HTTPS and HTTP for all pages, and let visitors decide how slow they want their web experience to be. Several years back I used a rel=canonical pointing from the https pages to the http versions on all pages that don't need encryption. Without this, Google would index all pages twice, which hopefully they've fixed by now.

The links below are navigable. Look to the "SSL negotiation" violet bands.
The first link, https://www.google.com, encrypted:
[webpagetest.org...]
The second link, a comparison between the above and http://www.google.com/?nord=1
For this comparison, use the provided Opacity Slider to see the differences.
[webpagetest.org...]
Please note that the unencrypted test still suffers the penalty of an encrypted transaction to apis.google.com. Google's home page would have loaded in 1.8 seconds without SSL (HTTPS); with SSL, Google's home page loads in 2.9 seconds.
Of course, Google benefits tremendously from web and browser caching, which hides some of these inefficiencies.
The third link, to be complete, shows the results with as little encryption as possible, similar to the results used in the comparison above.
[webpagetest.org...]
This shows a result that would have completed in 1.4 seconds without any encryption.
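Taken together, the three measurements above imply a substantial first-visit penalty. As rough arithmetic, using only the figures quoted in this thread:

```python
full_https = 2.9          # seconds, fully encrypted load
no_ssl_main = 1.8         # main document unencrypted
minimal_encryption = 1.4  # as little encryption as possible

penalty_vs_no_ssl = (full_https - no_ssl_main) / no_ssl_main
penalty_vs_minimal = (full_https - minimal_encryption) / minimal_encryption

print(f"vs. unencrypted main document: +{penalty_vs_no_ssl:.0%}")   # +61%
print(f"vs. minimal encryption: +{penalty_vs_minimal:.0%}")         # +107%
```

These are one-day snapshots from webpagetest.org, not stable figures; google.com changes from day to day.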
The most rudimentary test:
Two links, HTTP versus HTTPS, from Google for a PNG file of size 21.4K: the Google logo.
[webpagetest.org...] HTTPS
Notice "Bytes In" at 30K, but in the Request Details the size is 21.6K
[webpagetest.org...] HTTP
Notice "Bytes In" at 22K, but in the Request Details the size remains 21.6K
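The gap between "Bytes In" and the payload size is largely TLS handshake traffic (certificates, key exchange) plus record overhead. A quick sanity check on the numbers above:

```python
payload_k = 21.6        # size shown in the Request Details
https_bytes_in = 30.0   # "Bytes In" reported for the HTTPS fetch
http_bytes_in = 22.0    # "Bytes In" reported for the HTTP fetch

https_overhead = (https_bytes_in - payload_k) / payload_k
http_overhead = (http_bytes_in - payload_k) / payload_k

print(f"HTTPS wire overhead: {https_overhead:.0%}")  # roughly 39%
print(f"HTTP wire overhead: {http_overhead:.0%}")    # roughly 2%
```

The percentage shrinks on larger files, since the handshake is a fixed cost; a 21.4K logo shows it at close to its worst.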
The final test link:
[webpagetest.org...]
Shows what happens to users who type the shortest link possible to google, google.com
A 301 redirect to www.google.com, and then a 302 redirect to https://www.google.com. I must admit I was surprised by that one!
After that first visit, things should fire right up.
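The redirect chain observed above can be sketched offline. This is a hypothetical lookup-table model, not a live fetch; the URLs and status codes are the ones the test reported:

```python
# Each entry maps a URL to the (status, location) response it returns.
redirects = {
    "http://google.com/": (301, "http://www.google.com/"),
    "http://www.google.com/": (302, "https://www.google.com/"),
}

def follow(url, max_hops=10):
    """Return the list of URLs visited, final destination last."""
    chain = [url]
    for _ in range(max_hops):
        hop = redirects.get(url)
        if hop is None:
            break
        status, url = hop
        chain.append(url)
    return chain

print(follow("http://google.com/"))
# Each extra hop costs a full round trip (plus a TLS handshake for the
# final HTTPS hop) before the first byte of content arrives.
```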
Google Analytics calls this a 100% bounce rate! Will they ever fix this?
Then, of course, they stopped passing keywords on to webmasters as well.
For an information site that fully answers the visitor's question there will be only one pageview, even if it's twenty minutes long. And oh, by the way, Google Analytics calls this a 100% bounce rate!
If that was the goal, they wouldn't provide us with extensive keyword data in the Search Console. They didn't have to do that.
Here are my load times .....

In the other thread I've linked to, I spent a good bit of time using a "3rd party" test facility (webpagetest.org) for all my evaluations. These are not MY results; obviously google.com changes from day to day. You appear to be reporting the number of bytes transferred after decryption. Your speeds are awesome, but far from typical for the majority of internet users. But I have to ask: did you clear your cache? Please try an independent, respected, 3rd-party performance-measuring authority.
Everybody calls it a bounce rate, because that's what it is. It's just a statistic, and time on site is another.
Bounce Rate is the percentage of single-page sessions (i.e. sessions in which the person left your site from the entrance page without interacting with the page).

Note the word "interacting"; very important!
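By that definition, bounce rate is simple to compute. A minimal sketch with hypothetical session data (one pageview count per session, no interaction events modeled):

```python
# Pageviews per session -- invented numbers for illustration only.
sessions = [1, 3, 1, 2, 1, 1]

bounces = sum(1 for pageviews in sessions if pageviews == 1)
bounce_rate = bounces / len(sessions)

print(f"bounce rate: {bounce_rate:.0%}")  # 4 of 6 sessions -> 67%
```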
Losing keywords (and anything else) passed via GET parameters was just a byproduct of the switch to HTTPS. The benefits for users far outweigh the disadvantage to webmasters.

Actually, I wasn't precise: Google stopped reporting keywords to webmasters in their logs long before they implemented HTTPS. The change to HTTPS just stopped any competitors from recording keywords in requests to Google. And again, the requests to Google, the "questions" to Google, are actually answered by webmasters like you and me. Surely you appreciate this; Google does not answer questions.
This is referred to as the BREACH attack. Certainly something to think about when you're handling sensitive data, but for most sites the benefits of compressed content will outweigh that risk.

Good research (2012), and I agree, but I also think you are confusing compressed data with encrypted data. Compression is great; encryption is not needed in so many cases and degrades the performance of the web for the world. Compressed, encrypted data endangers the encryption. So again: needless encryption degrades web performance, and compression inside encryption degrades web security. Don't use it if you think it's not needed.
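The side channel behind this class of attack is easy to demonstrate with nothing but zlib: when attacker-controlled input shares bytes with a secret in the same compressed body, the output gets shorter. This is a toy illustration of the principle, not the actual BREACH procedure; the page body and secret are invented:

```python
import zlib

secret = b"token=s3cr3t"  # hypothetical secret reflected into every response

def response_length(attacker_input: bytes) -> int:
    # Hypothetical page body: the secret plus attacker-controlled input,
    # compressed together the way HTTP compression runs before encryption.
    body = b"<html>" + secret + b" q=" + attacker_input + b"</html>"
    return len(zlib.compress(body))

matching = response_length(b"token=s3")      # guess shares a prefix with the secret
non_matching = response_length(b"token=XY")  # same length, wrong bytes after "token="

# The matching guess compresses better, so its (encrypted) response is shorter
# on the wire; repeating this byte by byte recovers the secret, because
# encryption hides content but not length.
print(matching, non_matching)
```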
If that was the goal, they wouldn't provide us with extensive keyword data in the Search Console.

The keyword data reported in the Search Console has nothing to do with visitor search requests; it is simply Google's evaluation of a website's keywords.
"To protect user privacy, Search Analytics doesn't show all data. For example, we might not track some queries that are made a very small number of times or those that contain personal or sensitive information."

For many websites this "(not set)" number approaches 60% of keyword data, a simple example being the word "doctor". Search Analytics only shows some keywords.
The more important part is what happens after that handshake is completed, and lots of optimizations are already available for that, like session tickets, for example.

Sorry, the most important part is the first impression.
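As a back-of-the-envelope model of why the handshake dominates that first impression: these round-trip counts assume TLS 1.2 (full handshake adds two round trips, an abbreviated handshake via session tickets adds one); TLS 1.3 cuts the full handshake to a single round trip. The numbers are a sketch, not measurements:

```python
def time_to_first_request(rtt_ms, tls=True, resumed=False):
    """Estimated delay before the first HTTP request can be sent."""
    round_trips = 1                      # TCP three-way handshake
    if tls:
        round_trips += 1 if resumed else 2   # TLS 1.2 full vs. abbreviated
    return round_trips * rtt_ms

rtt = 80  # hypothetical round-trip time in milliseconds
print(time_to_first_request(rtt, tls=False))               # 80  (plain HTTP)
print(time_to_first_request(rtt))                          # 240 (first HTTPS visit)
print(time_to_first_request(rtt, resumed=True))            # 160 (session ticket)
```

Session tickets only help returning visitors; the first visit always pays the full price.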
Apples and oranges
Since the Googlebot has obviously been to your site many times, it may not give an accurate sense of how long it takes for first-time visitors.
Note the word "interacting"; very important!
Google's specification does not match their implementation. I've fixed this by adding a scrolling-monitor event to my Analytics code, and I can guarantee you users ARE interacting with pages that Analytics would have reported, incorrectly, as bounces by Google's own definition!
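The effect of counting scrolls as interactions can be sketched like this (the session data is invented for illustration; "interacted" stands in for a scroll-monitor event like the one described above):

```python
# (pageviews, interacted) per session -- hypothetical numbers.
sessions = [
    (1, True),   # single page, but the visitor scrolled
    (1, False),
    (3, False),
    (1, True),
]

def bounce_rate(sessions, count_scrolls_as_interaction):
    """Fraction of sessions reported as bounces."""
    bounces = sum(
        1 for pageviews, interacted in sessions
        if pageviews == 1 and not (count_scrolls_as_interaction and interacted)
    )
    return bounces / len(sessions)

print(f"{bounce_rate(sessions, False):.0%}")  # 75% - stock Analytics view
print(f"{bounce_rate(sessions, True):.0%}")   # 25% - scrolls count as interaction
```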
Do I create a new property in Search Console? Do I delete the old HTTP Search Console property?
Scrolling != Reading. Yes, some users who scroll will be reading your content. Others will scroll idly or use it to judge your page before leaving. Bounce rate is by definition a flawed statistic, but excluding everyone who scrolls from your bounce rate makes it even less reliable.

Rob
If you don't believe me, try a user session recording service.
Here are my load times for Google.com over HTTPS and HTTP (your link):
HTTP: 251ms (101KB, 5 requests)
HTTPS: 243ms (100KB, 5 requests)

By the way, using webpagetest.org set to fiber mode, the only way the times you have reported seem reasonable is if you are only reporting the time to load the source of the page, NOT the entire page. And by the way, this test can no longer be done because Google has eliminated access to HTTP altogether. If you truly got these times for Google's home page loading, you must have had fiber between their cheeks!