Forum Moderators: Robert Charlton & goodroi


The obsession with Page Speed

         

GoogSay

9:38 pm on Oct 11, 2020 (gmt 0)

5+ Year Member



I thought I would write this as a warning to other webmasters. Like many of you, I have used page speed tools like PageSpeed Insights, Lighthouse and Pingdom and, horrified by the scores, decided something must be done. So I installed plugins that cached, deferred JS, moved CSS to the footer, combined files, minified and gzipped files. My score went up significantly. However, I started noticing 5xx errors in GSC, and my site became horribly unresponsive for uncached requests. I felt like giving up: I needed plugins to get my page speed passing, but the same plugins were causing the site to fail. I was in a catch-22. I need speed to rank on Google, and the same plugins caused the site to fail in GSC.

So I looked into how important a factor it actually is. To my surprise, it turns out not to be a big deal for ranking. Gary Illyes from Google actually tweeted about it, saying "Ranking wise it's a teeny tiny factor, very similar to https ranking boost." Additionally, I learned from Martin Splitt's (Google) video on page speed that Google only determines whether a site is very slow or not. So I stripped out all the plugins and went back to WP at its core. No more errors, and a reasonable load time for all queries.

Yes, I would ideally like the page faster, but it's a new site on shared hosting with few visitors at present. As the site hopefully gets popular, it could fund a dedicated server and bespoke speed optimization, but first I need to learn to walk.
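The symptom described above (5xx errors only on uncached requests) is easy to probe yourself: hit the site with a handful of cache-busting URLs and tally the status codes. A minimal standard-library sketch; the URL and the `nocache` parameter name are placeholders, not anything from the original post:

```python
# Sketch: reproduce uncached 5xx errors with a small burst of
# cache-busting requests. Standard library only.
from collections import Counter
import urllib.error
import urllib.request

def burst_status_codes(url, n=10):
    """Fire n GETs with unique query strings and tally the status codes.

    A unique ?nocache= value skips most page caches, so every request
    exercises the slow, uncached path where 5xx errors tend to show up.
    """
    counts = Counter()
    for i in range(n):
        busted = f"{url}?nocache={i}"
        try:
            with urllib.request.urlopen(busted, timeout=10) as resp:
                counts[resp.status] += 1
        except urllib.error.HTTPError as err:
            counts[err.code] += 1  # 4xx/5xx responses land here
    return counts

# Example (uncomment with a real URL):
# print(burst_status_codes("https://example.com/", n=20))
```

If the tally shows 500s or 503s mixed in with 200s, the caching layer is masking a backend that can't keep up, which matches the behavior the OP saw.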

iamlost

2:09 am on Oct 24, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That’s a neat sidestep from power generation to optimised sites...

However, what is already happening, and has been happening for over a decade, is that data centres and web hosts have been going green, aka increasingly switching to renewable energy sources, with utility mains increasingly seen as backup rather than primary.
Note: the article mentions this in passing at the end, as it is not its raison d'être.

Of course, the driver for this change is less an environmental focus than a financial one: approximately half the operating cost of a data centre is electricity. ROI is the rationale; green is marketing at no additional cost.

Nice if contrived hook.

Have another:
Breaking news!
Each person on earth creates 500 litres of carbon emissions per day (by merely breathing out)!

NickMNS

3:09 am on Oct 24, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




Breaking news!
Each person on earth creates 500 litres of carbon emissions per day (by merely breathing out)!

First off, let me say that personally I "create" far more than 500 litres, but that is because I run, bike and do other endurance sports.

Breathing doesn't contribute to global warming, as the oxygen you consume and the carbon dioxide you exhale are continuously cycled, basically through the food you eat. You exhale, the plants convert the carbon dioxide back to oxygen, and you eat the plants as food. You then consume the oxygen and in turn create the carbon dioxide. It is a relatively stable loop.

On the other hand, when you dig up tons of carbon from the ground and burn it, this adds carbon to the system that wasn't previously in it.

So if I understood the claim correctly, making your website faster will reduce your carbon footprint? I'm not sure. What if you parallelize all your processes, basically running more CPUs for a shorter time? You're faster, but not more energy efficient.

ronin

4:04 pm on Oct 24, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I managed to get my website under 1 sec for mobile and under 0.5 seconds for desktop, based on field tests. It is a sh*t load of work to get at that point.


I'd agree that under 0.5s is pretty impressive.

Similarly, in early 2016, unhappy with what was generally available, I started developing my own custom-built CMS. I made a huge leap forward in 2019 when I succeeded (after an abortive attempt in early 2018) in building a super-fast custom-built server-side module system.

My target page-load speed on desktop is 0.25s. While it's still rare that I achieve that (at this stage of development) I also rarely find that any single page takes longer than 0.33s to load.

Note that (at this stage) these are dynamic pages, loading (server-side) modules dynamically once the page-request has been sent to the server.

Before the end of this year I intend to move to Stage 2 (long-planned) in which it will be possible for a user to turn any dynamically loaded page (or the entire site) static which (I'm anticipating) will make everything even faster.

The final stage (which I already wrote in late 2018) is the Service Worker which uses the Cache API to turn any number of pages (or the entire site) into a Progressive Web App which works offline, without needing to contact a remote server or undertake any round-trips.

explorador

8:52 pm on Oct 24, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The following comment requires attention, common sense and being practical (and reasonable) to fully understand its meaning; otherwise random people will underestimate it:

    Many websites make no sense and lack planning, because they only serve the same text and a couple of images per page over and over (which could be cached or fully static), and still... somehow they manage to make it weigh tons and take forever to load. It's... nonsensical.

Perhaps in the future we will see some online tools testing speed and load in order to "tell you how green your site is".

CountXero

12:45 am on Oct 27, 2020 (gmt 0)

5+ Year Member



Back before the world fell down, Gary Illyes and I were on a panel together in Sydney, Australia. When quizzed about page speed and how much it mattered, Gary said, “It definitely matters, but we usually only start dinging you if your website is slower than CNN.com.”

True to form, if you run PageSpeed Insights on CNN.com, sometimes it will time out on mobile.

As with most signals, it's one of many, and given the multiplicative properties of the algorithm, any one element can drag you down. But like the poster stated, don't obsess over it in the name of the greater good.

jetteroheller

11:32 am on Oct 28, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have top scores for all my sites.
I wrote my own CMS in 1997 and have improved it again and again.
My CSS is built into the HTML and size-optimized.
My CMS has conditional compilation for JavaScript.
I can activate code for each domain separately,
loaded asynchronously of course, and only one JS file.
One graphic for all buttons and logos.
HTML and JS delivered compressed.
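For anyone curious what compressed delivery is worth in that list, gzip on repetitive markup is dramatic. A small standard-library sketch; the HTML string is made up purely for illustration (real servers do this via mod_deflate or nginx's gzip module rather than application code):

```python
# A quick look at what gzip buys on repetitive HTML.
import gzip

# Repetitive markup, the common case for templated pages.
html = ("<div class='row'><span>widget</span></div>\n" * 200).encode()
packed = gzip.compress(html, compresslevel=6)

print(f"raw: {len(html)} B  gzipped: {len(packed)} B "
      f"({len(packed) / len(html):.1%} of original)")
```

The more templated and repetitive the page, the better the ratio, which is why gzipping HTML/JS is one of the cheapest wins available.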

JorgeV

12:08 pm on Oct 28, 2020 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



Good @jetteroheller, I am the same, but I guess we are old dinosaurs; not a lot of web devs mind this today.

evansstroud

4:11 pm on Nov 3, 2020 (gmt 0)

5+ Year Member Top Contributors Of The Month



I've been obsessing over speed for the last two months. I tried both free and paid plugins; turns out they all suck. They speed up your site momentarily, and a day later you're back to square one. So I let it be. I'm working on my content, and if it works, good. I know now that site speed won't make or break the rankings, so why even bother.

Thank you for the post.

explorador

11:09 pm on Nov 5, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Obsessing is not a good thing; taking it seriously or planning for the future, on the other hand, is very positive (and depending on your situation, a pain, like now, which is exactly what's happening).

tangor: My page speed is ... what it is. I don't obsess [...] basic rule of thumb is K.I.S.S

The value of this exceeds the looks of it.

The web is growing (I can't exactly say evolving, but that's just me), but all that variety of tools comes with a price: some things are easy at the cost of complex functionality, speed or load on the server. The thing is... you don't know until you know, or you don't notice until it is too late. In some cases the complexity of NOT keeping it simple (KISS) can turn things into an unneeded mountain of work, or be the difference that makes you throw in the towel, when you say "I've had enough of this". No kidding, OP: people get hired specifically to make sites faster, sites other people created (and which are a nightmare).

jetteroheller: I have top scores for all my sites.
I wrote my own CMS in 1997 and have improved it again and again.

It's amazing, the value of this above. There are many tools, but the purpose-built ones are always the best, because they address exactly your needs and features.

Many times random people end up asking why their sites never do better, and many times they refuse to understand the value of optimization. Optimization is good for everyone.

Enacker

9:10 pm on Nov 11, 2020 (gmt 0)

5+ Year Member



Speed is great for creating a positive user experience. In the long term there is talk of user experience playing a role in the algorithm, and that's what we should focus on.

Another important factor to look at is competitor speed. Look at the top 5 results for major competitors and compete against them.

I would make sure my user metrics are improving as I increase speed and usability. Don't forget it's not just speed, but what loads for the user first. First Contentful Paint and all that jazz make for a better user experience versus plain speed.

I have plenty of sites ranking great without optimizing speed like other sites.

Enacker

9:11 pm on Nov 11, 2020 (gmt 0)

5+ Year Member



Don't use only plugins; hire an actual developer to code the site for speed. I keep hearing plugins, plugins, but that only gets you so far with WordPress.

tangor

1:25 am on Nov 12, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Enacker ... I'd like to personally welcome you to Webmasterworld!

WP, of course, is everywhere these days, but there's still a large part of the net that is static and non-database or "automated."

Speed = clean code. Avoiding flashing lights and all-dancing gadgets. Even faster sites don't use G ads (not a joke, just truth, but counterproductive to this thread!). Finding the best balance of third-party inserts on a site is where speed ACTUALLY ARRIVES. Finding that balance is the "magic bullet", and each needs to find it for THEIR SITE.

jacobjack

9:53 am on Dec 2, 2020 (gmt 0)

5+ Year Member



Showing content through images, GIFs and videos makes the user experience good. But at the same time it is not possible to achieve a good page speed score with this content. Should we compromise on the content just for the speed?

jetteroheller

12:27 pm on Dec 2, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@jacobjack Sure not, but aim for fast-showing text: with 2 kB loaded, the text already shows on the screen and does not jump around during further page load.

explorador

3:18 pm on Dec 2, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Ads are not mentioned here, but they're worth talking about. If your site has any advertising, check that out: in many cases the ads are the main thing hurting your speed and even your standards compliance (validation). Google demands speed but fails to deliver in this sense (advertising). A common practice used by some (me included) is to remove advertising from a site from time to time, or to do so while the site grows, and only then consider adding the advertising back.

Iamlost: Have another:
Breaking news!
Each person on earth creates 500 litres of carbon emissions per day (by merely breathing out)!

And many of them are not decent human beings with decent thoughts; the world could be better off without them. Same as with websites producing a lot of heat and carbon signatures in the environment, worse considering the number of bots consuming resources.

ronin

5:22 pm on Dec 2, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google demands speed but fails to deliver


I don't know what things are like with Google Plugins now, in late 2020, but nearly a decade ago, I lost faith in Google Analytics because I could see how much it was slowing down the site I was running at the time.

A few years later I dropped Google AdSense for the same reason (amongst other reasons).

jetteroheller

5:47 pm on Dec 2, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@ronin I leave Google AdSense only on my old, outdated sites. All new sites:
GZ compressed, with integrated CSS.
A sprite for all common graphics, to need only one request.
Conditionally compiled, GZ compressed JavaScript, loaded asynchronously.
JS calls on the server pl with all the data about the visitor, even battery status.
That is enough for tracking user behavior and what hardware they use.
So I do not need a "We use cookies, you have to agree" banner for EU visitors.
User agent, screen size and other parameters are enough on small sites to track.

Robert Charlton

1:54 am on Dec 3, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Showing content through images, GIFs and videos makes the user experience good. But at the same time it is not possible to achieve a good page speed score with this content. Should we compromise on the content just for the speed?

jacobjack, that's a question I remember thinking about when Google started encouraging both more images and faster pages. Google of course understands the problem... speed is a central concern of Google... but Google also knew that images and video were immensely important to the web experience.

Google prodded site developers and technology companies in both directions, first with Panda and load speed, knowing that the loading of large images and videos would need to be refined with mobile and responsive design, including eventually HTTP/2, 5G, etc.

Read my Oct 12 post, the second post in this thread, where I talk about what Matt Cutts described as "outliers". Google knows that in some areas of the world the technology may not be up to speed (pun intended), so Google only downranks sites when they're much slower than the general range of competitors in those areas. Again, just the outliers. As the overall technology speeds up, you may have to keep up with things. You might say that speed is a work in progress.

Eg, as Google has gone mobile-first, instead of simply reducing image filesize, you may need to create "responsive images", along with properly coded CSS, for different devices and display sizes.

Additionally, as jetteroheller notes, Google realizes that, given the present state of the art, pages that jump around as they load are a major annoyance, and Google has grouped several such factors together along with speed, calling them "Core Web Vitals" (CWV).

These, like speed, are a work in progress, and they will evolve over time. Just keep an eye on things, and don't fall way behind.

I don't believe that "layout shift", which is what jetteroheller describes, is even operative yet, but when it is, my guess is that Google may eventually start comparing shifting pages, like slow-loading pages, perhaps once more looking for outliers. There's speculation on how much layout shift might eventually affect ranking, as I'll note later.

First, here's the CWV announcement thread...

Google "Core Web Vitals" replace 'Speed Report' in GSC
May 28, 2020 - July 2020
https://www.webmasterworld.com/google/4997092.htm [webmasterworld.com]

In that post, I quote Google's summary of the three Core Web Vitals....


Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading.

First Input Delay (FID): measures interactivity. To provide a good user experience, pages should have a FID of less than 100 milliseconds.

Cumulative Layout Shift (CLS): measures visual stability. To provide a good user experience, pages should maintain a CLS of less than 0.1.
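Those three quoted thresholds are easy to turn into an automated check. A toy sketch under the assumption that you already have field numbers (in practice they'd come from CrUX or the web-vitals JS library; the metric names and sample values below are my own, not Google's API):

```python
# Toy check against the three "good" thresholds quoted above:
# LCP within 2.5 s, FID under 100 ms, CLS under 0.1.
GOOD = {"lcp_s": 2.5, "fid_ms": 100.0, "cls": 0.1}

def failing_vitals(metrics):
    """Return only the metrics that exceed their 'good' threshold."""
    return {name: value for name, value in metrics.items()
            if value > GOOD[name]}

# Made-up field data for one page: slow LCP, healthy FID and CLS.
page = {"lcp_s": 3.1, "fid_ms": 80.0, "cls": 0.05}
print(failing_vitals(page))  # only LCP is over its threshold here
```

A loop over your top pages with a check like this is enough to spot which of the three vitals, if any, needs attention before obsessing over overall scores.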


Note also this SER article, reporting Gary Illyes' thoughts on CWV as a ranking factor. I think this was probably necessary because the word "core" in CWV may have over-emphasized these user experience factors. Relevance ultimately is most important.

Google: It's Unlikely Core Web Vitals Will Become The Primary Ranking Factor
Sep 7, 2020 - by Barry Schwartz
[seroundtable.com...]

Gary Illyes from Google said on Reddit he thinks it is unlikely that core web vitals "would ever become the primary factor for organic traffic." He said you shouldn't ignore it but Google and other search engines rank primarily based on "highest quality and most relevant results for users' queries," not necessarily what is in the core web vitals.


tangor

4:55 am on Dec 3, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Content first, pretty second, with pretty getting an enormous weight which has not been revealed by g in values or metrics one can seek to achieve.

While there are parts of the world where net speeds are primitive or just a notch higher, the vast part is working with relatively good speed: page loading, even with third parties (e.g. AdSense), begins to display/provide content within the one to two eye blinks that are so essential to having a happy user.

The trend to pictures and video is merely an acceptance that human beings are now trained that a "screen" has pictures and video ... and ... also human ... an innate desire to seek the path of least resistance (brain reading/thinking) for visual reactions. After all, that is ALSO human nature!

Finding that balance --- AND PAGE SPEED --- is the challenge.

csdude55

7:12 am on Dec 24, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I know I'm late to the game, but I wanted to throw in my $0.02 because speed has been my main focus for a while, too.

But unlike the OP, my purpose isn't SEO. What I've found is that my pages per session are directly tied to the load time: the faster the pages, the more of them users view. It's as if each user has subconsciously decided how much time they're willing to spend on my site, so it's up to ME how many pages they see in that amount of time.

This whole year I've been doing little tweaks here and there that shave 100ms here, 200ms there... and I've cut my average page load time from around 7s to just over 1s (excluding Adsense, which is totally out of my control). In response, Analytics shows that my pages per session have increased 158.45%. That's significant, especially during a year when RPM went waaaaaaay down!

It's also notable that it's only partly true that "the vast part is working with relatively good speed". There are a lot of variables that are easily forgotten:

1. I'm in the US, and there's a neighboring county of about 12,000 people that almost entirely use dial up internet! DSL is available, but it's expensive and unreliable.

1a. Even at my home, the fastest DSL speed I can get is 3M. And it fluctuates between 0.1M and 3M, depending on the weather... or maybe because some guy at the local hub is goofing off.

2. A lot of older people still use dial up, too, because they see no reason to pay more for higher speeds.

3. And don't forget viruses! My dad pays for 20M speed, but his computer is always full of viruses (I tell the guy to leave the Russian pr0n alone, but you know how it goes... LOL). He's always asking me to "fix it", and when I get there his speed is closer to dial up until I clean it for him.

According to this, there were over 800 million malware infections in 2018:

[purplesec.us...]

I don't know of any real way to tell a user, "hey, you have a virus or spyware that you need to fix", so all I know to do is to act like ALL of my users have slow internet, and make my sites as fast as humanly possible. Worst case scenario, my pages per session go up and I get more traffic from people that have slow internet speeds.

jetteroheller

7:52 am on Dec 24, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@csdude55 Seems the US stopped developing a long time ago.
When I started with the internet in autumn 1996, I had to pay US$4 per hour for a slow dial-up connection (converted from ATS to US$).
The price was lowered in 1998 to US$1 per hour for a dial-up connection.
Starting in 2000 I had ADSL. Because it was 2.5 km to the next distribution station, only 10 Mbps. It was, at the end, US$25 a month.
Starting in 2020 I changed to mobile internet: an old smartphone with a SIM card on an unlimited internet tariff.
My family usually uses about 200 GB per month of this unlimited plan. It is 30 Mbps for US$24 a month.
Smartphone tariffs are, for example, US$12 a month for 500 minutes of calls and 15 GB of mobile internet.
It is a surprise to me that dial-up connections even still exist.
The EU commission for competition has really hard penalties for cartels and for companies colluding to all keep the same high price.
Penalties can be even several billion US$.

iamlost

9:02 am on Dec 24, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@csdude55: your rationale is solid...

As I implied way back towards the beginning of this thread, very little of what Google recommends as best practice, especially regarding usability and user experience, is critical to SEO, aka ranking in search returns (if it were, many/most current query return sites would fail and be dropped); however, it is critical to user engagement, conversion, and retention...

So many work so hard to get Google traffic but fail to deliver appropriately or with efficacy. Sad really.

iamlost

9:12 am on Dec 24, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@jetteroheller: There are differences, and certainly the history and rationale of internet service differ; however, both the US and Canada have dropped the high-tech communication ball compared to much of the rest of the world.

Ah well
Que sera, sera
Whatever will be, will be
The future's not ours to see

Especially if we refuse to look...

jetteroheller

9:41 am on Dec 24, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@iamlost My last comparison was about the situation in Austria. Even bigger is the difference in development in Romania. In 1997 in Sibiu, Romania, I had to wait until 11 pm to get a 2400 baud internet connection; there was too much noise on the wires at other times. I even lost a customer while I was in Romania, because it was impossible to download the email about a hosting problem in an internet cafe connected at 19200 baud shared between 10 computers.
Now they have great mobile internet speed. My daughters watched videos without problems while we were driving on the highway.

iamlost

12:58 pm on Dec 24, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@jetteroheller: yes, I understood Austria by way of your mention of the schilling. I was just commenting on the North American expensive and oft mediocre internet that has not gotten broadly better, unlike what I hear from others in Europe and Asia. We have some areas that are state of the art and others with no access... some due to sheer geographic size, of course, but the cost is strictly opportunistic gouging and wilful regulatory indifference.
Hmmm. Button well pushed. Rant over.

Best wishes this holiday season.

csdude55

6:49 pm on Dec 28, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Starting in 2000 I had ADSL. Because it was 2.5 km to the next distribution station, only 10 Mbps. It was, at the end, US$25 a month.
Starting in 2020 I changed to mobile internet: an old smartphone with a SIM card on an unlimited internet tariff.
My family usually uses about 200 GB per month of this unlimited plan. It is 30 Mbps for US$24 a month.

@jetteroheller, there's only one DSL provider that comes to my home, so until recently I paid almost $100/month for 3 Mbps. I recently signed up for T-Mobile home internet with unlimited data, and allegedly I get about 15 Mbps for $100, but the ping time is considerably higher, so it actually feels slower than the DSL did.

But this is a mountainous and rural area with a lot of mobile dead spots, so not everyone would be able to get mobile internet. When I go visit my parents there's no cell signal at all! So for them it's not really worth $100 for 3 Mbps; they're happier using DirecTV and having dial up.

Technically, DSL is available to them... it's just not efficient. When you go to the more mountainous areas it's even worse; you might be paying $150 for 2 Mbps.

I remember a couple of years ago, I met with a potential client that owns a Christmas tree farm to talk about building a website for him. When I got there I discovered that there was NO internet, though, so I couldn't show him any examples of ideas or anything! I couldn't even email design ideas to him, I had to print them off and mail them! LOL

jetteroheller

9:48 pm on Dec 28, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@csdude55 When will you change to Starlink? Let's build a Mars colony with some hundred million internet users.
100 million users times US$1000 a year is also US$100 billion a year.

csdude55

5:06 am on Dec 29, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Countin' down the days! LOL But then again, I remember when WildBlue was bragging about giving high-speed satellite internet to everyone, too... I tried it and it was slower than my dial up! All it accomplished for me was leaving 4 big holes in my roof when I took the dish down >:-(

I honestly can't imagine how they would be able to offer internet in this area, though, if mobile providers can't. They still have to get a signal through mountains and trees.

guarriman3

5:55 am on Dec 29, 2020 (gmt 0)

10+ Year Member Top Contributors Of The Month



My personal experience: not with HTML speed, but with the server's speed (TTFB).

I implemented HTTPS, and my server's OS did not support TLS 1.3 by default. I did not notice it, and several months afterwards I found that the 'Average response time' in Google Search Console had rocketed from 140ms to 320ms.

The HTTPS implementation was in March 2019, and my rankings dropped in June 2019. The crawl budget of my website had decreased, since Googlebot had fewer resources to crawl my URLs. Google stopped indexing thousands of URLs on my site.
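A jump like 140 ms to 320 ms is easy to watch for yourself rather than discovering it months later in GSC. A rough standard-library sketch of an average response time probe; the example URL is a placeholder, and this is a simplification (GSC's number also reflects Googlebot's own network path):

```python
# Sketch: a rough average-response-time probe, standard library only.
# A server-side slowdown (e.g. an inefficient TLS setup) shows up here.
import http.client
import time
from urllib.parse import urlparse

def avg_response_ms(url, runs=5):
    """Average milliseconds from opening a connection to the first byte.

    Each run opens a fresh connection so connection/TLS setup cost is
    included, which is exactly what regressed in the TTFB story above.
    """
    parts = urlparse(url)
    cls = (http.client.HTTPSConnection if parts.scheme == "https"
           else http.client.HTTPConnection)
    samples = []
    for _ in range(runs):
        conn = cls(parts.netloc, timeout=10)
        start = time.monotonic()
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        resp.read(1)  # reading one byte forces the response to arrive
        samples.append((time.monotonic() - start) * 1000.0)
        conn.close()
    return sum(samples) / len(samples)

# Example (uncomment with a real URL):
# print(f"{avg_response_ms('https://example.com/'):.0f} ms")
```

Run it on a schedule and log the result; a sustained doubling is the cue to check the TLS configuration before rankings react.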