Forum Moderators: Robert Charlton & goodroi


The obsession with Page Speed


GoogSay

9:38 pm on Oct 11, 2020 (gmt 0)

5+ Year Member



I thought I would write this as a warning to other webmasters. Like many of you, I have used page speed tools like PageSpeed Insights, Lighthouse and Pingdom and, horrified by the scores, decided something must be done. So I installed plugins that cached, deferred JS, moved CSS to the footer, combined files, minified and gzipped files. My score went up significantly. However, I started noticing 5xx errors in GSC, and my site became horribly unresponsive for uncached queries. I felt like giving up: I needed plugins to get my page speed passing, but the same plugins were causing the site to fail. I was in a catch-22: I needed speed to rank on Google, and the same plugins caused the site to fail in GSC.

So I looked into how important a factor it really is. To my surprise, it turns out not to be a big deal for ranking. Gary Illyes from Google actually tweeted about it, saying "Ranking wise it's a teeny tiny factor, very similar to https ranking boost.". Additionally, I learned from Martin Splitt's (Google) video on page speed that Google only determines whether a site is very slow or not. So I stripped out all the plugins and went back to WP at its core. No more errors, and a reasonable load time for all queries.

Yes, I would ideally like the page to be faster, but it's a new site on shared hosting with few visitors at present. If the site gets popular, it could fund a dedicated server and bespoke speed optimization, but first I need to learn to walk.

Robert Charlton

11:04 am on Oct 12, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



GoogSay, thanks for your thoughtful post, and welcome to WebmasterWorld.

Way back, when Google first started paying attention to mobile loading speed, Matt Cutts made some statements which I believe are still operative, and which we discuss in this thread from a couple of years ago...

Google Page Speed Update Rolls Out
July 9, 2018
https://www.webmasterworld.com/google/4910195.htm [webmasterworld.com]

In the discussion, I note Matt's emphasis... that page speed is going to be an ever-moving target, based on technologies generally available at the time and place being considered...
(Google) is looking at load speed in the context of other factors. As Matt suggests, it's important that you don't be an outlier, either among your competitors, your niche, or in your location.

By "outlier", we mean noticeably slower than competitive sites. It's worth checking out the definitions of "outlier" in Google to get a sense of how the term is used.

The above thread is also filled with references worth checking, reading, and watching... including references to earlier articles and threads, and to the earlier Matt Cutts video on the topic, where I think Matt gives a very nuanced sense of Google's approach to site speed.

Obviously, for the user, the faster the page the better, but your well-researched approach makes sense. Without getting into domain specifics, you might want to compare notes on performance with others here...

I think we can also allow comparisons of plug-ins or other technical aspects of solving your problem. Plug-in specifics are allowed in the WordPress forum, and while we like to avoid tool specifics in this SEO News forum, I think it's OK to discuss WP plugins in this case.

It sounds like you've been very thorough, but possibly there's also a way of solving your problem without a plug-in.
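
For example, render-blocking JS can often be handled without a plugin at all: the defer attribute, added by hand in the theme's markup, does what many of those plugins do on the fly. A minimal sketch, with placeholder file names:

<!-- defer: download in parallel, execute in order after the HTML is parsed -->
<script src="/js/main.js" defer></script>
<script src="/js/widgets.js" defer></script>

Deferred scripts never block rendering, and they still execute in document order, so dependencies between files are preserved.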

engine

12:09 pm on Oct 12, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Thanks GoogSay.

For a while now, people have been almost obsessed with the need for speed, and, in part, it is a result of Google banging that drum for a long time.

It originally came about through poor mobile download speeds, especially in some regions. Then came Google's AMP to help resolve that, and everyone chased after AMP pages. Ironically, most of the sites with plenty of money behind them had pretty fast sites anyway, so it was only the heavy graphics which really caused most of their problems.

The need for speed is way less important than it was before mobile infrastructure improvements.

robzilla

12:38 pm on Oct 12, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Obviously, for the user, the faster the page the better

I agree with the emphasis here. This should be the primary focus of webmasters: page speed in service of users, not in service of SEO. Ideally, Google should not even factor into this; it's between you and your users. Unfortunately, many approach this backwards, and often haphazardly, in a short-term effort to "please Google". Time and again, the reactions to these algorithm changes (HTTPS, page speed, CWV, etc.) reveal a tenacious reluctance to independently focus on providing a better user experience. It's only under "threat" from Google that they make changes, and so Google is always at the center, taking all the heat, when a much healthier approach would be to understand why Google chooses to prioritize these things in the first place (short answer: it's necessary). And let's not forget they help us immensely by providing free documentation, tools and new technologies that we can use to improve our business (if we choose to). [/rant] (In fact, I think a great way to trump your competitors is to attend to these things before Google suggests you do so.)

(This wasn't aimed at you, GoogSay.)

but first I need to learn to walk.

I think that's important, too. Although we have lots of great tools available to us these days, optimizing page speed can still be a technical challenge, and a minefield of sorts because certain techniques (or plug-ins) can indeed break things. And speed doesn't have to be a top priority: if you're seeing a "reasonable load time for all queries", you're not likely an outlier, and then perhaps there are other things to attend to first. So while I think an obsession with page speed can be healthy (I find it a fascinating field and might qualify), I agree with you that page speed for SEO is certainly not worth obsessing over.

Lexur

7:26 am on Oct 13, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think Google should penalize one of the players that slows down the traffic of the whole Internet: Adsense, its redirections and its heavily coded and multimedia ads.

robzilla

8:39 am on Oct 13, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think Google should penalize one of the players that slows down the traffic of the whole Internet: Adsense, its redirections and its heavily coded and multimedia ads.

Isn't that essentially what they're doing? On the off chance that a position comes down to one page with AdSense vs. one without, and page speed is the deciding factor, AdSense is more likely to lose, which is a penalty of sorts. But the Chrome data that is fed to the algorithm is agnostic and reflects the overall experience; it doesn't care about the cause of poor performance.

engine

9:07 am on Oct 13, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I think Google should penalize one of the players that slows down the traffic of the whole Internet: Adsense, its redirections and its heavily coded and multimedia ads.


That does happen.

It's the users that need to be the ones to decide, not Google, imho. I do understand the need to improve a site, but, really, are sites that bad these days? Yes, some are, but now that Flash has, in effect, gone, it's mostly bad hosting or excessive ad loading.

If I get a site that takes too long to load, I back out. Simples!

iamlost

11:03 am on Oct 13, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google channels the villainous Two-Face: one side is all the best-practice UX advice; the other is the weight (rather, the lack thereof) given to each.

Page speed, aka time to render, is a prime example: each second of additional render time loses visitors, yet Google has been quite clear that in its algorithm weighting it is almost always a tie-breaker rather than a definitive threshold input.

The rationale is obvious: if Google actually used render time as a primary input, well over half their index would vanish from results, including almost all the too-big-to-be-excluded sites. Yes, speed is UX- and CR-critical, and yes, speed is not (usually) important to query returns.

While the two biggest page weights have remained surprisingly constant (images at ~60%, scripting at ~15%), page size has jumped from ~0.75MB in 2010 to ~4MB in 2020.

Also, the switch to mobile has an inherent problem in that it takes about twice as long for a given site to load on a mobile device as on a desktop: average time to first byte desktop 1.3sec, mobile 2.5sec; average perceived load time (first viewport rendered) desktop 4.7sec, mobile 11.5sec.

For Google (NOT for the visitor) there are two render thresholds a webdev needs to be aware of: (1) web averages, as illustrated above, and (2) niche aka query-competitor averages. So long as a site renders better than average, site render speed/time is not an SEO concern.

Of course, such times are horrendous in a visitor conversion / retention / revenue context; eg, bounce rate typically more than doubles comparing a 5-sec load time against a 1-sec one.

Best practice and SEO as typically practiced are not synonymous.

engine

11:38 am on Oct 13, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Yes, I would ideally like the page to be faster, but it's a new site on shared hosting with few visitors at present. If the site gets popular, it could fund a dedicated server and bespoke speed optimization, but first I need to learn to walk.


You're correct. One thing I would do is minimise the number of plugins, from both a speed and a security standpoint.

In my view, site optimisation is a given. Yes, I admit, I did create Flash-based sites way back, but I very soon dropped them when the buzz went. But Flash wasn't the real problem back then; it was user connections to the Net. As that improved, along came mobile with yet another narrow pipe.
Without an optimised site I wouldn't expect visitors to hang around, although, if the site was unfairly penalised by G, I'd be frustrated by that move. Looking at some of the speed test tools, there are a lot of very minor errors which wouldn't stop a visitor but might trigger a G demotion.

Steven29

8:10 pm on Oct 13, 2020 (gmt 0)



"Isn't that essentially what they're doing? On the off-chance that a position is between one page with AdSense vs. one without, and it's coming down to page speed, AdSense is more likely to lose, which is a penalty of sorts. But the Chrome data that is fed to the algorithm is agnostic and reflects an overall experience, it doesn't care about the cause of poor performance."

It appears as if everyone is saying that page speed is no more important than a signal like switching to HTTPS. There are also references to being an "outliner", and to that having more of an impact than updating page speed.

So on the off chance there is a position between one page with AdSense and one without, wouldn't page speed not matter unless the page with AdSense is an "outliner" in the niche? Otherwise it's just a signal like being HTTPS, and the other 100+ signals will be used to determine the ranking?

In regards to all this optimization, it's almost like we are going back in time. Things like a simple sticky menu and other things for better user experience will hurt your metrics. This is also a catch-22.

lucy24

9:07 pm on Oct 13, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Ranking wise it's a teeny tiny factor, very similar to https ranking boost.
Well, that's a universal truth, isn't it? If I minified my CSS, it would shave a few nanoseconds off download time--for humans and googlebots alike--but at what cost to my own sanity? And if some one specific thing were a huge ranking factor, nobody would ever talk about anything else.

robzilla

11:27 pm on Oct 13, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



With each added signal, all the pieces of the pie get a little smaller, and the algorithm a little more robust (presumably). We can reasonably assume that the page speed and HTTPS chunks aren't as big as those related to content quality and authority, but many small pieces can add up to a mouthful of a good user experience, and I believe that's where you can make a real difference, where you have the chance to tip the scales in your favor, especially when all else is pretty much equal.

So on the off chance there is a position between one page with AdSense and one without, wouldn't page speed not matter unless the page with AdSense is an "outliner" in the niche?

As with other signals, I would expect the page speed signal to carry more weight the further a page floats into outlier territory. And I suppose that could go either way: good outlier vs. bad outlier. In other words, the smaller the speed difference between two pages, the less likely the signal is to determine who comes out on top.

Things like a simple sticky menu and other things for better user experience will hurt your metrics.

Well, strictly speaking you'll always be "outperformed" by a "Hello world" page, of course. Does your sticky menu affect your page speed in a significant way? I should hope not. In much the same way that Google seeks a balance between all the aforementioned chunks of its pie, so too should you strive to find a good balance between content, aesthetics, user experience, speed, etc.

Steven29

5:05 pm on Oct 14, 2020 (gmt 0)



"Well, strictly speaking you'll always be "outperformed" by a "Hello world" page, of course. Does your sticky menu affect your page speed in a significant way? I should hope not. In much the same way that Google seeks a balance between all the aforementioned chunks of its pie, so too should you strive to find a good balance between content, aesthetics, user experience, speed, etc."

You wouldn't think so, right? But it appears to, as it seems to be causing my mobile pages to fail the "Good" test with a CLS of 0.10. These pages "need improvement".

I have gotten the sticky menu down to a CLS of 0.02 or lower with 6x CPU slowdown on mobile. For desktop there is none, or sometimes 0.0001.

But for some reason, Google seems to keep failing the mobile pages, and there is no other CLS I can see or detect on any browser or device.

So apparently that up-to-0.02 layout shift is dinging my pages for whatever reason.

From what I can see, even Google does not use sticky menus for mobile, and the ones that do have a much higher CLS.

[support.google.com...] I receive a 0.06+ CLS just on page load, but do not have additional CLS when scrolling up and down.

Who is scrolling up and down 5+ times on the pages? Is that how the Google tests run? Load the page and scroll up and down 5 times?

It appears they want us to use new browser technology, mainly for Chrome, that is not supported everywhere, with fallback measures for browsers that do not have the options, to force them to adopt the changes in their browser.

robzilla

7:21 pm on Oct 14, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That's getting a little too specific for this thread, but with the CLS scale running from 0-0.1 (Good) to 0.1-0.25 (Needs improvement) and 0.25+ (Poor), a score of 0.1 doesn't look too bad. If you can fix it, great, if not, your time is probably better spent elsewhere :-) The Performance tab in Chrome's Developer Tools is most helpful in discovering layout shift. There's also this nifty tool [defaced.dev] that visualizes layout shift in an animated GIF, although I'm not sure to what extent it corresponds to CLS measured by Lighthouse, for example.
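
If you'd rather have numbers than pictures, the Layout Instability API that these tools build on can also be queried by hand. A rough console sketch, assuming a browser that supports layout-shift entries; CLS is essentially the sum of these values over the page's lifetime:

<script>
  // Report each layout shift not triggered by recent user input.
  // entry.value is the shift score; entry.sources names the elements that moved.
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      if (!entry.hadRecentInput) {
        console.log('layout-shift:', entry.value, entry.sources);
      }
    }
  }).observe({ type: 'layout-shift', buffered: true });
</script>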

Steven29

7:36 pm on Oct 14, 2020 (gmt 0)



Sorry, I'm a detail person :)

The score I thought was 0-0.1 GOOD, 0.1-0.25 needs improvement. But the score appears to be 0-0.099 GOOD, 0.1-0.24 needs improvement.

The Performance tab in Chrome's Developer Tools, and Internet Explorer, and all the others is what I am using. With the website you linked to, it reports a CLS of 0.0 for both mobile and desktop.

robzilla

8:06 pm on Oct 14, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That's like having a BMI just outside of the "Normal" range. It doesn't make you unhealthy :-)

Robert Charlton

1:45 am on Oct 15, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Steven29, since you are a detail person, this may be a good time to remind you of the word outlier, which I had bolded and otherwise emphasized in my post above, and which Matt Cutts had also emphasized. The examples you give are in no way outliers.

A common simplified definition of "outlier"...
In statistics, an outlier is a data point that differs significantly from other observations

Here, special note should be taken of the word "significantly"... Significance is not always the same for all measurements in all data sets, and it is affected by context.

Two articles, of many possible, perhaps worth noting... from Wikipedia, which I've just quoted above, and from Wolfram MathWorld. Both have examples, formulae, graphs, etc...

Outlier
[en.wikipedia.org...]

and...
Outlier
[mathworld.wolfram.com...]

From Wolfram....
Outliers are often easy to spot in histograms. For example, the point on the far left in the above figure is an outlier.

A convenient definition of an outlier is a point which falls more than 1.5 times the interquartile range above the third quartile or below the first quartile.

For non-mathematicians, the concept of "outlier" is much easier to visualize, and the Wolfram article's illustrations should pretty much jump out at you.
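
To make that concrete with made-up numbers: suppose the sites competing on a query load in 2, 3, 3, 4, 4, 5 and 12 seconds. The median is 4s, the first quartile 3s and the third quartile 5s, so the interquartile range is 2s. The upper fence is then 5 + 1.5 × 2 = 8s, and only the 12-second site falls outside it. By this rule, that one site is the lone outlier, even though the others are far from identical.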


(Edited to fix typo.)

RedBar

1:48 pm on Oct 15, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A personal experience of the past 26/27 years regarding page speed.

For about the first 15 years, my page speed was driven by my desire to keep everything under 100K; bear in mind my pages have widget images. Over that time I attended many trade fairs and asked fair visitors about their experiences with our websites.

All agreed that the sites were easily amongst the fastest but could we use larger and better quality images?

So about 10 years ago I experimented with some of our smaller sites, using better images, which are obviously larger files. Traffic continued as before, and image rankings improved.

When I converted my main B2Bs to HTML5 some 6/7 years ago, I took the plunge and increased all widget image sizes. Not only did already excellent rankings improve, so did traffic and genuine customer enquiries, and subsequently orders.

As many of you know, just over a year ago I started a massive "overhaul" of my B2Bs, with them being melded together since 1st September. Not only does each page have more images, they also have more information, resulting in many widget #1 rankings now and a huge increase in traffic / enquiries.

Even with these extra images I have very few pages of more than 300K; compared to 26 years ago, they are blazingly fast.

Is page speed important? Of course it is, but don't obsess about it; quality web hosting is probably more so.

nomis5

4:28 pm on Oct 15, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Don't ignore page speed, but remember this: for most of us, it is Google Analytics that informs us of slow / fast page speeds.

GA is the very thing which, certainly in my case, slows down page speed. G gains so much information from GA on our pages that they would be cutting their own throat if they made page speed a dominant ranking factor. Page speed will always be a very minor ranking factor for this reason.

Plus, I agree with everything which RedBar writes above.

robzilla

5:19 pm on Oct 15, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



GA is the very thing which, certainly in my case, slows down page speed. G gains so much information from GA on our pages that they would be cutting their own throat if they made page speed a dominant ranking factor. Page speed will always be a very minor ranking factor for this reason.

No, it will be a minor factor because there are other signals that are much more important in answering a query than page speed. In other words, it doesn't make sense for it to be a major factor, and that has absolutely nothing to do with the possibility of them "cutting their own throat". Besides, if Google Analytics slows down your pages, something is wrong. It loads after the DCL event, and most people will have the .js cached. The other GA, Google AdSense, will, of course, slow you down a bit, but that's your responsibility, and Google Search is not going to not consider page speed as a signal because some AdSense-powered sites may suffer.
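
For what it's worth, Google's documented async snippet for gtag.js is designed not to block rendering (GA_MEASUREMENT_ID is a placeholder for your own property ID):

<!-- async: the script downloads in parallel and never blocks HTML parsing -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'GA_MEASUREMENT_ID');
</script>

If that's measurably slowing a page down, the problem is almost always elsewhere.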

Steven29

5:32 pm on Oct 15, 2020 (gmt 0)



I hear you and understand what you are saying about "outliner", but I think the opposite.

Is it not possible to make your competitors an Outliner by optimizing your website?

"In statistics, an outlier is a data point that differs significantly from other observations"

Wouldn't this push the websites that do not have passing page speed tests, fail the core web vitals and other signals that are basically below an F grade stand out even more and get labled as "outliners" in the niche?

Most pages in my niche have a page load time of 5-10 seconds or longer for desktop, and up to 35 seconds for mobile. At what point is something labeled an "outliner" when there is a website in the niche with a perfect score?

Speed indexes of 17+, 20+ etc.

My speed index is 1.4 on mobile and 1.1 on desktop.

These websites are all just copying everything I post too. The name of the game seems to be "Wait until Steve posts and then copy it to 4 or 5 sites - one will stick, especially if we make our post times 30 minutes before yours" "Thanks for doing the work for us".

Robert Charlton

8:43 am on Oct 16, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Steve... I think that your spell-checker may be kicking in big time ;) ...and/or you're not quite getting the essence of what the word is about. Maybe a little of both....

- "An outlier is an observation that lies outside the overall pattern of a distribution."

- "An outliner (or outline processor) is a specialized type of text editor (word processor) used to create and edit outlines, which are text files which have a tree structure, for organization."

You wrote...
Is it not possible to make your competitors an Outliner by optimizing your website?... Wouldn't this push the websites that do not have passing page speed tests, fail the core web vitals and other signals that are basically below an F grade stand out even more and get labled as "outliners" in the niche?

Let's forget "Outliner"... it's the wrong word, and... most important, it's not describing the concept of "outlier", which is what we're discussing.

While you're sort of getting the idea, I think you might be conflating the term with other ranking factors and, I think, with your sense of justice. If you're the only site, at a certain point of time, location, and technology, running as fast as you are... loading faster than these scrapers... then site speed alone may not help you.

I think you would like special virtue in the technical area to cause you to outrank your competition, but IMO site speed doesn't kick in until the technology overall has caught up. That's the point the OP is making. For a while, until many more sites do speed up, only the really, really slow ones... the "outliers", as I suggested... are going to get downranked because of speed.

The discussion of scraping, etc, and your particular issues, really are off-topic to what we're discussing.

ronin

10:26 am on Oct 18, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



With regard to speed / performance, one easy win I'm intending to implement across all my sites before the end of 2020 is:

loading="lazy"


on all images.

I see from CanIUse (https://caniuse.com/loading-lazy-attr) that this is now implemented in Chrome, Firefox, Edge and Opera.

That's good enough for me to adopt it as a standard practice from now on. (I'm sure Safari will catch up eventually).
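
In practice that's a one-attribute change per image; a minimal sketch, with a placeholder path:

<!-- the browser defers fetching until the image nears the viewport -->
<img src="/images/widget-photo.jpg" alt="Widget photo" loading="lazy">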

JorgeV

11:14 am on Oct 18, 2020 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



Hello,

That's good enough for me to adopt it as a standard practice from now on

Indeed. But, in any event, there was no problem using this attribute earlier: if a browser doesn't support it, it just ignores it.

By the way, it's a great addition. Even if there were all kinds of ways to achieve this in JavaScript, it's still better if it can be handled by the browser itself.

lucy24

3:49 pm on Oct 18, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



loading="lazy"
I suspect that by the time everyone has scrambled to add this attribute to their HTML, the behavior will have become standard practice among browsers anyway. I regularly notice visits from mobiles that only download the first few images. (And then presumably slam the window shut before they get to the rest of the page, which is annoying but does save the server work.)

archiweb

4:50 pm on Oct 18, 2020 (gmt 0)

5+ Year Member Top Contributors Of The Month



The 5xx errors are the result of leaky abstractions in your entire setup.

A CMS with a hacky plugin is not the most effective fix for such architectural problems.

In a perfect world, all CSS, JS, and the HTML response should be stored in memory via a caching layer such as Varnish, while images/media should, in most cases, be served directly by the webserver.

This will improve your page responses to the point where most kinds of plugins for your CMS won't be necessary, unless the theme/frontend is a mess.

If you are going after a really fast website with an excellent security score, forget about webfonts, inline CSS, JS, SVG, etc. And make sure you have a properly configured Content-Security-Policy (CSP), Strict-Transport-Security header, and so on.

Also, loading="lazy" works best with width and height attributes; otherwise you will still get layout shifts.

Just cobbling together a few plugins on your CMS won't cut it; at best, you will get the willies and waste hours replicating/debugging for no real gain.
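
To illustrate that last point, a sketch with placeholder dimensions: explicit width and height let the browser reserve the image's box before the file arrives, so the late, lazy load shifts nothing.

<!-- the 800x600 box is reserved during layout, so no shift when the image loads -->
<img src="/images/widget-large.jpg" alt="Widget" width="800" height="600" loading="lazy">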

rthree

9:50 am on Oct 23, 2020 (gmt 0)

5+ Year Member



Instead of loading="lazy" you may use loading=lazy

It saves you 2 bytes ;) Quotes are optional in HTML5

I managed to get my website under 1 sec for mobile and under 0.5 seconds for desktop, based on field tests. It is a sh*t load of work to get to that point. I'm so obsessed that I even optimize headers and shift code around to improve compression. It's a sport to find things to optimize, although it is getting difficult now.

I can see it's worth it: most sites in my niche are ridiculously slow and have irritating popups, so it's just a matter of time until I outrank them.

Just checked my page with the most traffic using Pingdom: it has 1 image, the total size is 42.6KB, and the load time is 108ms with 5 requests... all from another country.

Dimitri

11:44 am on Oct 23, 2020 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



You can also omit the last semicolon at the end of each CSS block.
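
For example, with a made-up rule: the two declarations below are equivalent, since the semicolon before a closing brace is a separator, not a required terminator.

<style>
  .widget { margin: 0; padding: 0; }  /* trailing semicolon present */
  .navbar { margin: 0; padding: 0 }   /* trailing semicolon omitted: equally valid */
</style>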

explorador

2:03 pm on Oct 23, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Not as simple as it seems. From where I'm standing, this directly relates to some of the OLDER threads in this forum, where several factors are mentioned in order to build a great website. It's a list of several "small things", and what matters is the result as a whole. Here we might consider:

- 1. The small specific picture (speed directly related to ranking)
- 2. And the bigger picture (the whole thing), this one covers everything.

Robert Charlton: Obviously, for the user, the faster the page the better, but your well-researched approach makes sense. Without getting into domain specifics, you might want to compare notes on performance with others here...

Exactly. Regarding #2, I would stick to this: it's better for the user, the server, the bandwidth, the clients and obviously the website owner, and sure, it's also better for the budget.

An optimized website not only involves what's served; it also covers how you serve it and how the data is combined. Depending on the case, it could also mean LESS RESOURCES being used, more efficiency, and thus being kind to your server. I would still insist on speed optimization and keeping your site light: if we do things correctly, we will be facing more traffic, which means more pressure on the server, so we had better make it fast. In other words, if you get the ranking and traffic you are looking for, you will surely need your site to be faster to handle that new traffic. We should not forget people won't necessarily use the hardware we are using to test the websites; sometimes it's low-end, and while the website is delivered fast (let's say gzipped and with specific CSS stuff), it has to be decompressed and then painted on a low-end device, and THAT could turn your site into a slow experience. Even different CSS approaches might make for a slow or a fast website.

All websites are created differently, with different goals, technologies and strategies, but ideally we should consider a lot of "what ifs" regarding speed. Many webmasters aim to make it big on traffic and ranking but fail to design the website as something that fits that goal, so when they finally get there... they have to start all over, because the site as it is can't handle it. It's not a pleasant task to have to rebuild your own website because something was missing when planning for traffic.

GoogSay: I felt like giving up: I needed plugins to get my page speed passing, but the same plugins were causing the site to fail. I was in a catch-22: I needed speed to rank on Google, and the same plugins caused the site to fail in GSC.

I have nothing against WP; as discussed in different threads, it's a tool. But when it comes to optimization it is limited and will need not only extra tools; some hosting services also won't allow it to grow. Most sites that handle lots of traffic either get a huge hosting service or a decent CMS specifically designed to handle their specific needs.

Several forum members here have got this going on their websites using specific tools; some wrote them themselves. I believe it is a natural evolution for many webmasters to start with a tool and perhaps eventually migrate to something specific.

explorador

9:48 pm on Oct 23, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Efficiency, faster, better. The internet also produces CO2. [alistapart.com...]