|brotherhood of LAN|
| 2:45 pm on Dec 18, 2013 (gmt 0)|
Does the tool give you a breakdown of where the speed differences are?
Being on or off a shared server generally doesn't matter. It really depends on the hardware and the demand on it.
| 3:19 pm on Dec 18, 2013 (gmt 0)|
Looking at PageSpeed Insights from Google, we get an 85/100 while our competitor only gets a 70/100.
But if I look at our page speed and theirs via this tool [webpagetest.org] — I did the test via this link, choosing Virginia and the aggressive connection profile — we have a load time of 5.278 s while they have 3.269 s.
So even though we have a better grade than them (85/100), we have a slower load time… and I don't understand why.
Then if I look at what Google calls the Speed Index, we have 4976 while they have 2941.
My question is: how can we decrease our page load time and Speed Index?
I am just wondering whether the fact that we are slower on everything could be due to:
The server we are hosted on (shared, whereas our competitor's is not)?
The fact that we don't use a CDN?
The quality of the server we are hosted on (what type of processor, etc.)?
The size of the images on our website? That doesn't seem to be it, though, because I tried deactivating all our images and modules and the load time and Speed Index were still very similar.
I am just trying to figure out and eliminate the possibilities one by one.
| 3:34 pm on Dec 18, 2013 (gmt 0)|
You really should use a test that tells you specifically how long each element of your page takes to load. (I like the tools that offer a "waterfall", but we're not supposed to mention individual tools here.)
Google's tool should be giving you some insight as to what you could improve, and make sure you look at BOTH the desktop and mobile suggestions.
Your host can make a difference (though in my experience it doesn't generally matter whether or not it's shared hosting). It could just as easily be how the server is configured and what resources it's given.
There are lots of things you can do, but you need to figure out where you need the most work.
Personally, I've exhausted my ability to tweak the caching plugin I use and it's still not good enough, so after I get done with this year's design and coding changes, I'm going to pay the authors to come in and configure it for me once and for all (and tell me what else I need to fix). That's how important I think page speed is.
| 4:10 pm on Dec 18, 2013 (gmt 0)|
I see what you mean by the waterfall too. I just checked, and for my website the longest "web browser is waiting for data from the server" times are 682 ms for my homepage, 2.36 s for my merged scripts, and 158 ms for templates. Are those slow, and how can I improve them?
Then the DNS lookup takes 273 ms and connecting takes 372 ms, and I have a total of 90 requests. Is that a lot?
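If you want to check those same phases yourself from a shell, curl's `-w` formatting reports per-phase timings (DNS lookup, connect, time to first byte, total). This is only a sketch: the `file://` URL below is a stand-in so the command runs anywhere; point it at your real page (e.g. `https://www.example.com/`) instead.

```shell
# Create a throwaway local page so the example runs without network access;
# replace the file:// URL with your real page URL for an actual measurement.
echo '<html><body>test page</body></html>' > page.html

# -o /dev/null discards the body; -s silences progress; -w prints the timings.
curl -o /dev/null -s -w 'dns:     %{time_namelookup}s\nconnect: %{time_connect}s\nttfb:    %{time_starttransfer}s\ntotal:   %{time_total}s\n' "file://$PWD/page.html"
```

Against a real URL, a "ttfb" that is much larger than "connect" usually points at slow server-side processing rather than the network.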
|brotherhood of LAN|
| 5:09 pm on Dec 18, 2013 (gmt 0)|
90 requests is a fair bit, and you could perhaps save a little by merging JS files or turning smaller images into a sprite. The main thing is that the pages load quickly for your target market, i.e. that as few people as possible click away from your site because of load times.
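At its simplest, merging JS files is just concatenation at build time. A minimal sketch (the script names here are made up; substitute your site's actual files, and mind the order if one script depends on another):

```shell
# Hypothetical stand-in scripts; use your site's real JS files.
printf 'var header = 1;\n' > header.js
printf 'var footer = 2;\n' > footer.js

# One merged file means one HTTP request instead of two.
# Concatenation order matters if the scripts depend on each other.
cat header.js footer.js > merged.js

grep -c 'var' merged.js   # prints 2: both scripts made it into the merged file
```

A minifier pass on `merged.js` afterwards would shave bytes as well as requests.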
Try a 'speed test' from your server. If you have adequate bandwidth from the server then you can focus on the page and site itself.
Bear in mind that the locations of these speed-test tools and of your server affect the results, including any comparisons you make to other servers/sites.
| 6:03 pm on Dec 18, 2013 (gmt 0)|
|I've exhausted my ability to tweak the caching plugin |
Which one do you use? Super Cache or W3 Total?
| 6:27 pm on Dec 18, 2013 (gmt 0)|
W3 Total on all my heavy traffic sites that really need advanced tweaking. Super Cache on the less complicated sites.
| 9:09 pm on Dec 18, 2013 (gmt 0)|
|( I like the tools that offer a "waterfall", but we're not supposed to mention individual tools here) |
Mod's note: netmeg, let's make an exception, in this thread only, and allow mention of tools that can be used... but, for those less experienced, let's also include a description of how to use the features we like and why a tool helps in particular situations.
I ask also that we keep it non-promotional, and that no one recommend their own tools. (I'm also going to exclude first time posters from making recommendations here.)
| 10:09 pm on Dec 18, 2013 (gmt 0)|
|I have a total of 90 requests is that a lot |
Yes. But you should be asking "Is that harmful?"
90 images, on the other hand, is not automatically bad. If they've got proper width and height declarations, the page can display around them while they're loading. On a long page, the user may not even notice.
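As a sketch of what "proper width and height declarations" means in the markup (the file name and 640×480 values here are illustrative; use each image's real dimensions):

```html
<!-- Declared dimensions let the browser reserve the space and lay out
     the rest of the page before the image file has finished downloading. -->
<img src="photo.jpg" width="640" height="480" alt="Product photo">
```

Without them, the page reflows each time an image arrives, which is what makes loading feel slow and janky.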
90 requests to 90 different domains is worst of all. Ever been to a page that was entirely filled with hotlinked images? It doesn't just make the victims mad; it's a dreadful experience for the user.
| 11:08 pm on Dec 18, 2013 (gmt 0)|
|netmeg, let's make an exception, in this thread only, and allow mention of tools that can be used |
'k, I like this one:
Run your site through that; then you can sort the results by load order, by file size, by URL, or by load time. Real easy to find the resource hogs fast.
At the top is the TL;DR: your overall performance grade. I just ran WebmasterWorld through it and despite 106 requests, it scored 93/100 overall, with a load time of 2.23 seconds. Of course, we don't have a lot of images here either; that makes a difference.
Then go to the 'waterfall' and mouse over the measurements - you'll get the time in milliseconds for each resource there - DNS, connect, send, wait, receive, etc. Got slow DNS issues? It should show up here.
Back up above the waterfall, you'll see tabs that include performance (get a "grade" for possibility of performance optimization), Page Analysis for specific URLs, and if you've run your site through it more than once, a history.
Obviously since I didn't oversee creation of the tool, I wouldn't consider it flawless (nor is Google's tool) but I think this one can at least point you in the direction of areas of concern.
I also like [urivalet.com...] (created by a WebmasterWorld member that isn't me)
This one doesn't grade you on performance, it just lists everything that loads.
I never use just one tool, I test stuff in everything I can find.
| 5:47 am on Dec 19, 2013 (gmt 0)|
Have you tried preload mode in Super Cache? Check it out if you haven't. If you want, I can send you screenshots of the settings.
Preloading was one of the best things I did to improve my sites' speed.
| 1:25 pm on Dec 19, 2013 (gmt 0)|
Yah, we did that. For the main sites, we do better with W3, but it's a bear to configure.
| 8:45 pm on Dec 24, 2013 (gmt 0)|
|let's make an exception, in this thread only, and allow mention of tools that can be used |
Besides the tool mentioned by member22 himself, I heavily use Zoompf. They have a free checking tool (it allows you 3 tests per day) that gives you a very good overview of where you can improve your site (coding or server).
Hope mentioning of this tool is allowed (not affiliated), otherwise please delete/moderate this first paragraph.
From my experience, switching on server-side compression helps the most, followed by optimizing images and minifying JS and CSS code.
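For example, on Apache with mod_deflate enabled, switching on compression looks roughly like this (the directives are standard Apache, but check them against your server's version and loaded modules before relying on this):

```apache
<IfModule mod_deflate.c>
  # Compress the text-based types that benefit most.
  # Images and other already-compressed formats gain nothing and waste CPU.
  AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
</IfModule>
```

On nginx the equivalent would be the `gzip` directives; either way, a waterfall tool should show the transferred sizes drop immediately.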
| 2:26 pm on Dec 25, 2013 (gmt 0)|
I've used GTMetrix
One heavy day of working with that got all my pages loading 20-30% faster, and we seem to have been doing better in the SEs since then too.