
Dense HTML/Slow Page Speed

How Do I Compress?

     
4:19 am on Sep 28, 2018 (gmt 0)

New User

joined:Sept 28, 2018
posts: 26
votes: 1

Hi Guys,

The site I work for was built 11 years ago and has pretty dense code.

Page speed is slow. I've been on the developers for two years about cleaning up the code, compressing images, and minifying JavaScript. Nothing has been done (they say their hands are tied).

Now the Mobile Page Speed update and Mobile-First Indexing have rolled out, and I lost 90 top-3 spots in the Google SERPs in one evening.

I think it's the page speed. Does anyone have experience cleaning up/minimizing code to increase page speed?

Thanks in advance.
4:30 am on Sept 28, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:12913
votes: 893


Hello Gregorich_SEO and welcome to WebmasterWorld [webmasterworld.com]

Read the accompanying articles in the report; methods are discussed there.
4:40 am on Sept 28, 2018 (gmt 0)

New User

joined:Sept 28, 2018
posts: 26
votes: 1


@keyplyr

Thanks! Has my first post followed all the rules of posting?
4:51 am on Sept 28, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15884
votes: 875


pretty dense code
Does “pretty dense” mean that a lot of stuff is being done in HTML that ought to be done in CSS instead? Does it mean that the whole thing is built on hideously convoluted php that takes ages to execute and drags down the server? Does it mean that there are vast numbers of large supporting files that could be reduced to half as many at a quarter the size?

Come clean now. Details. We’ve seen worse. (Sometimes on our own sites. Urk.)
4:51 am on Sept 28, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:12913
votes: 893


If you are not already doing so, you should add compression for the various file types.

Here's a helpful guide: [gtmetrix.com...]
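
To make that concrete, here's a minimal sketch of what enabling text compression can look like in an Apache .htaccess file, assuming your host has mod_deflate available (check with them first; the MIME types listed are common choices, not a complete set):

```apache
# Compress text-based responses; JPEG/PNG images are already compressed,
# so only SVG is worth including among image types.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE application/javascript application/json
  AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>
```

You can verify it's working by looking for a Content-Encoding: gzip response header in your browser's developer tools.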
5:13 am on Sept 28, 2018 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:4515
votes: 350


Hi Gregorich_SEO, your post looks fine to me. The welcome link that keyplyr posted gives some tips on using the forums - it's a big place and not everything is obvious at first glance.

Is this a static HTML site that you work for? If it has not been changed in 11 years, that puts it very much behind the curve for doing well in the Mobile-First index. If the developers' hands are tied for altering the code, there isn't much you can do to improve mobile-friendly layouts and page speed. The owner (or owners) needs to know that Google does not keep two separate indexes, so if the site isn't mobile friendly it won't compete very well.

As keyplyr mentioned, the reports in the Page Speed tests tell you which issues need work and link to explanations of how to fix them. I would share those reports with the people you work for so they can see the causes of the lower rankings.
6:52 am on Sept 28, 2018 (gmt 0)

New User

joined:Sept 28, 2018
posts: 26
votes: 1


@lucy24

Haha. Ok, I'll come clean. I actually don't know. I'm very inexperienced with coding. I've grown our site from 82K to 343K new monthly visitors in the past two years with on-page optimization, linkbuilding, fixing basic technical errors and matching user search intent with fairly good content.

But now that we've had our first big SEO attack it's time to dig deeper.

Would it help if I share a link so you can view page source? I'll put spaces so it doesn't trigger any spam filters :)

<snip>

@not2easy

We are mobile-friendly (responsive), thank God. However, we still seem to be on the wrong side of mobile-first indexing (or at least the Google Medic update). Our page speed is definitely a hindrance (looking into other causes too).

Ran a test on WebPageTest and found all the images we need to compress, as well as some JavaScript that's slowing us down. Our First Byte Time is currently a Fail. Told Dev tonight. We'll see what they say...

Thanks everyone for any advice you can lend!

[edited by: engine at 8:44 am (utc) on Sep 28, 2018]
[edit reason] Please see WebmasterWorld TOS [/edit]

6:53 am on Sept 28, 2018 (gmt 0)

New User

joined:Sept 28, 2018
posts: 26
votes: 1


Whoops, the site is called <snip>

[edited by: engine at 8:44 am (utc) on Sep 28, 2018]
[edit reason] Please see WebmasterWorld TOS [/edit]

8:14 am on Sept 28, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


Would it help if I share a link

Yes but mostly no, because that's actually against the posting guidelines [webmasterworld.com], part of the Welcome to WebmasterWorld [webmasterworld.com] page that keyplyr linked to, so expect your post to be redacted soon.

I think it's the page speed.

I don't think it's page speed. Unless you're really slow, you're not going to drop 3 spots just for not being as fast as perhaps you could be. That doesn't mean it's not worth improving, though, and the biggest benefit to the experience of page speed is usually a reduction of the time-to-first-byte (TTFB), so I would focus on that first. Magento is notoriously slow. PageSpeed Insights can give you an indication of real-world page load distributions.

Just for kicks, use the Developer Tools to throttle your connection to Fast 3G, disable the cache, then reload the page and see if you're happy with how fast things are loading ;-)
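
If you want a number to track rather than a feeling, here is a rough sketch (my own illustration, not a tool robzilla mentioned) that approximates TTFB with Python's standard library; the host and path you pass in are placeholders for your own pages:

```python
# Rough TTFB sketch: time from sending the request until the status
# line and headers of the response have arrived back at the client.
import http.client
import time

def ttfb_seconds(host, path="/", port=80):
    conn = http.client.HTTPConnection(host, port, timeout=10)
    try:
        start = time.perf_counter()
        conn.request("GET", path)
        resp = conn.getresponse()   # returns once the first bytes arrive
        elapsed = time.perf_counter() - start
        resp.read()                 # drain the body before closing
        return elapsed
    finally:
        conn.close()
```

Run it a few times against a landing page (e.g. `ttfb_seconds("www.example.com", "/category/")`); if a warm server is consistently slow to the first byte, the bottleneck is the backend, not the page weight.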

[edited by: robzilla at 8:23 am (utc) on Sep 28, 2018]

8:17 am on Sept 28, 2018 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Sept 13, 2018
posts:355
votes: 71


First thought: too many external files.

Second thought: if you want to analyze your pages to find out how to optimize them, use the Lighthouse tool in Chrome's DevTools panel (the tab called "Audits"). It will give you plenty of information about what can be done to improve page load.

Speed can also depend on your backend server, in the case of dynamically generated pages.
4:37 pm on Sept 28, 2018 (gmt 0)

New User

joined:Sept 28, 2018
posts: 26
votes: 1


@justpassing Exactly! But we do have a "Slow" score from PageSpeed Insights for multiple landing pages that used to rank for high-volume keywords. Apparently, the Page Speed update only affects extremely slow sites. I have a bad feeling that we fall into that bracket.

That said, the #1 lesson from Google Medic seems to be, naturally, 'create relevant content', so I'm planning an overhaul of metadata and content too. Our strategy was to add as much content as possible to our category pages (we're ecomm), targeting our main keyword. Not sure that's going to fly now. All content should be helpful, not just optimized for a keyword. Reducing target keyword density, using the keyword less in H2s, creating helpful headers that let the reader scan straight to what they're looking for, and making sure every sentence matters might help. Also making the title tags more focused: some are long and stuffed with multiple keywords rather than short and precise.

Thank you for your insight!

@justpassing

Digging into your suggestions now! Thanks! Will follow up.
8:27 pm on Sept 28, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 25, 2005
posts:2091
votes: 370


Poor page speed is one of the quickest ways of creating a poor user experience, because it's visible even before your site is. Clearly, the site was built primarily with functionality in mind, or "developer ease" if you like (of course they'll say their "hands are tied", it's going to be a hassle to change things). If you're the SEO consultant here, I think you need to untie those hands and get everyone on the same page or you'll never get anything done.
8:37 pm on Sept 28, 2018 (gmt 0)

New User

joined:Sept 28, 2018
posts: 26
votes: 1


@robzilla

1000% agree. More and more that seems to be the case. And I feel I'm pulling teeth each time I share stats on our page speed with suggestions for optimization. But after this huge drop something HAS to happen.

Also looking into content-related fixes, as well.

And apparently someone blocked some elements with robots.txt that may be preventing Google from crawling our site (when I fetch and render, "what Googlebot sees" is blank). Joy. Getting that fixed today. If that's the cause, I'll follow up with better details.
9:28 pm on Sept 28, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15884
votes: 875


I feel I'm pulling teeth each time
Heh. Did the site owner think that the mere act of hiring an SEO consultant would magically lead to improvement, even if nobody has any intention of making any changes? I can't remember if you actually said you’re an SEO Guy, but it doesn’t seem as if you’re allowed to do anything, or compel others to do anything. I hope you weren’t just hired so they can flaunt you to the stockholders while taking no other action.
10:00 pm on Sept 28, 2018 (gmt 0)

New User

joined:Sept 28, 2018
posts: 26
votes: 1


@lucy24

So I was hired two years ago to write content and fix little errors (dup meta titles, 404s) in-house. But I took the opportunity to learn as much about SEO and Content Marketing as possible and we ended up growing a lot.

Traffic increased maybe 270% per year. Our little blog went from $1k a year to $120k in a year, too, which is why I'm super bummed that all the hard work might be for nothing after this drop.

I am SEO and Content Marketing Manager now. So my job is SEO. But the more I find out about SEO the more I realize how much more there is to learn (and adapt to constantly). I subscribed to SEJ, SER, and read everything Barry Schwartz writes. But still there are many what ifs, which is why I try to test different approaches. Just trying to be the best I can be at this stage. So all of your advice is very appreciated.

I've been on the page speed/HTML/JavaScript/image compression problem off and on for about two years, and little has been done except the new server, which helped page speed a little. Many of my other ideas do get the green light. I think it's because the code is just really hard to fix up, or maybe they don't know how.

I'll keep you updated if anything interesting comes up.
10:19 pm on Sept 28, 2018 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:4515
votes: 350


If you use the Fetch as Google tool, two suggestions: Use the Mobile bot and select Fetch and Render (which it sounds like you've done). Then follow through to see what is being blocked. Keep notes on those files so you'll know what/how to allow them.

Since you've now mentioned it is at least partially a blog, I'm guessing the G-bots are not able to crawl your /wp-includes/ folder (guessing WP) to read the .css and .js files.
You can use
Disallow: /wp-includes/ 

and then allow:
Allow: /*.css
Allow: /*.js
so they still aren't welcome to dig through everything. Once you make changes to robots.txt, be sure to use the (old GSC) robots.txt tester to verify that previously blocked resources can now be read.

10:42 pm on Sept 28, 2018 (gmt 0)

New User

joined:Sept 28, 2018
posts: 26
votes: 1


@not2easy

Wow thanks!

I'll follow your advice and let you know what happens.

Making moves today!
2:48 pm on Sept 29, 2018 (gmt 0)

Preferred Member from CA 

Top Contributors Of The Month

joined:Feb 7, 2017
posts:575
votes: 59


Are you using a CMS like WordPress, or custom flat-file coding? Have you looked at your raw access log? It will quickly tell you if Google or another search engine is having a tough time indexing you.
4:58 am on Sept 30, 2018 (gmt 0)

New User

joined:Sept 28, 2018
posts: 26
votes: 1


@TorontoBoy

Magento!

I will ask Dev to look into it.

But I already found 4K (and growing) smartphone crawl errors in GSC. Mobile-first indexing is rolling out and we can't be crawled because someone blocked the wrong folders in robots.txt. Waiting on dev to remove the block, and then I'll fetch and pray.

Thanks for your help all the way from Toronto

Thomas
8:17 pm on Sept 30, 2018 (gmt 0)

Preferred Member from CA 

Top Contributors Of The Month

joined:Feb 7, 2017
posts:575
votes: 59


I have worked with Magento. The issues you need to check include:
-is the Magento core up to date?
-is your Magento theme up to date? If it is 11 years old and unsupported, that is a problem. Also check which versions of Magento your theme can run on. If your theme cannot take the most current version of Magento, then you cannot/should not update the core
-hacking and rehacking a Magento theme can become a dog's-breakfast mess.

I worked with a company whose Magento theme had been heavily hacked by an outsourced team. Sure, it worked, but if anything in the input was different, the system would do odd things.

Relying on Google Search Console to monitor your site is inferior to monitoring your actual raw access log. You do not need to be a programmer to do this, and monitoring this log would be a huge advantage to an SEO person. These logs are part of cPanel and external to the development environment. You should be able to get access to them and learn how to read them.
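
To make that concrete, here's a small sketch of the kind of thing an SEO person can do with a raw access log (the log format assumed is the common Apache "combined" format used by most cPanel hosts; the function name is my own, not part of any tool): tally the HTTP status codes Googlebot is getting back.

```python
# Count HTTP status codes for requests whose User-Agent mentions Googlebot,
# assuming the Apache/NCSA "combined" log format.
import re
from collections import Counter

COMBINED = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] '      # host, ident, user, [timestamp]
    r'"\S+ (?P<path>\S+)[^"]*" '    # request line: method, path, protocol
    r'(?P<status>\d{3}) \S+ '       # status code and bytes sent
    r'"[^"]*" "(?P<agent>[^"]*)"'   # referrer and user-agent
)

def googlebot_status_counts(lines):
    counts = Counter()
    for line in lines:
        m = COMBINED.match(line)
        if m and "Googlebot" in m.group("agent"):
            counts[m.group("status")] += 1
    return counts
```

A spike in 404s or 5xx codes here will show up days before it surfaces as crawl errors in GSC. (Note that anyone can claim to be Googlebot in the User-Agent header, so for serious analysis you'd also verify the IP.)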