
Monkeybruiser

My Experience With Site Optimisation

I know site speed is important, like, really important, so I thought I'd better dive a little further into achieving the ideal time-to-first-byte (TTFB) and aiming for 100/100 on PageSpeed Insights. My conclusion is a mathematical one.

Website Optimisation = Head + Brick Wall - Blood.

That might be a little dramatic but holy cow, can it get frustrating. Let’s start at the beginning * harp music plays *.

The Basics

For every website I build I have a Gulp workflow which takes a bunch of JS and CSS files and outputs two streamlined versions of each. One is fully optimised for production (all.min.js and style.css) and one is for debugging locally and in dev (all.js and style-source.css). The initial files are run through compressors, uglifiers and a few other Gulp tasks to get to their final state; nice and small!
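A minimal gulpfile.js along those lines might look like the sketch below. The plugin choices (gulp-concat, gulp-uglify, gulp-clean-css, gulp-rename) and the src/dist paths are assumptions for illustration, not necessarily my exact setup:

```javascript
const gulp = require('gulp');
const concat = require('gulp-concat');
const uglify = require('gulp-uglify');
const rename = require('gulp-rename');
const cleanCSS = require('gulp-clean-css');

// JS: concatenate everything into all.js (readable dev build),
// then uglify a copy into all.min.js (production build).
function scripts() {
  return gulp.src('src/js/**/*.js')
    .pipe(concat('all.js'))
    .pipe(gulp.dest('dist/js'))
    .pipe(uglify())
    .pipe(rename('all.min.js'))
    .pipe(gulp.dest('dist/js'));
}

// CSS: keep a readable style-source.css and a minified style.css.
function styles() {
  return gulp.src('src/css/**/*.css')
    .pipe(concat('style-source.css'))
    .pipe(gulp.dest('dist/css'))
    .pipe(cleanCSS())
    .pipe(rename('style.css'))
    .pipe(gulp.dest('dist/css'));
}

exports.default = gulp.parallel(scripts, styles);
```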

For images, I have a similar Gulp task that takes all files in the images folder within the repo and smushes them down. For WordPress websites I have WP Smush running to optimise images as they're uploaded to the CMS. Noice.
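The image task can be sketched with gulp-imagemin, which wraps lossless optimisers for PNG, JPEG, GIF and SVG; again, the plugin and paths here are illustrative assumptions:

```javascript
const gulp = require('gulp');
const imagemin = require('gulp-imagemin');

// Losslessly compress everything in the repo's images folder.
function images() {
  return gulp.src('images/**/*.{png,jpg,gif,svg}')
    .pipe(imagemin())
    .pipe(gulp.dest('dist/images'));
}

exports.images = images;
```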

Finally, I have a ready-to-go .htaccess file which enables things like caching, gzip and such.
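The caching and gzip parts of such an .htaccess typically lean on mod_deflate and mod_expires; the MIME types and durations below are illustrative, tune them per site:

```apacheconf
# Compress text assets with gzip (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript image/svg+xml
</IfModule>

# Far-future expiry headers for browser caching (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
</IfModule>
```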

Up until now I thought that was enough to get me close to a 100/100 PageSpeed Insights (PSI) score and within the 1 second TTFB target.
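As an aside, a quick way to check TTFB yourself from the command line (the URL is a placeholder):

```shell
# time_starttransfer is curl's measure of time-to-first-byte
curl -o /dev/null -s -w "TTFB: %{time_starttransfer}s\n" https://example.com/
```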

Negative.

Things to Consider

A few websites I have worked on recently have been hit hard by whatever changes have been made to the PSI algorithm (if any!), so the scores were coming out below 50/100. A little research into the causes was required, and thankfully the internet is full of helpful articles on page speed.

As it turns out, starting with the server is pretty darn important. If you have a slow host and your queries aren't optimal, or are run too many times, your TTFB will be huge. Using a plugin like Advanced Custom Fields is great for WordPress customisation but, depending on how you use it, retrieving the data from your database can be slow and cause issues. Figuring out how to streamline these queries would have been tough if it weren't so well documented.
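One commonly documented approach is caching expensive ACF lookups in a WordPress transient so they only hit the database once per hour rather than on every page load. This is a hypothetical sketch, not my actual code; the function and transient names are made up:

```php
<?php
function mb_get_page_fields() {
    $fields = get_transient( 'mb_page_fields_' . get_the_ID() );

    if ( false === $fields ) {
        // get_fields() pulls every ACF field for the post in one go,
        // instead of one round trip per get_field() call.
        $fields = get_fields( get_the_ID() );
        set_transient( 'mb_page_fields_' . get_the_ID(), $fields, HOUR_IN_SECONDS );
    }

    return $fields;
}
```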

I found that changing some loops had a positive effect whilst others actually increased the page load time. I guess it's a case of finding what works best on a per-site basis for this one. Not such a straightforward fix, but something that will come with time.

Note to self: Look into this further.

Moving on to the front end, images are a major source of pain. PSI really wants those images compressed down, despite my having run theme and uploaded images through ImageOptim multiple times. Thankfully, PSI lets you download its versions of the problem images for you to use. Not so thankfully, some of the images you get back are total garbage and utterly unusable, so be careful when selecting what to use and what to bin.

I’m not quite sure how the images can get smaller than what ImageOptim outputs; I can only guess it involves some small loss of quality. I’m not always willing to make that sacrifice, though, so you can keep your 100/100!

Browser caching and render-blocking files are next on the chopping block.
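For render-blocking resources, the usual fixes are deferring scripts and loading non-critical CSS asynchronously. A sketch of both, with placeholder paths (the media="print" swap is the well-known async-CSS trick):

```html
<!-- defer: download in parallel, execute after the document is parsed -->
<script src="/js/all.min.js" defer></script>

<!-- Load the stylesheet without blocking first paint, then apply it -->
<link rel="stylesheet" href="/css/style.css" media="print" onload="this.media='all'">
<noscript><link rel="stylesheet" href="/css/style.css"></noscript>
```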

Learn and move on

Taking the time to research what might and might not help with page load times and speed scores was well worth it. My aim is to get this website’s load time as far below 1s as I can, get the PSI score up to 90+, and then apply that knowledge to every site I’m involved with.

The main issue I’ve had is that the PSI score and load times appear to fluctuate for no apparent reason. I’m guessing it’s down to server load at the time of running the checks, which is difficult to counter when your only option is shared hosting.

Only time will tell if I’m successful with my targets or go bald from all the head scratching.

Some good references