WPO improvement: the importance of measuring and optimising the weight of static resources

Written by Ramón Saquete

Technological advances in web development allow us to build implementations that run faster and faster. However, websites are becoming increasingly slow, due to heavier development frameworks and libraries, and to designs that aim to be ever more visually striking, often without considering the consequences for WPO.

Let’s look at how all this is affecting the web ecosystem, and what we can do to remedy it and notably improve our WPO.

The importance of images, CSS and JavaScript

When we talk about static resources, we mean images, CSS, JavaScript and fonts, and it is their ever-growing size that is making websites slower and slower. This isn’t a personal assessment, but a conclusion drawn from CrUX (Chrome User Experience Report) data. For those not familiar with it, it’s a database of performance data that Google collects through the users of its browser and publishes openly every month. Let’s look at some charts from https://httparchive.org/reports, where this data is used together with historical Internet Archive data (the Wayback Machine’s open database):

Here we can see how certain metrics like FCP (First Contentful Paint) and the total download time have gradually grown over the past few years on mobile:

First Contentful Paint Chart


On desktop, on the other hand, the download time has decreased, probably due to an improvement in transfer speed.

Even though pages take longer to render and download on mobile, the time until the content can be interacted with, that is, the TTI (Time To Interactive), has improved slightly. However, as we will see below, we can’t attribute this improvement to pages being better optimised, so it’s more likely due to browser optimisations and more powerful CPUs:

Time To Interactive

To confirm what I said in the previous paragraph, the following chart shows the evolution of the total weight of pages (Y axis – 1K equals 1,000 kilobytes), where we can see that the average weight of web pages is approaching 2MB. And while it’s true that transfer speeds are getting better and more reliable, thanks to 4G LTE and the upcoming 5G technologies, we shouldn’t bank on them, as we won’t always have the best signal or the optimal conditions to reach the maximum transfer speed of the technology in use.

We can also see that websites for mobile devices have caught up with desktop ones in terms of weight, due to the adoption of responsive design from 2011 onwards.

Total kilobytes

If we look at the following chart, which shows the total number of requests, we can see that they rose to around 75, stabilised, and even decreased slightly later on. So the increase in size in the previous chart doesn’t mean websites are getting richer; rather, they use heavier resources, for example: unoptimised development frameworks, several fonts downloaded just to display four icons, full-screen-width images, and videos that don’t contribute relevant information to the user.

Total requests

The size of the JavaScript is also an important factor, not only because of its download time, but also because this code has to be compiled and run, consuming CPU and memory, which affects the page metrics related to interaction and painting. In the chart below, we see that the average size has reached 400KB, but it’s not difficult to find pages surpassing this figure by far, reaching one megabyte or even more, because, as we can see, the coefficient of variation of the results has also grown:

JavaScript Bytes

This growth in the size of JavaScript is almost certainly due to the widespread use of frameworks with ever more features, most of which we probably never take advantage of. The problem arises when there isn’t enough budget to invest the time needed to remove unused code or defer its loading.

CSS frameworks are also problematic: a framework like Bootstrap weighs 150KB (around 20KB when compressed), roughly 10 times more than the CSS a page usually needs to display a responsive design. It might not look like a lot to download, but it actually is, considering that the FCP metric depends on the time it takes to process this file:

CSS bytes

In the following chart, despite a data collection error towards the end, we can see that image size is on the rise too:

Image bytes

How to measure the weight of static resources

Having lighter static resources doesn’t necessarily lead to better metrics: the code we load and what it does also matters. Nevertheless, there will always be a strong correlation between their weight and the metrics we get on Google PageSpeed Insights, which will affect our rankings.

With Google PageSpeed Insights’ redesign, Lighthouse – the tool that provides its data – has also been updated to a newer version. It now gives a breakdown of the size of a website’s static resources, which can be seen in the “Keep request counts low and transfer sizes small” section.

Lighthouse
In this example we can see that the JavaScript payload is considerably large, even larger than the total size of the images. This is commonplace nowadays, but it shouldn’t be.

Another option is webpagetest.org. By clicking on the “Content Breakdown” link, after running an analysis, we get the following information:

Content breakdown on webpagetest.org

Lighthouse’s command-line version also allows us to assign a performance budget to our static resources, that is, to set a limit on the size of each type of resource so that we can meet our loading time goals.

Using the website https://www.performancebudget.io/ we can estimate how big our resources should be in order to load a page within a given time limit over a given connection type. With the resulting figures we can create a JSON file to set these limits in Lighthouse, or simply keep them as a reference.
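To give an idea of what such a file looks like, here is a minimal sketch based on Lighthouse’s documented budget format; the figures, in kilobytes, are purely illustrative and should come from your own estimate:

[
  {
    "resourceSizes": [
      { "resourceType": "script", "budget": 150 },
      { "resourceType": "stylesheet", "budget": 30 },
      { "resourceType": "image", "budget": 300 },
      { "resourceType": "font", "budget": 75 },
      { "resourceType": "total", "budget": 700 }
    ],
    "resourceCounts": [
      { "resourceType": "total", "budget": 75 }
    ]
  }
]

At the time of writing, this file can be passed to the command-line tool with the --budget-path option, and the report will flag any resource type that exceeds its budget.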

How to reduce the weight of static resources

In general, besides removing everything that isn’t used, it is always recommended to prioritise the critical resources shown above the fold, so that what the user sees first is painted first; to cache resources in the browser using cache headers and a Service Worker (see the sketch after the list below); and to delay the download of non-critical resources. This is the so-called PRPL pattern (read as “purple”), which stands for:

  • Push (or preload) critical resources.
  • Render the initial route as soon as possible.
  • Pre-cache remaining assets.
  • Lazy-load and create remaining (non-critical) routes on demand.
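To give a rough idea of the pre-caching step, a Service Worker can store the critical static resources when it is installed and serve them from the cache on later visits. The following is only a minimal sketch with hypothetical file names; a real implementation also needs a strategy for updating the cache:

// sw.js – minimal pre-caching sketch (file names are hypothetical)
const CACHE_NAME = 'static-v1';
const PRECACHE_URLS = ['/css/main.css', '/js/app.js', '/fonts/main.woff2'];

self.addEventListener('install', event => {
  // Download and store the critical assets as soon as the Service Worker is installed.
  event.waitUntil(caches.open(CACHE_NAME).then(cache => cache.addAll(PRECACHE_URLS)));
});

self.addEventListener('fetch', event => {
  // Serve cached responses when available, falling back to the network.
  event.respondWith(caches.match(event.request).then(cached => cached || fetch(event.request)));
});

The worker would then be registered from the page with navigator.serviceWorker.register('/sw.js').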

Another technique we can apply to all resources to speed up their download is using a CDN, to avoid high latency, especially when our website targets audiences in several countries.

How to reduce the weight of CSS and JavaScript files

Ideally, we should use a module bundler like Webpack, or task runners like Grunt or Gulp, together with their corresponding extensions, to remove unused parts of the CSS, apply tree shaking to the JavaScript (tree shaking consists in automatically detecting which code is actually used and discarding the code that isn’t), extract the critical CSS, and minify and version the files. If our website is built with a CMS like WordPress, Prestashop or similar, we can also find various plug-ins that carry out these tasks, more or less successfully. In the case of WordPress, we have: LiteSpeed, WP-Rocket, Autoptimize Critical CSS…
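As a quick illustration of tree shaking, importing only the functions we use from an ES-module build of a library allows the bundler to discard the rest; lodash-es is used here merely as an example of a tree-shakeable package:

// Importing the whole library forces the bundler to ship all of it:
// import _ from 'lodash';
// Importing only what we need lets the bundler drop the unused code:
import { debounce } from 'lodash-es';

// Example: avoid running an expensive handler on every resize event.
window.addEventListener('resize', debounce(() => {
  console.log('layout recalculated');
}, 250));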

For example, with webpack we have plug-ins such as https://www.purgecss.com/ to remove unused CSS, and https://github.com/anthonygore/html-critical-webpack-plugin to detect critical CSS. Using them takes some setting up, and requires generating the website’s HTML, or at least its main templates. However, the initial setup effort is worth it for the performance gains we can obtain. Their execution should be automated as much as possible so as not to slow down the workflow, and the time dedicated to configuring the development environment must be included in the initial budget.
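By way of illustration, a webpack configuration wiring PurgeCSS into a production build could look roughly like the sketch below; the import style of purgecss-webpack-plugin and the paths vary between versions and projects:

// webpack.config.js – minimal sketch
const path = require('path');
const glob = require('glob');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
// Note: newer versions of the plugin export { PurgeCSSPlugin } as a named export.
const PurgeCSSPlugin = require('purgecss-webpack-plugin');

module.exports = {
  mode: 'production', // production mode enables minification and tree shaking
  entry: './src/index.js',
  module: {
    rules: [{ test: /\.css$/, use: [MiniCssExtractPlugin.loader, 'css-loader'] }],
  },
  plugins: [
    new MiniCssExtractPlugin({ filename: '[name].[contenthash].css' }), // versioned file names
    new PurgeCSSPlugin({
      // PurgeCSS scans these templates to decide which CSS selectors are actually used.
      paths: glob.sync(path.join(__dirname, 'src/**/*'), { nodir: true }),
    }),
  ],
};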

How to reduce image weight

With regard to image optimisation, besides the recommendations given earlier, here’s a few additional techniques:

  • Application of the lazy loading technique, which will soon be activated by default in Google Chrome and will be configurable for each image through the loading attribute, with the values “auto”, “eager” and “lazy”. For now, we should ideally detect whether this feature is available and, if it isn’t, implement lazy loading with the Intersection Observer JavaScript API (where supported); see the sketch after this list. The following URL includes a thorough explanation of how to do this: https://addyosmani.com/blog/lazy-loading/. When applying this technique we should always include a copy of our images inside <noscript></noscript> tags, to make sure they get indexed without any issues, as Google doesn’t always run JavaScript.
  • Use the WebP format supported by Chrome to load images, which weigh less when saved in this format. To serve a picture in different formats and let the browser choose the most appropriate one, we can use the <picture> and <source> elements. We could also add Microsoft Edge’s JPEG XR (.jxr) format and Safari’s JPEG 2000 (.jp2), although these two browsers don’t have a market share as big as Chrome’s. For example:
<picture>
  <source srcset="/example.webp" type="image/webp" />
  <source srcset="/example.jxr" type="image/vnd.ms-photo" />
  <source srcset="/example.jp2" type="image/jp2" />
  <img src="/example.png" alt="SEO text" />
</picture>
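Going back to the first point, a lazy loading fallback based on Intersection Observer could look something like the sketch below. It assumes a hypothetical markup in which each image carries its real URL in a data-src attribute and a lazy class:

// Use native lazy loading when the browser supports the loading attribute...
if ('loading' in HTMLImageElement.prototype) {
  document.querySelectorAll('img.lazy').forEach(img => {
    // The browser itself defers the download thanks to loading="lazy".
    img.src = img.dataset.src;
  });
} else if ('IntersectionObserver' in window) {
  // ...otherwise load each image only when it approaches the viewport.
  const observer = new IntersectionObserver((entries, obs) => {
    entries.forEach(entry => {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src;
        obs.unobserve(entry.target);
      }
    });
  });
  document.querySelectorAll('img.lazy').forEach(img => observer.observe(img));
}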

How to reduce font weight

We’ve dedicated a whole other article to font optimisation. Here I’ll just mention that one of the best font optimisation techniques is subsetting, which consists in removing the characters of the font that aren’t used. To do this, we can use an online tool like https://everythingfonts.com/subsetter or a command-line tool like https://github.com/fonttools/fonttools.

Of course we should also remove unused fonts and font variants.

Conclusion

Historically, when optimising a website, the most important thing has always been the TTFB of the HTML. But with the arrival of CSS and JavaScript frameworks and of increasingly heavy images, the new user-oriented metrics and Google’s Speed Update algorithm have made the space occupied by static resources as important as the TTFB, since it directly affects painting and interaction times. Don’t get me wrong, though: TTFB continues to be very important, because unless we use HTTP/2 Server Push, it determines when the rest of the resources can begin downloading. Basically, nowadays we have to optimise everything on a website.

Author: Ramón Saquete
Web developer at Human Level Communications online marketing agency. He's an expert in WPO, PHP development and MySQL databases.
