Written by Ramón Saquete
Advances in web development technology allow us to build implementations that run ever faster. However, websites are becoming increasingly slow, due to heavier development frameworks and libraries, and to designs that aim to be more visually striking, often without considering the consequences for WPO (Web Performance Optimisation).
Let’s look at how all this is affecting the web ecosystem, and at what we can do to remedy it and noticeably improve our WPO.
Here we can see how certain metrics like FCP (First Contentful Paint) and the total download time have gradually grown over the past few years on mobile:
On desktop, on the other hand, the download time has decreased, probably due to an improvement in transfer speed.
Even though pages take longer to be displayed and downloaded on mobile, the time elapsed until the content can be interacted with, that is, the TTI (Time To Interactive), has slightly improved. However, as we will see below, we can’t attribute this improvement to pages being better optimised, so it’s more likely due to browser optimisations and more powerful CPUs:
To verify what I said in the previous paragraph, the following chart shows the evolution of the total weight of pages (Y axis: 1K equals 1,000 kilobytes), and we can see that the average weight of a web page is approaching 2MB. And while it’s true that we have increasingly fast and reliable transfer speeds, thanks to 4G LTE and the upcoming 5G technologies, we shouldn’t bank on them: we won’t always have the best signal, nor the optimal conditions to reach the maximum transfer speed of the technology in use.
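As a rough sanity check on why a 2MB page is a problem on mobile, we can estimate the raw transfer time at different effective bandwidths (the bandwidth figures below are illustrative assumptions, not measurements):

```python
# Rough transfer-time estimate: time (s) = size (bits) / bandwidth (bits/s).
# Effective mobile speeds are usually far below a technology's theoretical maximum.

def transfer_seconds(size_mb: float, mbps: float) -> float:
    """Seconds to transfer size_mb megabytes at mbps megabits per second."""
    return size_mb * 8 / mbps

page_mb = 2.0  # the average page weight discussed above

for label, mbps in [("Slow 3G (~1.6 Mbps)", 1.6), ("4G LTE (~12 Mbps)", 12.0)]:
    print(f"{label}: {transfer_seconds(page_mb, mbps):.1f} s")
```

This ignores latency, parsing and rendering time, so real load times are higher still.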
We can also see that websites for mobile devices have caught up with desktop ones in terms of weight, due to the adoption of responsive design from 2011 onwards.
If we take a look at the following chart, which shows the total number of requests, we can see that they increased to around 75 and then stabilised, even decreasing slightly later on. So the increase in size in the previous chart doesn’t mean pages have become richer; rather, they use heavier resources: unoptimised development frameworks, several font downloads just to display four icons, full-screen-width images, and videos that don’t contribute relevant information to the user.
CSS frameworks are also problematic: a framework like Bootstrap weighs around 150KB (about 20KB compressed), roughly 10 times more than the CSS a page usually needs to display a responsive design. It might not seem like much to download, but it actually is, considering that the FCP metric depends on the time it takes to process this file:
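The compressed figure above comes from HTTP compression, and a quick stdlib sketch shows why framework CSS compresses so well (the CSS string here is a made-up, deliberately repetitive example):

```python
import gzip

# Utility frameworks contain many near-identical rules, which is why
# they compress so well over the wire.
css = "\n".join(f".col-{i} {{ float: left; width: {i}%; }}" for i in range(1, 101))

raw = css.encode("utf-8")
compressed = gzip.compress(raw)

print(len(raw), "bytes raw")
print(len(compressed), "bytes gzipped")
# The gzipped size is several times smaller, but the browser still has to
# decompress and parse the full stylesheet before it can paint (FCP).
```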
In the following chart, despite a data-recording error towards the end, we can see that image size is on the rise too:
How to measure the weight of static resources
Having lighter static resources doesn’t necessarily lead to better metrics: the code we load and what it does also matters. Nevertheless, there will always be a strong correlation between their weight and the metrics we get on Google PageSpeed Insights, which will affect our rankings.
With Google PageSpeed Insights’ redesign, Lighthouse, the tool that serves as its data source, has also been updated to a newer version. It now provides a breakdown of the size of a website’s static resources, which can be seen in the “Keep request counts low and transfer sizes small” section.
Another option is webpagetest.org. By clicking on the “Content Breakdown” link, after running an analysis, we get the following information:
Lighthouse’s command-line version also allows us to assign a performance budget to our static resources, that is, to establish a limit on the size of each type of resource in order to reach our loading-time goals.
Using the website https://www.performancebudget.io/ we can estimate how big our resources can be if the page is to load within a specific time limit on a specific connection type. With the data obtained, we can create a JSON file to set these limits in Lighthouse, or simply keep it as a reference.
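For reference, a Lighthouse budget file is a JSON array of budget objects, with sizes expressed in kilobytes. The figures below are illustrative, not recommendations (and the `path` property may require a recent Lighthouse version):

```json
[
  {
    "path": "/*",
    "resourceSizes": [
      { "resourceType": "script", "budget": 150 },
      { "resourceType": "stylesheet", "budget": 50 },
      { "resourceType": "image", "budget": 300 },
      { "resourceType": "font", "budget": 100 },
      { "resourceType": "total", "budget": 1000 }
    ],
    "resourceCounts": [
      { "resourceType": "total", "budget": 75 }
    ]
  }
]
```

It can then be passed to the command line with `lighthouse https://example.com --budget-path=budget.json`, and any budget overruns will be reported in the audit.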
How to reduce the weight of static resources
In general, besides removing everything that isn’t used, it is always advisable to prioritise the critical resources shown above the fold, so that what the user sees first is painted first; to cache resources in the browser using cache headers and a service worker; and to delay the download of non-critical resources. This is the so-called PRPL pattern (pronounced “purple”), which stands for:
- Push (or preload) critical resources.
- Render the initial route as soon as possible.
- Pre-cache remaining assets.
- Lazy-load and create remaining (non-critical) routes on demand.
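A minimal sketch of how those four steps can look in a page’s markup (file names here are hypothetical):

```html
<!-- 1. Push/preload critical resources -->
<link rel="preload" href="/css/critical.css" as="style">
<link rel="preload" href="/js/app.js" as="script">

<!-- 2. Render the initial route fast: inline only the critical CSS -->
<style>/* critical above-the-fold rules inlined here */</style>

<!-- 3. Pre-cache remaining assets with a service worker -->
<script>
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/sw.js');
  }
</script>

<!-- 4. Lazy-load non-critical resources on demand -->
<script src="/js/app.js" defer></script>
<img src="/img/below-the-fold.jpg" loading="lazy" alt="...">
```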
Another technique we can apply to all resources to speed up their download is a CDN, which prevents high latency, especially when our website serves audiences in various countries.
For example, with webpack we have plug-ins such as https://www.purgecss.com/ to remove unused CSS, and https://github.com/anthonygore/html-critical-webpack-plugin to detect critical CSS. Using them requires some setup, and generating the website’s HTML, or at least its main templates. However, the initial setup effort is worth it for the performance gains we can obtain. Their execution should be automated as much as possible to avoid slowing down the workflow, and the time dedicated to configuring the development environment must be included in the initial budget.
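As a sketch of what that setup involves, a webpack configuration using the PurgeCSS plug-in could look roughly like this (paths and options are assumptions; check the plugin’s documentation for the current API):

```javascript
// webpack.config.js — illustrative fragment, not a drop-in configuration.
const path = require('path');
const glob = require('glob');
const PurgecssPlugin = require('purgecss-webpack-plugin');

module.exports = {
  // ...entry, output, loaders, etc....
  plugins: [
    new PurgecssPlugin({
      // Scan the generated HTML/templates so that selectors actually
      // used in the markup are kept and the rest are removed.
      paths: glob.sync(path.join(__dirname, 'src/**/*.html')),
    }),
  ],
};
```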
How to reduce image weight
With regard to image optimisation, besides the recommendations given earlier, here’s a few additional techniques:
- Use the WebP format, supported by Chrome, to load images, which weigh less when saved in this format. To load a picture in different formats and let the browser choose the most appropriate one, we can use the <picture> and <source> elements. We could also use Microsoft Edge’s JPEG XR (.jxr) format and Safari’s JPEG 2000 (.jp2), although these two browsers don’t have a market share as large as Chrome’s. For example:
<picture>
  <source srcset="/example.webp" type="image/webp" />
  <source srcset="/example.jxr" type="image/vnd.ms-photo" />
  <source srcset="/example.jp2" type="image/jp2" />
  <img src="/example.png" alt="SEO text" />
</picture>
How to reduce font weight
We’ve dedicated a whole other article to font optimisation. I’ll just mention here that one of the best font optimisation techniques is subsetting, which consists of removing the characters of the font that aren’t used. To do this, we can use an online tool such as https://everythingfonts.com/subsetter, or a command-line one such as https://github.com/fonttools/fonttools.
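fonttools’ `pyftsubset` command accepts the characters to keep as a `--unicodes` list. As a small helper (my own sketch, not part of fonttools), we can derive that list from a sample of the site’s text:

```python
# Build the --unicodes argument for fontTools' pyftsubset from sample text.
def unicodes_arg(text: str) -> str:
    """Return a comma-separated, sorted list of U+XXXX codepoints used in text."""
    return ",".join(f"U+{cp:04X}" for cp in sorted({ord(c) for c in text}))

sample = "Example page text"
print(unicodes_arg(sample))
# The result can then be used on the command line, e.g.:
#   pyftsubset font.ttf --unicodes=<list> --output-file=font.subset.ttf
```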
Of course we should also remove unused fonts and font variants.