Making my website faster
2024-01-13

Since I started running this site in 2011, I’ve adhered to some principles to make it fast, cheap, and privacy-respecting:
- As few third-party cross-domain requests as possible – ideally, none.
- No trackers (which tends to follow naturally from the above).
- Use JavaScript but don’t require it – the site should work just fine with it disabled, albeit with some features missing.
- No server-side code execution – everything is static files.
Out of respect for my readers who don’t have a fancy gigabit fiber connection, I test the website primarily on slow, high-latency connections – either a real one, or a 2G connection simulated with Firefox’s dev tools.
I was upgrading my httpd2 server software recently and was frustrated by how long the site took to load, despite the performance optimizations in its Rust code. Fixing this meant working at a much higher level of the stack – where the question isn’t how many instructions are executed or how much memory is allocated, but how much data is transferred, when, and in what order.
On a simulated 2G connection, load times dropped from 11.20 seconds to 3.44 seconds, and the total data transferred fell from about 630 kB to about 200 kB. This makes the site faster for everyone, whether you’re rocking gigabit fiber or struggling to get packets through.
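(As an aside: if you want to sanity-check transfer-size numbers like these outside the browser, here’s a minimal sketch of one way to do it with only the Python standard library. This is not how I measured – I used Firefox’s dev tools – and it ignores compression, caching, CSS sub-resources, and parallel fetches, so its numbers will differ from a browser’s; the URL is a placeholder.)

```python
# Sketch: fetch a page plus the sub-resources its HTML references,
# and report total bytes and wall-clock time. Standard library only.
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin


class AssetCollector(HTMLParser):
    """Collect URLs of sub-resources referenced by the page's HTML."""

    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("href"):
            self.assets.append(attrs["href"])


def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return resp.read()


def measure(page_url):
    start = time.monotonic()
    html = fetch(page_url)
    total = len(html)

    parser = AssetCollector()
    parser.feed(html.decode("utf-8", errors="replace"))
    for ref in parser.assets:
        try:
            total += len(fetch(urljoin(page_url, ref)))
        except OSError:
            # A broken asset link shouldn't abort the whole tally.
            pass

    elapsed = time.monotonic() - start
    print(f"{total / 1000:.0f} kB transferred in {elapsed:.2f} s")


measure("https://example.com/")  # placeholder URL
```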
In this post I’ll walk through how I analyzed the problem, and what changes I made to improve the site.