This is a common surprise when people first convert their WP or other site to static.
If I run my original site and my static site through “Online pagespeed tool XYZ”, the static site's score is lower - what's up with that?
I’ve written on this a few times, but I need somewhere on the new forum to point people to, rather than writing the same email over and over.
A 100% pagespeed score is very achievable with any static site.
However, they’re just numbers: a perfect 100 on a page speed test means little if your site still loads slower than you’d like, or only serves quickly near the region of its datacenter.
By going static, you open yourself up to easily distributing your whole site over a globally fast CDN, such as AWS’ CloudFront, Cloudflare, or others. You’ll generally find their server response times much faster than your original server’s. Where you host your DNS matters too; AWS’ Route 53 or Cloudflare are again very fast choices.
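You can compare server response times yourself with curl’s built-in timing variables; the URL below is just a placeholder for your own site:

```shell
# One request, reporting DNS lookup, connect, time-to-first-byte, and total time.
# Replace example.com with your own domain.
curl -o /dev/null -s \
  -w "dns: %{time_namelookup}s  connect: %{time_connect}s  ttfb: %{time_starttransfer}s  total: %{time_total}s\n" \
  https://example.com/
```

Run it against both the old dynamic host and the new CDN endpoint and the TTFB difference is usually obvious.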
So, the new static hosts mentioned above are going to have faster server response times and TTFB (time to first byte) - so why the lower page score?
The complaints these pagespeed tools most commonly raise about my newly-static sites are:
- cache control (missing Cache-Control/max-age headers, or the outdated Expires header)
- compression (e.g. gzip, Brotli)
Compression is usually easy to fix: enable gzip, Brotli, or another supported type on your host. Cache control may require whitelisting headers to be passed through from S3 in the case of CloudFront, or overriding them with a Lambda@Edge function.
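As a sketch of the S3 side of this (the bucket name and object key are hypothetical), the aws CLI can stamp a Cache-Control header onto an already-uploaded object by copying it over itself, and curl can show what actually gets served:

```shell
# Re-copy the object in place, replacing its metadata with a Cache-Control header.
# "my-bucket" and "index.html" are placeholders for your own bucket/key.
aws s3 cp s3://my-bucket/index.html s3://my-bucket/index.html \
  --metadata-directive REPLACE \
  --cache-control "public, max-age=300" \
  --content-type "text/html"

# Confirm compression and cache headers as served through the CDN:
curl -sI -H "Accept-Encoding: gzip, br" https://example.com/ \
  | grep -iE "content-encoding|cache-control"
```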
This is where I think page speed tools are a bit misleading - for the cache control part, at least.
Cache-control headers are mostly there to stop a browser from downloading the same content twice within a certain amount of time.
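For reference, two typical Cache-Control values and what they ask of the browser (illustrative, not a recommendation for every site):

```
Cache-Control: public, max-age=31536000, immutable   # fingerprinted assets: safe to cache for a year
Cache-Control: public, max-age=0, must-revalidate    # HTML pages: always check back with the server
```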
What I mainly care about (your needs will vary) is the time it takes for a new visitor to see my site’s homepage or a sub-page. On a first visit, caching never comes into play, so I want the page ready to serve from the fastest server closest to that visitor, wherever they may be in the world.
Another aspect of this is serving the smallest amount of data. Working to optimize a 1MB homepage to serve faster is somewhat polishing a turd. How much content can the human mind take in within one second? And is your content really not convincing enough without a bunch of custom fonts and stock line-art (blegh!)?
No matter how fast your server and how close it is to the user, it’s still limited by their home/office/mobile bandwidth.
Your first impression is the one where speed matters most; caching subsequent pageloads is a nice optimization, but less critical (for my use case, at least). Once you’re serving massive traffic on your static site, effective caching will also help save costs.
So, my advice to those wanting perfect page speed scores:
- do it, it’s fun
- actually check what each demerit point is talking about and fix it
- reduce your total pageload size, then reduce it some more
- remember that 1MB of static site is still 1MB for a single page load, just as it was for the dynamic site - don’t expect that to magically halve. Do expect your server response times to improve significantly, along with global speeds, when using an edge hosting provider like CloudFront or Cloudflare
- caching is hard and you need to make your own decisions that suit your site
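To put a number on the “reduce your total pageload size” advice, curl can report the byte count of the HTML document itself (assets are separate requests; the URL is a placeholder):

```shell
# Bytes downloaded for the document alone - a quick baseline before optimizing.
curl -so /dev/null -w "%{size_download} bytes\n" https://example.com/
```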
Last point - handling massive traffic
What these page speed tools don’t illustrate at all is how your server responds when more than one request is being made to it. We’re often testing sites while they’re in development or newly launched, with little traffic.
One of the main benefits of a static site is that it will handle massive amounts of concurrent traffic. In the case of CloudFront/Cloudflare, anything short of a serious DDoS effort should cause no change in pageload speeds for any of your users.
Try running wrk, JMeter, or another load-testing tool against your original WordPress site and see how many concurrent users it can handle, and what requests per second it can sustain, before it crashes. It’s usually a very pathetic number compared to the static version of your site - there, you’ll usually exhaust the network and CPU capacity of the machine(s) running the test before the site itself slows down its responses.
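A minimal wrk invocation looks like this - the flags are wrk’s standard thread/connection/duration options, and the URL is a placeholder (only ever load-test sites you own):

```shell
# 4 threads, 100 open connections, sustained for 30 seconds.
wrk -t4 -c100 -d30s https://example.com/
```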