Improve Page Speed With GZIP Or Brotli Compression
This article takes a look at HTTP text compression, a simple but effective way to reduce your page weight and improve page load time. We’ll also compare the two most common text compression algorithms, GZIP and Brotli.
Text compression algorithms reduce the size of text files. That way these files take less bandwidth to transfer and less disk space to store.
The HTTP protocol is used to transfer data between website servers and website visitors. Servers can send a compressed response to the browser and use the Content-Encoding HTTP header to indicate what compression algorithm has been used. The browser can then decompress the file in order to display the web page.
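A simplified request/response exchange might look like this (the host name and byte count are illustrative):

```
GET /index.html HTTP/1.1
Host: example.com
Accept-Encoding: gzip, deflate, br

HTTP/1.1 200 OK
Content-Type: text/html
Content-Encoding: gzip
Content-Length: 68000
```

The browser advertises what it can decompress, and the server labels the response body with the algorithm it actually used.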
The request waterfall below shows an example of GZIP compression in action. The first request, the HTML document, loads a 354 kilobyte file. However, thanks to GZIP compression, only 68 kilobytes are used for the transfer. As a result, the download (shown as the blue area of the request bar) only takes about 150 milliseconds.
The "Full Size" column shows the original file size, the "Size" column shows the amount of data that was transferred over the network.
GZIP has been around since 1992 and is very widely supported. However, the Brotli algorithm that was released in 2013 provides better compression and should be used whenever possible.
| Compression | File Size |
| ----------- | --------- |
| None        | 173 KB    |
| GZIP        | 61 KB     |
| Brotli      | 52 KB     |
GZIP achieves a file size reduction of 65%, but Brotli goes further and saves 70% of the original file size.
How well supported is Brotli by browsers? The Can I Use website estimates support at 96%. Notably, IE11, also released in 2013, does not support Brotli.
Luckily, browsers send an Accept-Encoding HTTP header with their requests. That way your server can use Brotli compression if the client supports it and fall back to GZIP compression when that’s not the case.
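On the server, this negotiation can be as simple as checking the header and preferring Brotli. A sketch of the idea (the chooseEncoding helper is hypothetical, and it ignores the quality values like "br;q=0.9" that real headers can contain):

```javascript
// Pick the best compression the client supports, preferring Brotli,
// then GZIP, then no compression at all. (Hypothetical helper, not a real API.)
function chooseEncoding(acceptEncoding = "") {
  const accepted = acceptEncoding.split(",").map((e) => e.trim().toLowerCase());
  if (accepted.includes("br")) return "br";
  if (accepted.includes("gzip")) return "gzip";
  return "identity";
}

console.log(chooseEncoding("gzip, deflate, br")); // "br"
console.log(chooseEncoding("gzip, deflate"));     // "gzip"
console.log(chooseEncoding(""));                  // "identity"
```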
What you need to do to set up text compression depends heavily on what your technical server setup looks like.
If you use a Content Delivery Network, compression is often applied automatically, or there is a configuration setting you can use.
If you use web server software like NGINX then you can apply text compression there.
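For example, an NGINX configuration along these lines enables GZIP, and Brotli if the third-party ngx_brotli module is installed (the MIME type list is illustrative):

```nginx
gzip on;
gzip_comp_level 6;
gzip_types text/plain text/css application/javascript application/json image/svg+xml;

# The brotli directives require the ngx_brotli module
brotli on;
brotli_comp_level 6;
brotli_types text/plain text/css application/javascript application/json image/svg+xml;
```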
Finally, you can apply compression directly in your application code, for example using the compression module in Node.JS.
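With Express, for instance, the compression middleware (installed via npm install compression) can be applied in a few lines; a minimal sketch:

```javascript
const express = require("express");
const compression = require("compression");

const app = express();
app.use(compression()); // compress compressible responses based on Accept-Encoding

app.get("/", (req, res) => res.send("<p>Hello, compressed world!</p>"));
app.listen(3000);
```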
To check the compression used for requests made when loading your website you can look at the relevant response headers and content sizes in different tools.
You can run a free website speed test and then check the Requests tab for more information on GZIP or Brotli compression.
To see the compression algorithm for all requests, use the Columns selector and toggle the Content Encoding option. Brotli shows up as "br" in the content encoding column.
In the Overview tab there’s also a "Compress text files" recommendation that automatically assesses whether text compression is used when loading your website.
Finally, Google’s Lighthouse tool contains an automatic text compression audit.
Lighthouse automatically picks up requests where compression could help and explains why it’s useful:
Text-based resources should be served with compression (gzip, deflate or brotli) to minimize total network bytes.
You can use the Network tab in Chrome DevTools to see what compression algorithm is used on your website and how much it helps reduce page weight.
1. Open DevTools by right-clicking on the page and selecting Inspect
2. Select the Network tab
3. Reload the page
Then, to show the compression algorithm, right-click on one of the column headers, select Response Headers, and then Content-Encoding.
By default, the DevTools Size column only shows the transferred (compressed) size. To also view the full uncompressed size you can:
1. Click the gear icon in the top right (the one under the "x", not the one to the left of it)
2. Enable Big request rows
With this setting enabled, the Size column shows the transferred size at the top and the full uncompressed size at the bottom.
Data compression reduces data volume, but it comes with one downside: compressing and decompressing data requires CPU processing time.
Compression algorithms provide a compression level option to balance this trade-off. For example, gzip provides compression levels between 1 and 9, where 1 is the fastest and 9 achieves the most compaction. The default setting is 6.
The compression level estimator provides a good visualization showing how different settings impact the overall compression ratio.
However, usually the default compression level will be good enough as increasing the level doesn’t result in massive savings.
No, not all data should be transferred using HTTP compression. Specifically, GZIP or Brotli don’t help when transferring images. That’s because image formats already contain compression algorithms that are optimized specifically for images.
Text compression is an essential part of page speed optimization. A fast website will deliver a better user experience and help you rank higher in Google, as the Core Web Vitals metrics are a Google ranking signal.
Working to improve your page speed? Try DebugBear for free to monitor your website performance over time and get targeted recommendations to optimize it.