
Jens Oliver Meiert

On the Need for Neutral Maintained Minifier Metrics

To improve production performance, we use minifiers. For HTML, there are several minifiers, most of which are capable of minifying CSS and JavaScript as well.
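
To illustrate, here's a minimal sketch using html-minifier-terser, one of the minifiers mentioned below; the sample markup and the chosen options are just examples, and other tools expose similar capabilities under different names and interfaces:

```ts
// Minify an HTML snippet, including its embedded CSS and JavaScript.
import { minify } from "html-minifier-terser";

const input = `
  <!-- promo banner -->
  <div class="banner">
    <p style="color: red;">  Hello,   world!  </p>
    <script>
      const greeting = "Hello, world!";
      console.log(greeting);
    </script>
  </div>
`;

const output = await minify(input, {
  collapseWhitespace: true, // drop insignificant whitespace
  removeComments: true,     // strip HTML comments
  minifyCSS: true,          // minify embedded CSS
  minifyJS: true,           // minify embedded JavaScript
});

console.log(`${input.length} → ${output.length} characters`);
```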

While we can measure how effective and fast these minifiers are, there are two problems:

  1. Minifiers aren’t all written in the same languages and don’t all offer the same interfaces, so they don’t all work the same way. That makes them tricky to compare 1:1 (a thin adapter layer, sketched after this list, can help normalize that).

  2. Interestingly, pretty much every minifier provides metrics on how it compares (see the original HTML Minifier, HTML Minifier Terser, HTML Minifier Next, htmlnano, minify-html, &c.), but these metrics are often inaccurate: even assuming no bias, they’re not always based on the latest minifier versions and their latest capabilities.
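
To make problem 1 concrete, one way to compare tools with different interfaces 1:1 is a thin adapter layer. The sketch below is an assumption, not any project’s actual API: only the html-minifier-terser call is real; the `MinifierAdapter` shape and the sample input are made up for illustration.

```ts
// Hypothetical adapter interface; the name and shape are illustrative.
import { minify as htmlMinifierTerser } from "html-minifier-terser";

interface MinifierAdapter {
  name: string;
  minify(html: string): Promise<string>;
}

// One concrete adapter wrapping html-minifier-terser; adapters for
// CLI- or Rust-based tools would shell out or use native bindings instead.
const adapters: MinifierAdapter[] = [
  {
    name: "html-minifier-terser",
    minify: (html) =>
      htmlMinifierTerser(html, { collapseWhitespace: true, removeComments: true }),
  },
  // …one adapter per minifier under test.
];

// Once every tool sits behind the same interface, it can be fed the
// same input and measured the same way.
const sample = '<p title="demo">  Hello,   world!  </p><!-- note -->';
for (const adapter of adapters) {
  console.log(adapter.name, (await adapter.minify(sample)).length, "characters");
}
```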

What you can tell from this is that problem 1 is a general challenge (one that has led Jens to several reviews of and improvements to the benchmarks shared with HMN, and that Kirill knows from his benchmarks overview as well), and that problem 2 is an actual problem.

Problem 2 matters because unmaintained benchmarks don’t offer an accurate picture of the minifier landscape. Accordingly, they don’t actually enable you to make good decisions.

The Call for Neutral and Maintained Minifier Benchmarks

What we believe we need is this:

  • A neutral steward—person or organization—who regularly publishes updated minifier benchmarks.

    • Minifier maintainers could support this steward by informing them about new features. (Assuming automated dependency management, that’s not necessary for general minifier updates.)
  • The benchmarks should

    • be open and include as many HTML minifiers as possible
    • include a large and diverse range of sites to be tested
    • cover minification results and processing time (see the sketch after this list)
    • in the case of HTML minifiers, consider comparing both HTML minification and “full” minification (enabling all other minification options these minifiers come with)
    • be updated at least every month
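
As a rough illustration of what such a run could look like, here's a hypothetical harness sketch; the page list, the sample directory, and the two option sets are assumptions, and a real setup would cover far more sites and put every minifier under test behind a common adapter:

```ts
// Hypothetical benchmark loop: measure size reduction and processing
// time per page and per configuration. Only html-minifier-terser is
// wired up here; other minifiers would be added behind adapters.
import { readFile } from "node:fs/promises";
import { minify } from "html-minifier-terser";

// Assumed local copies of the pages to be tested.
const pages = ["samples/page-1.html", "samples/page-2.html"];

// “HTML-only” minification vs. “full” minification (embedded CSS/JS too).
const modes = {
  "html-only": { collapseWhitespace: true, removeComments: true },
  "full": {
    collapseWhitespace: true,
    removeComments: true,
    minifyCSS: true,
    minifyJS: true,
  },
};

for (const page of pages) {
  const html = await readFile(page, "utf8");
  for (const [mode, options] of Object.entries(modes)) {
    const start = performance.now();
    const minified = await minify(html, options);
    const ms = performance.now() - start;
    const saved = (100 * (1 - minified.length / html.length)).toFixed(2);
    console.log(`${page} · ${mode}: ${saved}% smaller, ${ms.toFixed(1)} ms`);
  }
}
```

Generated on a schedule (say, a monthly CI job) and published as a table, output like this would cover both minification results and processing time for every tool and configuration.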

This is probably not all: As a community, we can all contribute to making this process better.

The point is that, as developers and minifier providers, we believe neutral, maintained minifier metrics are necessary, especially now that we’re in the great position of having several maintained minifiers that, excitingly, also apply different philosophies.

Even if minifier maintainers do their best to provide unbiased, up-to-date comparisons, having a steward who takes care of this could provide a lot more value to all of us.

Would this steward be you? Comment here or contact Jens or Kirill so that we’re aware and can help you get something started!
