Olivier Revial for Stack Labs

Native images with Micronaut and GraalVM

After a Micronaut introduction, we saw in a previous article how to create a simple and powerful CLI application using Micronaut and Picocli, but our application was too slow for a CLI... it's time to take it to the next level by adding GraalVM native compilation support to this app and run a very (very) fast CLI application 🏎 !

Understanding native compilation

If you are used to the JVM environment, you might be wondering what native compilation is and how it differs from JVM compilation. Let's forget Micronaut for a while and focus on this.

JVM compilation

When we compile a JVM application (say a Java or Kotlin app), we usually package it as a JAR (Java ARchive), which is made of "bytecode", our well-known ".class" files. In order to run a Java application we have to run it on a Java Virtual Machine, i.e. a virtual machine that has been specifically developed to interpret our Java bytecode. The great thing about the JVM is that JVMs exist for all systems (Unix, Mac, Windows), meaning that as a developer you only need to care about compiling your app into a jar, and the JVM will then run it, no matter which system you're on. In short, the JVM idea comes from this 1995 slogan:

Write once, run anywhere

You can write and compile your application code on your Windows machine (using the Windows-compatible Java Development Kit) and deploy your jar to your Debian server (using the Unix-compatible Java Runtime Environment).

So far so good, isn't it ? Well, if it's working so well, why on earth do we need another compilation system for our apps ? As always, things are not black and white. JVM interoperability is great but comes at a cost: the cost of adding a layer between the operating system and the application... yes, you guessed it, the JVM itself. This cost is paid in terms of memory footprint and startup time. It is due to the weight of the JVM itself, but also to the fact that most JVM applications heavily rely on runtime reflection, making the application startup slower and slower (and the memory footprint bigger and bigger) as the codebase grows.
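
ℹ️ To make the reflection point a bit more concrete, here is a minimal made-up sketch of the kind of reflective lookup many JVM frameworks perform at startup: the class is only known by its name as a String, so everything is resolved and inspected at runtime, which is precisely what an ahead-of-time compiler cannot see on its own.

import java.lang.reflect.Method;

// Minimal runtime reflection example: the class name is just a String,
// so nothing is known at compile time and everything is resolved at startup.
class ReflectionDemo {
    public static void main(String[] args) throws Exception {
        Class<?> clazz = Class.forName("java.util.ArrayList");
        Object list = clazz.getDeclaredConstructor().newInstance();

        Method add = clazz.getMethod("add", Object.class);
        add.invoke(list, "hello");

        System.out.println(list); // prints [hello]
    }
}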

Native compilation

Behind this mysterious "native compilation" lies a quite simple principle: you translate your source program into native code - i.e. code composed of instructions directly recognized by your CPU - using tools such as an assembler or a linker. These are exactly the same tools used to build the "native applications" that run on your computer. The native code is therefore tied to a particular family of processors (and thus a family of operating systems) sharing the same instruction set.
The C language is a very good example of native compilation: when you run gcc, you compile your code into an .exe on Windows or an executable file on Unix systems:

> gcc -o helloworld helloworld.c
> ls -lah                                                                    

.rwxr-xr-x   12k olivier 23 May 19:39  helloworld

Although the interoperability problem might be a concern, native compilation offers various advantages:

  • applications will start much faster
  • applications will have a much lower memory footprint

Also note that interoperability becomes less of a problem with containers, as we are now able to compile native code inside a container (with a specific operating system) and run the native image in a very lightweight version of the same operating system (only basic system libs, such as file or network libs, are needed).

Is the difference so big ?

Well, let's test it with this Hello World application:

class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello World !");
    }
}

Let's first compile it and run it with the JVM:

> javac HelloWorld.java
> time "java HelloWorld"

Benchmark #1: java HelloWorld
  Time (mean ± σ):      92.9 ms ±   9.0 ms    [User: 94.4 ms, System: 23.2 ms]
  Range (min … max):    81.5 ms … 122.1 ms    29 runs

And now let's do the same compiling with GraalVM (more on that later):

> native-image HelloWorld
> time "./helloworld"

Benchmark #1: ./helloworld
  Time (mean ± σ):       5.8 ms ±  14.6 ms    [User: 1.6 ms, System: 1.4 ms]
  Range (min … max):     2.3 ms …  86.8 ms    33 runs

Yes, even the simplest Hello World starts more than 10 times faster when compiled natively 🏎 !

ℹ️ I used hyperfine instead of time here so that the commands are benchmarked over multiple runs

Do I need to run fast (and small) ?

Right now you may be asking yourself whether you really need an app that runs in tens of milliseconds instead of a second.

I'll give the easy answer: it depends on your use case. If you are only deploying a few microservices a few times a month on big servers, you most probably don't care much about the speed, memory footprint or size of your application. On the other hand, there are a few use cases where native images can be useful or even essential:

  • when deploying the app as a serverless function, time is of the essence as you want to reduce cold starts as much as possible.
  • when developing a CLI application (e.g. with Picocli), a very fast startup time is mandatory if you want to keep a good and enjoyable user experience.
  • when you care about the size of your app and its memory footprint. This is particularly true when deploying dozens of microservices packaged into containers onto a small cluster: the smaller the better. Native images, as they do not need a full JVM to run, are not packaged in ~300MB JVM containers but rather in ~50MB containers.
  • when security is a concern. When it comes to the security of your containerized applications, the less the better. As native images require fewer libraries to run (basically they just need a very basic system with only networking libs and such) instead of a heavy JVM, the attack surface is reduced.

Micronaut and GraalVM

Now that we know what native image compilation is, let's see how it works on a real application. In this article we will reuse the weather-cli application from the previous article and add native compilation support to it, to see the differences between a JVM application and a native binary.

⁉️ What ?

The app I'm talking about is a weather CLI application that uses Micronaut and Picocli and interacts with Weatherbit.io API to provide the following commands:

  • weather command will return the current weather for a given location (city and country)
  • forecast command will return a forecast for the next 1 to 16 days for a given location (city and country)

❓ Why Micronaut ?

Strictly speaking, we don't need a specific framework to use GraalVM and compile an application down to a native binary. However Micronaut provides powerful features and out-of-the-box support for native compilation ! That's the great thing about Micronaut: it doesn't use runtime reflection or proxies, making it an excellent candidate for compiling native images and running very (very) fast microservices !
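
ℹ️ To illustrate what "no runtime reflection" means in practice, here is a minimal sketch (not taken from the weather-cli code, the class names are made up, and it assumes the javax.inject annotations used by Micronaut at the time of writing) of beans wired through constructor injection: Micronaut's annotation processor generates the bean definitions at compile time, so no classpath scanning or reflective lookup happens at startup.

import javax.inject.Singleton;

// A plain service bean: Micronaut generates its BeanDefinition at compile time.
@Singleton
class GreetingService {
    String greet(String name) {
        return "Hello " + name + " !";
    }
}

// Another bean receiving its dependency through the constructor,
// resolved by generated code rather than by runtime reflection.
@Singleton
class GreetingPrinter {
    private final GreetingService service;

    GreetingPrinter(GreetingService service) {
        this.service = service;
    }

    void print(String name) {
        System.out.println(service.greet(name));
    }
}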

Repository

Code sample for this article can be found in this Github repository.

Configuring our application for native compilation

GraalVM configuration

As mentioned earlier, we will reuse an existing (Java) Micronaut app and add native image support to it. Note however that if we were to create our native app from scratch, we could use the Micronaut mn command to generate our app skeleton and pre-configure our build with the Picocli and GraalVM dependencies, as mentioned in our Micronaut 101 article:

> mn create-cli-app weather-cli --features graal-native-image,http-client

The resulting build.gradle build file would have been similar to our previous build file except for the added GraalVM dependency.

➡️ Let's add the GraalVM dependency and the Micronaut-specific Graal support to our build.gradle (in the dependencies block):

dependencies {
...
    annotationProcessor "io.micronaut:micronaut-graal"
    compileOnly "org.graalvm.nativeimage:svm"
...
}

In reality, adding the graal-native-image feature to the Micronaut command-line generation does more than just add dependencies to the build: it also adds some configuration files needed for native compilation.

There is one file we need to add to our project because GraalVM needs it and will try to pick it up: native-image.properties.

➡️ Add a new file in a new directory under src/main/resources/META-INF/native-image/{package.name}/{application-name}/native-image.properties.

Note: In our case the file must be at the path src/main/resources/META-INF/native-image/weather.cli/weather-cli-application/native-image.properties.

➡️ Add the following content to this file:

Args = -H:IncludeResources=logback.xml|application.yml|bootstrap.yml \
       -H:Name=weather-cli \
       -H:Class=weather.cli.WeatherCliCommand

Basically we tell GraalVM to include some resource files in our compiled application, and we tell it where to find the main class once the app has been compiled.
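
ℹ️ The -H:IncludeResources part matters because, in a native image, classpath resource lookups can only find files that were baked into the binary at build time. As a small sketch (the resource name is just an example), code like the following will only print "found" in the native binary if application.yml was listed in IncludeResources:

import java.io.InputStream;

class ResourceCheck {
    public static void main(String[] args) throws Exception {
        // In a native image this lookup returns null unless application.yml
        // was included at build time via -H:IncludeResources.
        try (InputStream config = ResourceCheck.class.getResourceAsStream("/application.yml")) {
            System.out.println(config != null ? "application.yml found" : "application.yml missing");
        }
    }
}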

Alright, that's pretty much it for GraalVM configuration as far as Micronaut is concerned !

Picocli native configuration

Although our Micronaut/GraalVM configuration is correct, if we tried to compile our app using native-image right now, it wouldn't compile (or it would crash at runtime) because Picocli uses reflection at its core, and reflection only works in a native image for classes that have been declared ahead of time. In other words, we need a way to tell the native-image tool which classes our Picocli application accesses through reflection, so that the tool includes them in the native binary.

GraalVM is able to read a file named reflect-config.json - placed under the same META-INF directory as native-image.properties - that tells it which classes in our app are usually loaded through reflection, so that GraalVM includes them at compile time and knows which fields and methods it should keep.

The file for our app basically looks like this:

[ {
  "name" : "weather.cli.commands.ForecastSubcommand",
  "methods" : [ {
    "name" : "setNbDays",
    "parameterTypes" : [ "int" ]
  }, {
    "name" : "run",
    "parameterTypes" : [ ]
  } ]
}, {
  "name" : "weather.cli.commands.LocalizedCommand",
  "fields" : [ {
    "name" : "spec"
  }, {
    "name" : "spec"
  }, {
    "name" : "spec"
  } ],
  "methods" : [ {
    "name" : "setCountry",
    "parameterTypes" : [ "java.lang.String" ]
  }, {
    "name" : "setCity",
    "parameterTypes" : [ "java.lang.String" ]
  }, {
    "name" : "run",
    "parameterTypes" : [ ]
  } ]
}, {
  "name" : "weather.cli.WeatherApplication",
  "methods" : [ {
    "name" : "main",
    "parameterTypes" : [ "java.lang.String[]" ]
  }, {
    "name" : "run",
    "parameterTypes" : [ ]
  } ]
}, {
  "name" : "weather.cli.commands.CurrentWeatherSubcommand",
  "methods" : [ {
    "name" : "run",
    "parameterTypes" : [ ]
  } ]
} ]

😱 Beautiful isn't it ?

Well of course if you are as lazy as I am, you are now sweating while realizing this file should be updated each time you add a Picocli-related class !

But don't worry, Picocli provides a tool named picocli-codegen that automatically generates this reflection configuration file as part of our project build. Told you Picocli was awesome 🙃

➡️ Let's add Picocli annotation processor to our build.gradle, inside the dependencies block:

annotationProcessor 'info.picocli:picocli-codegen:4.0.3'

➡️ We can now repackage our app:

> ./gradlew assemble

...
Note: ReflectConfigGen writing to: CLASS_OUTPUT/META-INF/native-image/picocli-generated/reflect-config.json
Note: ResourceConfigGen writing to: CLASS_OUTPUT/META-INF/native-image/picocli-generated/resource-config.json
Note: ProxyConfigGen writing to: CLASS_OUTPUT/META-INF/native-image/picocli-generated/proxy-config.json

Phew, Picocli just generated the reflect-config.json file for us ! What it did is basically add our Picocli commands and subcommands as classes needed for native compilation.

So, are we ready to compile our app down to a native image ? Well, actually we still have two problems.

The first one resides in the fact that dependency injection will only be added to the reflection config when using constructor injection or public fields. In other words, our private injected WeatherAPIClient in CurrentWeatherSubcommand and ForecastSubcommand won't work once the app is compiled natively.

@Inject
private WeatherAPIClient weatherAPIClient;

We have two choices to fix this: either make this field public or use constructor injection. Although I would always recommend constructor injection, Picocli subcommands require a no-args constructor, so we can't do that here.

➡️ For simplicity, let's do it the ugly way (remember to do this in both subcommands!):

@Inject
public WeatherAPIClient weatherAPIClient;

ℹ️ The proper way of doing this would be to use a custom factory so we can call a constructor that takes arguments, but I'd rather keep the code simple here.
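
ℹ️ For the curious, here is a rough sketch of what such a factory could look like (the class name MicronautBackedFactory is made up; Micronaut's Picocli integration actually ships a similar MicronautFactory used by PicocliRunner under the hood). Picocli asks the factory for command instances, and we delegate to the Micronaut context so constructor injection becomes possible:

import io.micronaut.context.ApplicationContext;
import picocli.CommandLine;

// Hypothetical factory: Picocli asks it for (sub)command instances and we
// delegate their creation to the Micronaut context, so that constructor
// injection works instead of public @Inject fields.
public class MicronautBackedFactory implements CommandLine.IFactory {

    private final ApplicationContext context;

    public MicronautBackedFactory(ApplicationContext context) {
        this.context = context;
    }

    @Override
    public <K> K create(Class<K> cls) throws Exception {
        if (context.containsBean(cls)) {
            return context.getBean(cls); // managed bean: constructor injection applies
        }
        return CommandLine.defaultFactory().create(cls); // fallback for non-bean classes
    }
}

Such a factory would then be passed when building the command line, e.g. new CommandLine(WeatherCliCommand.class, new MicronautBackedFactory(context)).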

Our last problem resides in our API mapping beans, whose constructors and methods won't be found by Jackson after native compilation, as in the example below:

Type [weather.cli.api.WeatherAPIClient$Intercepted] executed with error: Error decoding HTTP response body: Error decoding stream for type [class weather.cli.api.WeatherResponse]: Cannot construct instance of `weather.cli.api.WeatherResponse` (no Creators, like default constructor, exist): cannot deserialize from Object value (no delegate- or property-based Creator)

➡️ To solve this problem all we need to do is annotate our four POJOs with the @Introspected annotation:

@Introspected
public class ForecastResponse {
    // ...
}

➡️ Repeat the operation for the four beans: ForecastObservation, ForecastResponse, WeatherObservation, WeatherResponse.
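
ℹ️ For reference, one of these beans might look like the minimal sketch below (the fields and JSON property names are made up for illustration, the real classes map the actual Weatherbit payload): @Introspected makes Micronaut generate introspection metadata at compile time, which is then used to bind the JSON response without runtime reflection.

import com.fasterxml.jackson.annotation.JsonProperty;
import io.micronaut.core.annotation.Introspected;

// Illustrative only: the fields shown are not the actual weather-cli model.
@Introspected
public class WeatherResponse {

    @JsonProperty("temp")
    private double temperature;

    @JsonProperty("wind_spd")
    private double windSpeed;

    public double getTemperature() { return temperature; }
    public void setTemperature(double temperature) { this.temperature = temperature; }

    public double getWindSpeed() { return windSpeed; }
    public void setWindSpeed(double windSpeed) { this.windSpeed = windSpeed; }
}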

Phew, our configuration is now ready: we can launch the native compilation.

Native compilation

Native compilation is done in two steps:

➡️ First build the uber-jar as usual:

> ./gradlew assemble

➡️ Now launch native compilation using GraalVM native-image tool:

> native-image --no-server --class-path build/libs/weather-cli-0.1-all.jar                                           
[weather-cli:23973]    classlist:   6,031.94 ms,  1.52 GB
[weather-cli:23973]        (cap):   2,442.65 ms,  1.52 GB
[weather-cli:23973]        setup:   4,441.13 ms,  1.52 GB
[weather-cli:23973]   (typeflow):  67,023.02 ms, 10.51 GB
[weather-cli:23973]    (objects):  47,327.83 ms, 10.51 GB
[weather-cli:23973]   (features):   4,982.65 ms, 10.51 GB
[weather-cli:23973]     analysis: 123,953.65 ms, 10.51 GB
[weather-cli:23973]     (clinit):   1,745.52 ms, 10.71 GB
[weather-cli:23973]     universe:   5,704.32 ms, 10.71 GB
[weather-cli:23973]      (parse):  11,405.65 ms, 10.71 GB
[weather-cli:23973]     (inline):  19,854.90 ms, 11.61 GB
[weather-cli:23973]    (compile): 102,471.28 ms, 11.61 GB
[weather-cli:23973]      compile: 137,496.58 ms, 11.61 GB
[weather-cli:23973]        image:   9,296.04 ms, 11.61 GB
[weather-cli:23973]        write:   3,119.01 ms, 11.61 GB
[weather-cli:23973]      [total]: 290,640.34 ms, 11.61 GB

As you can see, native compilation takes quite a while to complete (almost 5 minutes on my machine), but if everything goes well you should see an output similar to the one above !

A few notes:

  • We run the command with --no-server because we don't need server-based image building
  • We pass our built jar as a build input using --class-path argument
  • We could have used Docker to build our native image instead of the native-image tool (Micronaut provides a docker-build.sh script) but I wanted to show the "hard-way" so you know what is actually going on 😉

And now, the best part of this article: time to run the binary app !

First let's verify that we have a binary:

> ls -lah

...
.rwxr-xr-x   50M olivier 23 Mar 18:38  weather-cli*

As you can see, this 50MB file is our executable, standalone binary application that needs no other dependency to run (SubstrateVM is already included in this binary). Pretty cool isn't it ?

➡️ Time to run it:

# Run the app and ask for Montreal weather
> ./weather-cli --country CA --city montreal

Established active environments: [cli]
Asking weather for city montreal and country ca
=====================================
Current weather in montreal:
  - temperature: 1.9°C
  - wind speed: 11.268 km/h
  - cloud coverage: 100.0%
  - precipitation: 0.0 mm/h
=====================================

# Ask for Paris 3-days forecast
> ./weather-cli forecast --city Paris --days 3

Established active environments: [cli]
Asking forecast for city paris and country fr
Using default city paris...
Using default country fr...
=====================================
Forecast in paris on day 2020-03-23:
  - average temperature: 7.8°C
  - min temperature: 4.2°C
  - max temperature: 10.8°C
  - wind speed: 12.081204 km/h
  - probability of precipitation: 0.0%
  - average cloud coverage: 0.0%
Forecast in paris on day 2020-03-24:
  - average temperature: 5.1°C
  - min temperature: 0.5°C
  - max temperature: 10.8°C
  - wind speed: 8.629308 km/h
  - probability of precipitation: 0.0%
  - average cloud coverage: 0.0%
Forecast in paris on day 2020-03-25:
  - average temperature: 4.0°C
  - min temperature: 0.1°C
  - max temperature: 8.7°C
  - wind speed: 9.142452 km/h
  - probability of precipitation: 0.0%
  - average cloud coverage: 0.0%
=====================================

Of course the results are no different from the previous outputs, but it's always good to see that our app actually works when launched as a standalone binary !

Performance

Time to measure execution time and compare it to our previous JVM execution !

🥁🥁🥁 Drumroll please 🥁🥁🥁

> time "./weather-cli --help"

Benchmark #1: ./weather-cli --help
  Time (mean ± σ):      39.5 ms ±   5.8 ms    [User: 16.1 ms, System: 15.7 ms]
  Range (min … max):    32.0 ms …  57.8 ms    67 runs

Wow, our response time just dropped from ~2 seconds down to an average of 40ms. Pretty awesome right ?

ℹ️ Again, I used hyperfine instead of time to benchmark the help command over multiple runs

Conclusion

In this article we saw the differences between JVM and native compilation. We implemented a real-world use-case to see how to actually use native compilation to compile an application down to a native image.

The main takeaway is that an application's startup time and memory footprint can be dramatically improved by using native compilation instead of traditional JVM compilation, but also that it comes at a certain cost: not everything can be done the way it was when simply compiling for the JVM, and even when possible it can sometimes require an extra configuration step before the app can be run natively.

And the great thing is: frameworks and libraries such as Micronaut and Picocli remove the need for handling reflection configuration manually by providing a set of tools that automatically generate the appropriate reflection files !

...beyond Micronaut

Micronaut is not the only framework to propose a simplified way of writing complex apps that can be compiled down to native images with GraalVM. Here are a few examples of frameworks that fully or partially support GraalVM native compilation:


References:

I can only recommend the great Micronaut documentation that provides most of the information you need. You can basically find it in two places:
