Building a Spring Boot 3 Native Image Container on Apple Silicon and ARM64-Based Systems

In this blog post I will show how to build a native image container for Spring Boot 3 apps with GraalVM on Apple silicon or an ARM64-based system (such as an Oracle Cloud Ampere processor VM).

In my previous blog post, I covered how to build a native image for Spring Boot 3 based applications.

I discussed the following two ways to build the native image:

  1. Using Buildpacks
  2. Using Native Image tools

We use the following command to build a Docker container with a native image using Buildpacks:

mvn -Pnative spring-boot:build-image

If you run the above command on an Apple silicon or Oracle Ampere processor VM (ARM64-based) system with the default configuration, you will see an error like the one below.

[INFO] Building image 'docker.io/library/demo:0.0.1-SNAPSHOT'
[INFO]
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 0%
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 8%
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 12%
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 19%
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 33%
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 41%
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 46%
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 58%
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 65%
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 72%
[INFO]  > Pulling builder image 'docker.io/paketobuildpacks/builder:tiny' 100%
[INFO]  > Pulled builder image 'paketobuildpacks/builder@sha256:2e3f8de582f878858b41fe62c3e891bdf242b48a36cfefd828a42ab40a039496'
[INFO]  > Pulling run image 'docker.io/paketobuildpacks/run:tiny-cnb' 0%
[INFO]  > Pulling run image 'docker.io/paketobuildpacks/run:tiny-cnb' 37%
[INFO]  > Pulling run image 'docker.io/paketobuildpacks/run:tiny-cnb' 100%
[INFO]  > Pulled run image 'paketobuildpacks/run@sha256:453f3298c5e8033e69d6250e2ed797fd10ae2e9493e7465733a164b4ea9343df'
[INFO]  > Executing lifecycle version v0.15.2
[INFO]  > Using build cache volume 'pack-cache-5cbe5692dbc4.build'
[INFO]
[INFO]  > Running creator
[INFO]     [creator]     exec /cnb/lifecycle/creator: exec format error
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 45.376 s
[INFO] Finished at: 2022-12-25T10:50:36Z
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.springframework.boot:spring-boot-maven-plugin:3.0.0:build-image (default-cli) on project demo: Execution default-cli of goal org.springframework.boot:spring-boot-maven-plugin:3.0.0:build-image failed: Builder lifecycle 'creator' failed with status code 1 -> [Help 1]

The reason for this error is that the default Paketo builder image used by the Spring Boot Maven plugin does not support the ARM64 architecture.
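The `exec format error` means the builder's binaries do not match your machine's architecture. You can quickly confirm what architecture your system reports:

```shell
# Print the machine hardware architecture.
# Apple silicon and Ampere systems typically report "arm64" or "aarch64";
# Intel/AMD systems report "x86_64".
uname -m
```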

To overcome this problem, we need to use a builder image that supports the ARM64 architecture when running the above command on an ARM64-based system.
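Before picking a builder, you can check which architectures an image actually publishes. A quick sketch, assuming Docker is installed and has registry access:

```shell
# Inspect the multi-platform manifest of the default Paketo builder.
# The output lists one entry per supported platform; for this image
# you will only see amd64, which explains the failure on ARM64.
docker manifest inspect docker.io/paketobuildpacks/builder:tiny \
  | grep '"architecture"'
```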

We can build our own image that supports the ARM64 architecture by following the instructions from that post.

If you want to use pre-built builder images from the community, there are two options.

If you want to use the dashaun image, change the pom.xml configuration as below and then run the Maven command.

<build>
  <plugins>
    ...
    <plugin>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
      <configuration>
        <image>
          <builder>dashaun/java-native-builder-arm64</builder>
        </image>
      </configuration>
    </plugin>
  </plugins>
</build>
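With the builder configured, the build and run steps look like this (the image name is taken from the build log of this demo project; port 8080 assumes a default Spring Boot web application):

```shell
# Build the native image container using the configured ARM64 builder.
mvn -Pnative spring-boot:build-image

# Run the resulting container locally.
docker run --rm -p 8080:8080 docker.io/library/demo:0.0.1-SNAPSHOT
```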

If you want to use the springdeveloper image, change the pom.xml configuration as below and then run the Maven command.

<build>
  <plugins>
    ...
    <plugin>
      <groupId>org.graalvm.buildtools</groupId>
      <artifactId>native-maven-plugin</artifactId>
    </plugin>
    <plugin>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
      <configuration>
        <image>
          <builder>springdeveloper/java-native-builder-arm64:7.37.0</builder>
        </image>
      </configuration>
    </plugin>
  </plugins>
</build>

Note: the springdeveloper image has no latest tag, so we have to use a specific image version.

Using a Multi-Arch Buildpack

If your team uses both AMD64 (x86-64) and Apple silicon based systems, or if you develop on Apple silicon but your build system runs on the AMD64 (x86-64) architecture, then we need to switch between buildpack images for the different architectures.
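For illustration, the kind of per-architecture switching a multi-arch builder lets you avoid could be sketched with Maven profiles. This is only a sketch: the profile ids and the `native.builder` property are hypothetical names, and the `os.arch` values reported by the JVM vary by platform (commonly `aarch64` on ARM64 and `amd64` on x86-64 Linux).

```xml
<!-- Sketch: pick a builder image per architecture via Maven profiles.
     Reference ${native.builder} in the plugin's <builder> element. -->
<profiles>
  <profile>
    <id>builder-arm64</id>
    <activation>
      <os><arch>aarch64</arch></os>
    </activation>
    <properties>
      <native.builder>dashaun/java-native-builder-arm64</native.builder>
    </properties>
  </profile>
  <profile>
    <id>builder-amd64</id>
    <activation>
      <os><arch>amd64</arch></os>
    </activation>
    <properties>
      <native.builder>paketobuildpacks/builder:tiny</native.builder>
    </properties>
  </profile>
</profiles>
```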

The developer dashaun recently published a multi-arch builder image that supports both the AMD64 (x86-64) and ARM64 architectures, so there is no need to switch between images or write any logic to select the image based on the processor architecture.

To use the multi-arch builder image, change the pom.xml configuration as below.

<build>
  <plugins>
    ...
    <plugin>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
      <configuration>
        <image>
          <builder>dashaun/builder-multiarch</builder>
        </image>
      </configuration>
    </plugin>
  </plugins>
</build>

Now you can run the Maven command to build the native image container without worrying about the underlying architecture.
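Alternatively, the builder can be overridden per invocation without touching pom.xml, using the Spring Boot Maven plugin's `spring-boot.build-image.builder` user property:

```shell
# Override the builder image on the command line instead of in pom.xml.
mvn -Pnative spring-boot:build-image \
  -Dspring-boot.build-image.builder=dashaun/builder-multiarch
```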

Issues

No ‘io.buildpacks.builder.metadata’ label found in image config labels

When I tried to build the native container image on an Apple silicon laptop with the dashaun/java-native-builder-arm64 image using Docker, I encountered the following issue after the image was pulled from Docker Hub.

Execution default-cli of goal org.springframework.boot:spring-boot-maven-plugin:3.0.0:build-image failed: No 'io.buildpacks.builder.metadata' label found in image config labels

I could not find the root cause of the problem, but once I switched from Docker to Podman the problem went away.
