Yhary Arias

Docker Offload and the advantages of using it in AI projects

Docker Offload is a new Docker feature released on July 10, 2025. It lets us offload container execution to the cloud, taking advantage of remote hardware without configuring complex environments.

Docker describes it as a way to run containers “as if they were local, but running on another machine,” all controlled from your own docker run, just as we have always done.

How can this be an advantage in an AI project?
We know that AI projects usually need:

  • Powerful GPUs to train or run models.
  • Lots and lots of RAM (otherwise the machine shuts down 😬).
  • Scalability and efficiency in model testing.

With Docker Offload, you can:

  • Use your local machine to develop while running models on more powerful infrastructure.
  • Accelerate ML pipelines without manually setting up servers.
  • Test huge models without crashing your machine.

Let's get to the point! here are the advantages:

  • Simplicity: no need to set up complex clusters or remote servers.
  • Scalability: you can easily leverage cloud or external servers.
  • Portability: you keep using the usual Docker ecosystem.
  • Ideal for AI testing and development without overloading your machine 😎

Let's get to the good stuff, the practice:

🚀 Step by step: basic Docker Offload configuration.

Before you start, you need to have installed:

  • Docker Desktop up to date (I recommend uninstalling the app and reinstalling it, just to be safe).
  • Docker Offload enabled (requires Docker account with beta program access).
  1. Enable Docker Offload

In Docker Desktop:

  • Go to Settings > Features in development
  • Enable the Docker Offload option
  • Restart Docker Desktop
  2. Let's follow the "quickstart" from the official Docker documentation
  • According to the documentation as of today (07/15/2025), open Docker Desktop and sign in.
  • Now open a terminal on your machine (any terminal works) and run the following command:

$ docker offload start

The CLI will show a confirmation prompt. Hit Enter if the account you are going to use is correct. Then it will ask if you need GPU support; for this example we will say “yes”.

We are now ready to use Docker Offload! 🚀

Docker Offload will run on an instance with an NVIDIA L4 GPU, which is useful for machine learning or resource intensive workloads.
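To confirm the GPU is actually visible from inside a container, you can run nvidia-smi from any image that ships it. This is a sketch; the CUDA image tag below is an assumption, so swap in whichever tag you prefer:

```shell
# Sketch: check that the remote NVIDIA L4 GPU is visible inside a container.
# The CUDA image tag is an assumption; any image bundling nvidia-smi works.
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```

If Offload is active, the nvidia-smi table you see describes the cloud instance's GPU, not your laptop's.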

Now, if you go to Docker Desktop, you will see a cloud icon ☁️ in the panel header; this indicates that you have successfully enabled Docker Offload.

To check the status of Docker Offload, run the following command:

$ docker offload status
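If you script your workflow, you can use the status command to guard heavy jobs. This is a sketch: the exact wording of the docker offload status output may vary between Docker Desktop versions, so the grep pattern below is an assumption:

```shell
# Sketch: only launch the GPU job when Offload is active.
# The "grep -qi running" pattern is an assumption about the status output.
if docker offload status | grep -qi "running"; then
  docker run --rm --gpus all hello-world
else
  echo "Offload is not active; run: docker offload start"
fi
```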

Finally, let's run a container with Docker Offload.

The documentation suggests the following example:

$ docker run --rm --gpus all hello-world

If everything is working, the command prints the usual hello-world message, but the container actually ran on the remote instance.

We are doing quite well 🥳 The following example is optional; it shows how Offload works with another command, creating a container in a simple way:

$ docker run --platform linux/amd64 python:3.11-slim python -c "print('Hello from Docker Offload!')"
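As a slightly more AI-flavored variant, you could check whether PyTorch sees the GPU on the remote instance. A sketch; the pytorch/pytorch image tag is an assumption, and any image with torch installed would do:

```shell
# Sketch: verify CUDA availability from PyTorch on the Offload instance.
# The image tag is an assumption; it should print "True" when a GPU is attached.
docker run --rm --gpus all pytorch/pytorch:2.3.1-cuda12.1-cudnn8-runtime \
  python -c "import torch; print(torch.cuda.is_available())"
```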

Wait ‼️ Don't go away: we need to stop Docker Offload to avoid unnecessary cloud resource consumption and return to a local environment. Run the following command:

$ docker offload stop
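Putting the whole session together, a minimal wrapper script could look like this. It is only a sketch assuming the commands behave as documented above; note that docker offload start may prompt interactively for the account and GPU support:

```shell
#!/bin/sh
# Sketch of a full Offload session: start, run the job, always stop.
set -e
docker offload start
# Stop Offload even if the job fails, so we don't keep consuming cloud resources.
trap 'docker offload stop' EXIT
docker run --rm --gpus all hello-world
```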

🛠️ And that's it!
I hope this Docker Offload journey has saved you CPU, RAM... and headaches! 🧠💻

Love you guys, thanks for making it this far. Bye.
[Yhary Arias / @ia.fania]
