Dang Hoang Nhu Nguyen

[BTY] Day 7 & 8: Use NVIDIA Docker Containers to deploy ML models

As you may know, the host's NVIDIA GPU is not exposed to your containers by default, so a little extra setup is needed before your models can talk to the GPU. Once that is in place, deploying your machine learning frameworks to production becomes easier and more effective.
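For instance, on a host that already has the NVIDIA driver and the NVIDIA Container Toolkit installed, you still have to request GPU access explicitly when starting a container (the equivalent of `docker run --gpus all`). Here is a minimal sketch using the Docker SDK for Python; the image name and command are placeholders for illustration:

```python
# Minimal sketch: launching a GPU-enabled container with the Docker SDK for Python.
# Assumes the NVIDIA driver and the NVIDIA Container Toolkit are installed on the host.
# "my-ml-api:latest" and "python serve.py" are placeholders.
import docker

client = docker.from_env()

container = client.containers.run(
    "my-ml-api:latest",
    command="python serve.py",
    detach=True,
    # Equivalent of `docker run --gpus all`: without this request,
    # the host GPU is not visible inside the container.
    device_requests=[
        docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])
    ],
)
print(container.logs().decode())
```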

Here are the main articles I went through while dockerizing my app with GPU support. Along the way I ran into plenty of problems that those articles did not cover, and I had to dig through discussion threads all over the internet to solve each one. The right fix mainly depends on your architecture, your ML framework, your host machine, ... I will publish a full write-up soon.

  1. How to Use the GPU within a Docker Container: link
  2. How to Use an NVIDIA GPU with Docker Containers: link
  3. Complete guide to building a Docker Image serving a Machine learning system in Production: link
  4. CUDA + Docker = ❤️ for Deep Learning: link

Deployment Environment:

  • Ubuntu: 20.04
  • Graphics card: RTX 3090
  • Python: 3.7
  • PyTorch: 1.8
  • CUDA: 11.1
  • cuDNN: 8.2.1
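
Once the container is running, a quick sanity check like the sketch below (assuming PyTorch is installed inside the image) confirms that the versions above line up and that the GPU is actually visible to the framework:

```python
# Sanity check inside the container: verify that PyTorch sees the GPU
# and that the CUDA/cuDNN versions match the environment listed above.
import torch

print("PyTorch:", torch.__version__)             # expect 1.8.x
print("CUDA (build):", torch.version.cuda)       # expect 11.1
print("cuDNN:", torch.backends.cudnn.version())  # integer form, e.g. 8201 for 8.2.1
print("GPU available:", torch.cuda.is_available())

if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))  # expect the RTX 3090
```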

