Calvin Tran

How do I deploy my machine learning model on a local Kubernetes cluster?



As data scientists / data engineers / data wizards, we don't want to spend much of our time and effort setting up a development environment.

With the help of Kind (Kubernetes in Docker), we can run a single-node Kubernetes cluster inside Docker and even deploy our model into it.

Why Kind?

Yes, why Kind, when we have other options such as minikube, microk8s, and kubeadm?

  1. Kind uses Docker, while minikube requires VirtualBox, which consumes more resources.

  2. Kind can run on macOS, Ubuntu, other Linux distributions, and Windows. Basically, if you have Docker, you can run Kind, while microk8s only supports Ubuntu.

  3. With Kind, we get a cluster with a single command.

Installing Kind locally

Kind provides stable binaries, which are the easiest way to run Kind on a local machine from the CLI. According to the docs, we have to download the binary (for example, release v0.5.1) and place it on our PATH:

$ curl -Lo ./kind https://github.com/kubernetes-sigs/kind/releases/download/v0.5.1/kind-$(uname)-amd64
$ chmod +x ./kind
$ mv ./kind /usr/local/bin/kind

Set up kubectl

We also need the Kubernetes command-line tool, kubectl; the installation steps can be found in the official documentation.

Make sure kubectl is working:

$ kubectl version

Once Kind is installed, we create a cluster and point our environment at its kubeconfig:

$ kind create cluster
$ export KUBECONFIG=$(kind get kubeconfig-path)
$ kubectl cluster-info

Hooray! Everything is ready. Let's jump to deploying our first model.

Deploy iris model

I always use the iris model as a "Hello World" example for every project. The deployment.yaml manifest has been prepared:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: iris-app
  labels:
    app: iris-app
    tier: backend
    version: v1
spec:
  selector:
    matchLabels:
      app: iris-app
  replicas: 2
  template:
    metadata:
      labels:
        app: iris-app
    spec:
      containers:
      - name: iris-app
        image: canhtran/iris-svm-model-api:latest
        ports:
        - containerPort: 5000

Apply deployment to the cluster.

$ kubectl apply -f deployment.yaml

deployment.apps/iris-app created

I create a service to expose the deployment.

$ kubectl expose deployment iris-app --type=LoadBalancer --name=iris-svc

service/iris-svc exposed
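
The kubectl expose command generates the Service object for us. As a sketch, an equivalent declarative manifest (a hypothetical iris-svc.yaml, assuming the app: iris-app label from the deployment) would look like this:

```yaml
# iris-svc.yaml (hypothetical) -- declarative equivalent of `kubectl expose`
apiVersion: v1
kind: Service
metadata:
  name: iris-svc
spec:
  type: LoadBalancer
  selector:
    app: iris-app     # route traffic to the deployment's pods
  ports:
  - port: 5000        # port the service listens on
    targetPort: 5000  # containerPort of the iris-app pods
```

Applying this file with kubectl apply -f iris-svc.yaml would produce the same service as the expose command.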

Finally, we forward the port to our local machine for testing.

$ kubectl port-forward svc/iris-svc 5000:5000

Forwarding from 127.0.0.1:5000 -> 5000
Forwarding from [::1]:5000 -> 5000

Test with curl command.

$ curl -i -X POST -H "Content-Type:application/json" http://localhost:5000/iris/predict -d '{"payload":[6.2, 3.4, 5.4, 2.3]}'

HTTP/1.1 200 OK
Server: gunicorn/19.5.0
Date: Mon, 02 Sep 2019 02:51:04 GMT
Connection: close
Content-Type: application/json
Content-Length: 13


The service is up and running. Everything works perfectly.
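
For programmatic use, the same request can be made from Python. Here is a minimal stdlib-only sketch; the URL assumes the port-forward above is running, and the build_request / predict helper names are my own:

```python
import json
from urllib.request import Request, urlopen

# Endpoint exposed by `kubectl port-forward svc/iris-svc 5000:5000`
API_URL = "http://localhost:5000/iris/predict"

def build_request(features, url=API_URL):
    """Build a POST request with the same JSON body as the curl example."""
    body = json.dumps({"payload": features}).encode("utf-8")
    return Request(url, data=body, headers={"Content-Type": "application/json"})

def predict(features):
    """Send a feature vector to the service and decode the JSON response."""
    with urlopen(build_request(features)) as resp:
        return json.load(resp)
```

Calling predict([6.2, 3.4, 5.4, 2.3]) then mirrors the curl invocation above.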


In my opinion, Kind (Kubernetes in Docker) is the best alternative so far compared with minikube and microk8s. With a few commands, we're able to have an environment in which to deploy our models.
