DEV Community

shah-angita for Platform Engineers


Serverless Workloads on Kubernetes with Knative

Kubernetes has become a standard tool for container orchestration, providing a set of primitives to run resilient, distributed applications. However, managing the underlying infrastructure can be time-consuming. The serverless paradigm helps users deploy applications without worrying about the infrastructure. Knative, a Kubernetes-based platform, provides components to deploy and manage serverless workloads, offering open-source Kubernetes integration, cloud-agnosticism, building blocks, and extensibility.

Knative Components

Knative features two main components: Eventing and Serving. Eventing routes events from producers to consumers, triggering serverless workloads in response. Serving deploys and manages those workloads on top of Kubernetes, handling quick rollout of new services, request-driven autoscaling (including scale-to-zero), and connections to other services and event sources.
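Concretely, a workload that Knative Serving manages is just an ordinary HTTP server packaged in a container. A minimal sketch in Python follows; the `PORT` environment variable is the one Knative injects into the container, while the handler and response text are purely illustrative:

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

def get_port(default=8080):
    """Knative injects the port the container must listen on via PORT."""
    return int(os.environ.get("PORT", default))

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello from Knative!\n")

def serve():
    # Knative starts and stops instances of this server based on traffic;
    # the container entrypoint would call serve().
    HTTPServer(("", get_port()), Handler).serve_forever()
```

Because the process only needs to honor `PORT` and speak HTTP, the same container image works unchanged under Knative's autoscaler.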

Deploying Serverless Workloads with Knative

To deploy a serverless workload on Knative, you must create a Service resource. This can be achieved using either the Knative CLI (kn) or the kubectl command line tool for applying YAML files to your Kubernetes cluster.

Using the Knative CLI

To use the Knative CLI, you need to install it first. You can download the latest version of the Knative CLI binary and set it up as follows:

wget https://github.com/knative/client/releases/download/knative-v1.8.1/kn-linux-amd64
mv kn-linux-amd64 kn
chmod +x kn
sudo cp kn /usr/local/bin/

Verify the installation by running:

kn version

Creating a Service Resource

Once you have the Knative CLI set up, you can create a Service resource using the following command:

kn service create <service-name> --image <image-name>

This command creates a new Service resource, along with an initial Revision and a Route. The Revision is an immutable, point-in-time snapshot of your workload's code and configuration, and the Route directs traffic to it.
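As a concrete example, the commands below create a service from the sample image published by the Knative project and then inspect the Revision and Route that were generated; the service name `hello` is arbitrary:

```shell
# Create a service from the Knative sample image (the name is arbitrary)
kn service create hello --image gcr.io/knative-samples/helloworld-go

# Inspect the Revision and Route that were created alongside the Service
kn revision list
kn route list
```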

Using kubectl

Alternatively, you can use kubectl to create a Service resource by applying a YAML file to your Kubernetes cluster. Here is an example YAML file:

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: <service-name>
spec:
  template:
    spec:
      containers:
      - image: <image-name>

Save the manifest as service.yaml and apply it with:

kubectl apply -f service.yaml
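The same manifest can carry additional configuration. For instance, Knative's autoscaler can be tuned through annotations on the revision template; in the sketch below the service name is a placeholder and the image is the Knative sample image:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    metadata:
      annotations:
        # Keep at least one replica warm; cap scale-out at ten replicas
        autoscaling.knative.dev/min-scale: "1"
        autoscaling.knative.dev/max-scale: "10"
    spec:
      containers:
      - image: gcr.io/knative-samples/helloworld-go
```

Setting min-scale to "1" trades some idle cost for avoiding cold starts, while max-scale bounds how far a traffic spike can scale the service out.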

Platform Engineering and Continuous Deployment

Knative can be integrated with continuous deployment tools like Argo CD to automate the deployment of serverless workloads. Argo CD follows the GitOps pattern, letting teams drive application updates and infrastructure changes from a single Git repository. This integration lets developers focus on the functionality of the service, abstracting away infrastructure concerns such as scaling and fault tolerance.
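One sketch of what that integration can look like is an Argo CD `Application` that points at a Git repository containing Knative Service manifests; the repository URL and path below are hypothetical:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: serverless-workloads
  namespace: argocd
spec:
  project: default
  source:
    # Hypothetical repository holding Knative Service YAML manifests
    repoURL: https://github.com/example/knative-services.git
    targetRevision: main
    path: services
  destination:
    server: https://kubernetes.default.svc
    namespace: default
  syncPolicy:
    automated:
      prune: true   # remove resources deleted from Git
      selfHeal: true  # revert manual drift back to the Git state
```

With this in place, merging a change to a Service manifest in Git is enough to roll out a new Revision, and Knative handles the traffic shift.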

Conclusion

Knative provides a robust platform for deploying and managing serverless workloads on Kubernetes. By using Knative, developers can focus on writing code without worrying about the underlying infrastructure. The platform's components, such as Eventing and Serving, enable efficient management of serverless applications. With the ability to integrate with continuous deployment tools, Knative simplifies the process of deploying and managing modern, cloud-native applications.
