In the previous blog, we discussed Large Language Models and Word Vectors and explained how they work and how they are connected.
In this blog, we will dive deep into Artificial Intelligence (AI) and how it impacts DevOps, illustrated with real-world use cases. We will also provide a concise overview of ChatOps.
Generative Artificial Intelligence (GAI)
Generative AI (GAI) is a category of artificial intelligence focused on producing new content, such as images, text, and audio. The central concept of Generative AI is to learn patterns from previously trained data and autonomously generate fresh, never-seen-before content.
ChatGPT: A prime example of GenAI
ChatGPT is an example (or a subset) of Generative AI that generates text for conversation with humans. Fundamentally, “GPT” in ChatGPT stands for Generative Pre-Trained Transformer, a Large Language Model (LLM) capable of generating coherent and pertinent text based on a given context. “Pre-Trained” means the model has been trained on vast data sets before being fine-tuned to grasp the nuances of human language.
The word vectors provide the foundational understanding of language, while the LLM architecture, including the Transformer model, brings the ability to process and generate language in context. It's this combination that enables Generative AI to produce content that is not only new but also highly relevant and engaging.
What truly distinguishes Generative AI tools like ChatGPT is the synergy of their key components:
Word Vectors: These form the basis for understanding language, where words are represented as numerical vectors to capture their meanings and relationships in different contexts.
Large Language Model Architecture: This framework, specifically the Transformer model in ChatGPT, processes input and generates output, handling sequential language data efficiently.
Transformer Model: A vital component of the LLM architecture, it uses mechanisms like attention to understand and generate responses that are not only coherent but also relevant to the context.
Pre-training and Fine-Tuning: These training processes equip ChatGPT with a broad understanding of language (through pre-training) and the ability to specialize in certain types of queries or tasks (through fine-tuning).
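The word-vector idea from the list above can be sketched in a few lines of Python. This is a toy illustration: the 3-dimensional vectors and their values are made up, while real embeddings have hundreds or thousands of dimensions learned from data.

```python
import math

# Toy 3-dimensional word vectors; real embeddings have hundreds of dimensions.
vectors = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """Measure how closely two word vectors point in the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Related words end up closer together than unrelated ones.
print(cosine_similarity(vectors["king"], vectors["queen"]) >
      cosine_similarity(vectors["king"], vectors["apple"]))  # True
```

This closeness-in-vector-space property is what lets an LLM treat “king” and “queen” as related concepts rather than unrelated strings.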
The question remains: how does AI impact DevOps engineers?
Let us cover this topic further to understand how AI helps speed up your DevOps tasks (like creating an installation script or debugging a pipeline YAML file).
Role of AI in DevOps
AI has revolutionized numerous fields, with DevOps being a prominent example. Its role ranges from generating and optimizing infrastructure code to automating CI/CD pipeline creation (writing YAML or Tekton pipelines), debugging complex code, and streamlining documentation processes like writing comprehensive READMEs for new repositories. AI-driven analytics and automated code reviews enhance efficiency and reduce errors in deployment processes.
Let’s discuss some use cases showing how AI impacts the DevOps space.
Implementations of AI in DevOps
- Writing Infrastructure as Code (IaC): AI tools like ChatGPT, Pulumi AI, and Amazon Q (a Generative AI assistant designed to gain insights, solve problems, and generate content catering to business-specific needs) can help generate Infrastructure as Code (IaC) for DevOps teams to provision infrastructure in cloud environments like AWS, Azure or GCP.
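As a sketch of the kind of IaC such a tool might produce, a minimal Terraform configuration for an AWS S3 bucket could look like the following. The bucket name, region, and tags are illustrative placeholders, not values any tool is guaranteed to emit.

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "us-east-1" # illustrative region
}

resource "aws_s3_bucket" "example" {
  bucket = "my-example-bucket-12345" # bucket names must be globally unique

  tags = {
    Environment = "dev"
  }
}
```

A generated snippet like this still needs review against your organization's naming conventions and tagging policies before `terraform apply`.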
- Creating Kubernetes Manifest: ChatGPT can automate the generation of YAML files for Kubernetes deployments, services, pod replicas, etc., based on specific requirements.
For example, an engineer can describe the desired deployment state, including the number of replicas, container images, and resource limits, and ChatGPT can generate the YAML manifest file.
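A prompt along those lines, say “a Deployment with 3 replicas of nginx with CPU and memory limits”, might yield a manifest like the sketch below; the names, image tag, and resource limits are illustrative assumptions.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app            # illustrative name
spec:
  replicas: 3              # desired number of pod replicas
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      containers:
        - name: web
          image: nginx:1.25   # container image from the prompt
          resources:
            limits:
              cpu: "500m"
              memory: "256Mi"
```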
- Automating CI/CD workflows: GitHub Actions are crucial for CI/CD pipelines; therefore, ChatGPT can assist by generating workflow YAML based on the specificity of your project.
For instance, engineers can prompt ChatGPT to generate a GitHub Actions CI/CD workflow that automates the process of creating an S3 bucket through Terraform.
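A generated workflow for that prompt might resemble this sketch, which assumes the Terraform code lives in the repository root and AWS credentials are stored as repository secrets:

```yaml
name: provision-s3
on:
  push:
    branches: [main]

jobs:
  terraform:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - name: Terraform Init
        run: terraform init
      - name: Terraform Apply
        run: terraform apply -auto-approve
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```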
- Infrastructure Security: Open Policy Agent (OPA) is used for policy-based control across the Kubernetes cluster or your cloud-based infrastructure. AI, like ChatGPT, can generate an OPA policy file based on your specific security requirements.
For example, a prompt to ChatGPT can be given, such as checking if public access is enabled for an S3 bucket. ChatGPT can accordingly write the OPA policy in Rego (the language syntax OPA uses) for the above use case.
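A minimal sketch of such a Rego policy is shown below; the shape of `input` (a map of bucket names to their settings) is an assumption for illustration, as real policies depend on how your data is fed to OPA.

```rego
package s3.security

import future.keywords

# Flag any bucket whose ACL grants public read access.
# The input shape here is an assumed example, not a fixed OPA schema.
deny contains msg if {
    some name, bucket in input.buckets
    bucket.acl == "public-read"
    msg := sprintf("S3 bucket %q allows public read access", [name])
}
```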
Can you use AI in DevOps today?
Despite the promise of Generative AI tools like ChatGPT and Google Bard, generating an image or text is very different from giving these models access to internal production platforms, especially for models trained on generic data rather than your specific architecture.
Let’s dive deeper into some of those concerns:
Lack of Organization Context: No matter how much you train the model, it won't grasp the specific organizational context, such as naming conventions, mandatory tags, security policies, developers' IaC writing styles, or standard templates and modules used by the organization.
Non-Deterministic Output: Small prompt changes can lead to different outcomes, as can updates to the underlying LLM. When dealing with infrastructure, you want predictable outcomes, given the nature of the actions.
Integration with tools: AI tools like ChatGPT cannot integrate with essential DevOps tools like GitHub, Jenkins, Kubernetes, or Docker Hub.
Security concerns: Given that tools like ChatGPT are trained on extensive amounts of data, they risk exposing sensitive data. For example, a developer might paste logs into a ChatGPT prompt that contain Personally Identifiable Information (PII). This raises privacy concerns, as the sensitive data could be used in further training, posing significant security risks for the organization.
No Permissions System or Audit Trail: ChatGPT does not provide a way to enforce security through RBAC roles and delegated user access, nor does it provide audit trails. For example, ChatGPT cannot identify suspicious activities in an actual AWS account or trace event data to the root cause of issues.
Alternative AI in DevOps tools
One tool that I found to solve many of these challenges is Kubiya.ai (though there might be others out there).
Kubiya.ai is an AI-powered ChatOps tool that leverages Large Language Models (LLMs) to help engineering teams handle their DevOps requirements. Think of it as a fine-tuned OpenAI GPT with a “GPTstor” (a storage or repository of GPT-generated content) for development, infrastructure, and technical operations.
By automating DevOps processes end to end, Kubiya addresses many of ChatGPT's shortcomings. It streamlines tasks such as creating IaC, adhering to organizational standards, generating GitHub Actions workflows for CI/CD, deploying Kubernetes workloads, and managing those deployments (rolling back, scaling, and so on).
Let us see this in action with a concrete example.
We gave a prompt to ChatGPT for providing the Terraform IaC to provision an S3 bucket in AWS.
It's great that it gave us appropriate Terraform code, but what happens if we prompt it to run that same code?
As evident, it provides the steps to run the Terraform commands and achieve the output but never actually provisions the S3 bucket.
Kubiya can automate this exact same process with just a conversation.
Now, let’s see how Kubiya is helpful where tools like ChatGPT fall short:
Primary function
Kubiya is a conversational AI DevOps assistant that automates various DevOps tasks, such as deploying Kubernetes manifests to EKS clusters or assigning the right agents to specific users based on their roles.
ChatGPT, on the other hand, targets a general audience and focuses on generating code snippets and answering queries.
Integration with external tools:
- Kubiya can integrate with a suite of developer tools like Jenkins, GitHub, Kubernetes, JIRA, and cloud providers like AWS. For example, after integrating Kubiya with JIRA, users can create tickets, update ticket statuses, and manage ticket assignments through Kubiya.
- ChatGPT cannot integrate with any such DevOps tools.
Interaction:
- The simplest way to start with Kubiya is its direct Slack integration, which offers a streamlined user experience and lets engineering teams interact with the virtual assistant without switching platforms.
- ChatGPT is a standalone Generative AI tool that does not integrate with communication platforms and can only be used through its own UI.
Infrasity curated this content to share knowledge and contribute to the DevOps community.