Abhimanyu Sharma


AI-Powered Kubernetes Troubleshooting: Ollama + Kubewall

Managing Kubernetes can be powerful—but also overwhelming.
kubewall makes it simple with a single-binary, browser-based dashboard for real-time cluster monitoring and management.


Why use Ollama + kubewall?

Quick answer: security and ease of configuration.

Ollama and kubewall run 100% locally on your system, which keeps data from leaving your machine and safeguards your cluster's Secrets and ConfigMaps.
You are not tied to any 3rd-party servers: it is your server, your infra, your system, your AI model.

kubewall-ai-kubernetes-dashboard




Install

  1. Download and install Ollama, which lets you run AI models locally. Installers for macOS, Linux, and Windows are available at https://ollama.com/download
  2. Pull the qwen3 model so Ollama can serve it.

    • We use qwen3 here because it supports thinking, coding, and tool use
    • Open your terminal and run:
          ollama run qwen3
    
  3. Install kubewall for your platform. See the kubewall README for more details.

    • macOS: brew install --cask kubewall/tap/kubewall
    • Linux: sudo snap install kubewall
    • Windows: winget install --id=kubewall.kubewall -e
    • If you prefer to download a binary directly, visit the releases page
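Before wiring Ollama into kubewall, it is worth confirming that the local Ollama server is up and that the model was pulled. A quick sanity check (assuming the default Ollama port, 11434):

```shell
# List the models Ollama has pulled locally; qwen3 should appear
ollama list

# Ollama exposes an OpenAI-compatible API on port 11434 by default;
# this should return a JSON listing that includes qwen3
curl -s http://127.0.0.1:11434/v1/models
```

If the curl call fails to connect, start the server with `ollama serve` and try again.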

Investigate your cluster using Ollama + kubewall

  • Launch kubewall: open your terminal and start it:
   kubewall
  • Choose Your Cluster: from the dashboard, select the Kubernetes cluster you want to connect to.

kubewall-choose-clusters

  • Open AI Settings: click the AI Settings panel in the upper-right corner.

ai-kubewall-settings

ai-providers-list-kubewall

  • Select AI Provider
    From the dropdown, choose Ollama as your provider.

  • Set URL and API Key
    Enter ollama as the API key and http://127.0.0.1:11434/v1 as the URL (for most local setups, these defaults work).

  • Pick the Model: choose qwen3 from the model list.

aichat-kubewall

  • Start Chatting: ask your local AI model questions about your cluster, troubleshoot issues, or request optimization tips.

ai-chat-kubewall
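Under the hood, kubewall talks to Ollama through its OpenAI-compatible chat endpoint, so you can reproduce a query from the terminal as well. A minimal sketch (the prompt text is just an illustration):

```shell
# Send a chat request to the local Ollama server's
# OpenAI-compatible endpoint, using the qwen3 model pulled earlier
curl -s http://127.0.0.1:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen3",
    "messages": [
      {"role": "user", "content": "Why might a pod be stuck in CrashLoopBackOff?"}
    ]
  }'
```

The response is standard OpenAI-style JSON, which is exactly why kubewall can treat a local Ollama instance like any other AI provider.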
