Hack the AI Stack: Securing Real AI Workloads on Kubernetes 🔐🤖
Tuesday, 24 February 2026 at 4pm GMT / 11am EST | 1 HOUR | ONLINE
AI is officially part of the production stack — and that changes everything.
Modern teams are shipping LLM-powered workloads fast, but the supply chains behind those systems are messy, opaque, and easy to exploit. Models, containers, dependencies… they all introduce new attack paths that most pipelines weren’t designed to handle.
That’s why Cloudsmith and Chainguard are teaming up for a hands-on, virtual hackathon where you’ll build, break, and secure a real AI workload running on Kubernetes.
What you’ll actually do
This isn’t a slide deck. You’ll get your hands dirty:
- Deploy and attack real AI workloads in Kubernetes
- Work with LLM tooling like Ollama and Hugging Face
- See how AI supply chains break in practice, not theory
- Secure models, containers, and dependencies before they reach production
You’ll use Cloudsmith to ingest, verify, quarantine, and promote AI artifacts across environments, while Chainguard’s hardened images and libraries eliminate entire classes of risk before workloads ever hit runtime.
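To make the "verify" step concrete, here is a minimal sketch of the idea behind artifact verification before promotion: compare a downloaded artifact's SHA-256 digest against the published one, and keep it quarantined on any mismatch. The file name and helper here are hypothetical, purely for illustration; in practice the expected digest would come from your registry's signed metadata.

```python
import hashlib
from pathlib import Path

def verify_artifact(path: Path, expected_sha256: str) -> bool:
    """Compare a local artifact's SHA-256 digest to the published one.

    Returns True only on an exact match; a mismatch means the artifact
    should stay quarantined rather than be promoted.
    """
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest == expected_sha256

# Hypothetical usage: "model.bin" stands in for a real model artifact.
artifact = Path("model.bin")
artifact.write_bytes(b"demo model weights")
expected = hashlib.sha256(b"demo model weights").hexdigest()
print(verify_artifact(artifact, expected))  # True: safe to promote
```

This is the simplest form of the check; real pipelines layer signatures, provenance attestations, and policy gates on top of the digest comparison.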
Who should join?
If you’re:
- Building or operating AI/LLM workloads
- Running Kubernetes in production
- Thinking about supply chain security, provenance, and trust
- Curious how attackers actually target AI systems
…this one’s for you.
Why this matters
AI systems are non-deterministic, fast-moving, and increasingly automated. Traditional security controls don’t cut it anymore. This hackathon shows how modern teams embed trust, verification, and security from source to production — without slowing developers down.
Bring your laptop. Expect to break things. Leave knowing how to ship AI workloads with confidence.
👉 Spots are limited — sign up and hack the AI stack with us.