<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Jasdeep Singh Bhalla</title>
    <description>The latest articles on DEV Community by Jasdeep Singh Bhalla (@jasdeepsinghbhalla).</description>
    <link>https://dev.to/jasdeepsinghbhalla</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3288862%2Fc7e84514-4bf1-4465-9bc1-b5f05d3d6906.png</url>
      <title>DEV Community: Jasdeep Singh Bhalla</title>
      <link>https://dev.to/jasdeepsinghbhalla</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/jasdeepsinghbhalla"/>
    <language>en</language>
    <item>
      <title>What is a Vector Database?</title>
      <dc:creator>Jasdeep Singh Bhalla</dc:creator>
      <pubDate>Tue, 17 Mar 2026 16:00:00 +0000</pubDate>
      <link>https://dev.to/jasdeepsinghbhalla/what-is-a-vector-database-10e1</link>
      <guid>https://dev.to/jasdeepsinghbhalla/what-is-a-vector-database-10e1</guid>
      <description>&lt;p&gt;&lt;strong&gt;Most databases search for exact matches.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;But AI doesn’t work like that.&lt;/p&gt;

&lt;p&gt;💡 &lt;strong&gt;That’s where vector databases come in.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;👉 In AI, text (like a sentence or document) is converted into numbers called &lt;strong&gt;embeddings&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;These embeddings capture the meaning of the text. So instead of searching for exact words…&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;Vector databases search for similar meaning.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;🧠 &lt;strong&gt;Simple idea:&lt;/strong&gt;&lt;br&gt;
Normal database → finds exact matches&lt;br&gt;
Vector database → finds similar meaning&lt;/p&gt;

&lt;p&gt;📌 &lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
Search: “&lt;em&gt;How to fix login issue&lt;/em&gt;”&lt;/p&gt;

&lt;p&gt;A vector DB can return:&lt;br&gt;
• “&lt;em&gt;authentication error solution&lt;/em&gt;”&lt;br&gt;
• “&lt;em&gt;troubleshooting sign-in problems&lt;/em&gt;”&lt;/p&gt;

&lt;p&gt;Even if words don’t match exactly ✅&lt;/p&gt;
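&lt;p&gt;A minimal sketch of the idea, assuming nothing about any particular product: the vectors below are invented stand-ins for real model embeddings, and cosine similarity does the “similar meaning” ranking that a vector database performs at scale with approximate indexes.&lt;/p&gt;

```python
# Toy semantic search: hand-made 3-d "embeddings" stand in for real
# model output (real embeddings have hundreds of dimensions; these
# vectors are invented purely for illustration).
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Pretend these came from an embedding model.
docs = {
    "authentication error solution":    [0.9, 0.8, 0.1],
    "troubleshooting sign-in problems": [0.8, 0.9, 0.2],
    "best pizza recipes":               [0.1, 0.0, 0.9],
}
query = [0.85, 0.85, 0.15]  # embedding for "How to fix login issue"

# Rank documents by similarity to the query, best first.
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
print(ranked)  # the login-related docs rank ahead of the unrelated one
```

The query shares no words with either login document, yet both outrank the unrelated one, which is exactly the behavior described above.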

&lt;p&gt;🚀 &lt;strong&gt;Why it matters:&lt;/strong&gt;&lt;br&gt;
• Powers RAG (Retrieval-Augmented Generation)&lt;br&gt;
• Used in AI search &amp;amp; chatbots&lt;br&gt;
• Helps AI understand context&lt;br&gt;
• Core building block for AI agents&lt;/p&gt;

&lt;p&gt;🔧 &lt;strong&gt;Popular vector databases:&lt;/strong&gt;&lt;br&gt;
• Pinecone&lt;br&gt;
• Weaviate&lt;br&gt;
• FAISS (Meta’s similarity-search library, often used as a vector store)&lt;br&gt;
• Milvus&lt;br&gt;
• Qdrant&lt;br&gt;
• Chroma&lt;/p&gt;

&lt;p&gt;If you're building AI systems today, vector databases are not optional—they’re foundational.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>rag</category>
      <category>vectordatabase</category>
      <category>programming</category>
    </item>
    <item>
      <title>What is RAG?</title>
      <dc:creator>Jasdeep Singh Bhalla</dc:creator>
      <pubDate>Tue, 17 Mar 2026 08:59:20 +0000</pubDate>
      <link>https://dev.to/jasdeepsinghbhalla/what-is-rag-34d0</link>
      <guid>https://dev.to/jasdeepsinghbhalla/what-is-rag-34d0</guid>
      <description>&lt;p&gt;&lt;strong&gt;Most AI models don’t actually “know” your data.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;They generate answers based on what they were trained on, which means those answers can be:&lt;/p&gt;

&lt;p&gt;• outdated&lt;br&gt;
• incorrect&lt;br&gt;
• or missing context&lt;/p&gt;

&lt;p&gt;💡 &lt;strong&gt;That’s where RAG (Retrieval-Augmented Generation) comes in.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;👉 Instead of answering directly, RAG works in 3 steps:&lt;br&gt;
Retrieve → find relevant information (docs, PDFs, databases)&lt;br&gt;
Augment → add the most useful pieces to the prompt&lt;br&gt;
Generate → answer using that context&lt;/p&gt;
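&lt;p&gt;That loop can be sketched in a few lines. This is a toy, not a real pipeline: the retriever is naive word overlap standing in for embedding search, the documents are made up, and the final LLM call is left out.&lt;/p&gt;

```python
# Minimal retrieve -> augment sketch; a real system would use embeddings,
# a vector database, and pass `prompt` to an LLM for the generate step.
docs = [
    "Leave policy: employees get 24 days of paid leave per year.",
    "Expense policy: submit receipts within 30 days.",
]

def retrieve(question, docs):
    # Pick the document sharing the most words with the question.
    words = set(question.lower().split())
    return max(docs, key=lambda d: len(words & set(d.lower().split())))

def augment(question, context):
    # Ground the model in retrieved text instead of letting it guess.
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

question = "What is our leave policy?"
prompt = augment(question, retrieve(question, docs))
# `prompt` now contains the actual leave-policy document.
```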

&lt;p&gt;🧠 Simple idea:&lt;br&gt;
Normal AI → guesses&lt;br&gt;
RAG → looks up info first, then answers&lt;/p&gt;

&lt;p&gt;📌 Example:&lt;br&gt;
Ask: “&lt;em&gt;What’s our company’s leave policy?&lt;/em&gt;”&lt;/p&gt;

&lt;p&gt;Without RAG → generic answer ❌&lt;br&gt;
With RAG → pulls actual company document ✅&lt;/p&gt;

&lt;p&gt;🚀 Why it matters:&lt;br&gt;
• More accurate answers&lt;br&gt;
• Uses real-time/private data&lt;br&gt;
• Reduces hallucinations&lt;br&gt;
• Powers AI agents &amp;amp; copilots&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;If you’re building AI today, chances are you’re using RAG — even if you don’t realize it.&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>llm</category>
      <category>rag</category>
      <category>programming</category>
    </item>
    <item>
      <title>🐳 10 Important Docker Terms Every Developer Should Know</title>
      <dc:creator>Jasdeep Singh Bhalla</dc:creator>
      <pubDate>Sun, 15 Mar 2026 09:18:32 +0000</pubDate>
      <link>https://dev.to/jasdeepsinghbhalla/10-important-docker-terms-every-developer-should-know-3k1i</link>
      <guid>https://dev.to/jasdeepsinghbhalla/10-important-docker-terms-every-developer-should-know-3k1i</guid>
      <description>&lt;p&gt;Docker has become one of the most important tools in modern software&lt;br&gt;
development. It allows developers to package applications with their&lt;br&gt;
dependencies and run them consistently across environments.&lt;/p&gt;

&lt;p&gt;If you're getting started with Docker, understanding a few key concepts&lt;br&gt;
can make the learning curve much easier.&lt;/p&gt;

&lt;p&gt;Here are &lt;strong&gt;10 essential Docker terms every developer should know.&lt;/strong&gt;&lt;/p&gt;


&lt;h2&gt;
  
  
  📦 1. Container
&lt;/h2&gt;

&lt;p&gt;A &lt;strong&gt;container&lt;/strong&gt; is a lightweight, portable environment that runs an&lt;br&gt;
application along with everything it needs.&lt;/p&gt;

&lt;p&gt;This includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;application code&lt;/li&gt;
&lt;li&gt;runtime&lt;/li&gt;
&lt;li&gt;system libraries&lt;/li&gt;
&lt;li&gt;dependencies&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Containers isolate applications from the host system so they run&lt;br&gt;
consistently across environments.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run nginx
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Learn more:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/get-started/docker-concepts/the-basics/what-is-a-container/" rel="noopener noreferrer"&gt;https://docs.docker.com/get-started/docker-concepts/the-basics/what-is-a-container/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  🧱 2. Docker Image
&lt;/h2&gt;

&lt;p&gt;A &lt;strong&gt;Docker image&lt;/strong&gt; is a blueprint used to create containers.&lt;/p&gt;

&lt;p&gt;It contains:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;application code&lt;/li&gt;
&lt;li&gt;dependencies&lt;/li&gt;
&lt;li&gt;runtime&lt;/li&gt;
&lt;li&gt;environment configuration&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Images are &lt;strong&gt;read-only templates&lt;/strong&gt; used to start containers.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker pull nginx
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Learn more:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/get-started/docker-concepts/the-basics/what-is-an-image/" rel="noopener noreferrer"&gt;https://docs.docker.com/get-started/docker-concepts/the-basics/what-is-an-image/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  📝 3. Dockerfile
&lt;/h2&gt;

&lt;p&gt;A &lt;strong&gt;Dockerfile&lt;/strong&gt; is a text file that defines how to build a Docker&lt;br&gt;
image.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="k"&gt;FROM&lt;/span&gt;&lt;span class="s"&gt; node:20&lt;/span&gt;
&lt;span class="k"&gt;WORKDIR&lt;/span&gt;&lt;span class="s"&gt; /app&lt;/span&gt;
&lt;span class="k"&gt;COPY&lt;/span&gt;&lt;span class="s"&gt; . .&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt;
&lt;span class="k"&gt;CMD&lt;/span&gt;&lt;span class="s"&gt; ["npm","start"]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Learn more:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/build/concepts/dockerfile/" rel="noopener noreferrer"&gt;https://docs.docker.com/build/concepts/dockerfile/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  🌐 4. Docker Hub
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Docker Hub&lt;/strong&gt; is the most popular public registry for Docker images.&lt;/p&gt;

&lt;p&gt;Developers use it to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;store container images&lt;/li&gt;
&lt;li&gt;share images&lt;/li&gt;
&lt;li&gt;download official images&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Examples of official images: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;nginx &lt;/li&gt;
&lt;li&gt;redis &lt;/li&gt;
&lt;li&gt;postgres &lt;/li&gt;
&lt;li&gt;node&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://hub.docker.com" rel="noopener noreferrer"&gt;https://hub.docker.com&lt;/a&gt;&lt;/p&gt;


&lt;h2&gt;
  
  
  🗄️ 5. Container Registry
&lt;/h2&gt;

&lt;p&gt;A &lt;strong&gt;container registry&lt;/strong&gt; stores Docker images so they can be pulled and&lt;br&gt;
deployed.&lt;/p&gt;

&lt;p&gt;Examples:&lt;/p&gt;

&lt;p&gt;Docker Hub&lt;br&gt;
&lt;a href="https://hub.docker.com" rel="noopener noreferrer"&gt;https://hub.docker.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Amazon ECR&lt;br&gt;
&lt;a href="https://aws.amazon.com/ecr/" rel="noopener noreferrer"&gt;https://aws.amazon.com/ecr/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Google Artifact Registry&lt;br&gt;
&lt;a href="https://cloud.google.com/artifact-registry" rel="noopener noreferrer"&gt;https://cloud.google.com/artifact-registry&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Azure Container Registry&lt;br&gt;
&lt;a href="https://azure.microsoft.com/products/container-registry" rel="noopener noreferrer"&gt;https://azure.microsoft.com/products/container-registry&lt;/a&gt;&lt;/p&gt;


&lt;h2&gt;
  
  
  💾 6. Docker Volumes
&lt;/h2&gt;

&lt;p&gt;Containers are &lt;strong&gt;ephemeral&lt;/strong&gt;: data written inside a container is lost when the&lt;br&gt;
container is removed.&lt;/p&gt;

&lt;p&gt;Docker &lt;strong&gt;volumes&lt;/strong&gt; provide persistent storage.&lt;/p&gt;

&lt;p&gt;Common use cases: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;databases &lt;/li&gt;
&lt;li&gt;logs &lt;/li&gt;
&lt;li&gt;uploads&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run &lt;span class="nt"&gt;-v&lt;/span&gt; mydata:/data postgres
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Learn more:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/storage/volumes/" rel="noopener noreferrer"&gt;https://docs.docker.com/storage/volumes/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  🌐 7. Docker Networking
&lt;/h2&gt;

&lt;p&gt;Docker networking allows containers to communicate with each other.&lt;/p&gt;

&lt;p&gt;Common network types: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;bridge &lt;/li&gt;
&lt;li&gt;host &lt;/li&gt;
&lt;li&gt;overlay&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker network create mynetwork
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Learn more:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/network/" rel="noopener noreferrer"&gt;https://docs.docker.com/network/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  ⚙️ 8. Docker Compose
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Docker Compose&lt;/strong&gt; lets you run multi-container applications using a&lt;br&gt;
single YAML configuration.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;3"&lt;/span&gt;

&lt;span class="na"&gt;services&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;web&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;nginx&lt;/span&gt;
  &lt;span class="na"&gt;database&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;postgres&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Learn more:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/compose/" rel="noopener noreferrer"&gt;https://docs.docker.com/compose/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  ⚡ 9. Docker Layer Caching
&lt;/h2&gt;

&lt;p&gt;Docker builds images using &lt;strong&gt;layers&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Each instruction in a Dockerfile creates a layer. Docker caches layers&lt;br&gt;
to speed up future builds.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="k"&gt;FROM&lt;/span&gt;&lt;span class="s"&gt; node:20&lt;/span&gt;
&lt;span class="k"&gt;COPY&lt;/span&gt;&lt;span class="s"&gt; package.json .&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt;
&lt;span class="k"&gt;COPY&lt;/span&gt;&lt;span class="s"&gt; . .&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Learn more:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/build/cache/" rel="noopener noreferrer"&gt;https://docs.docker.com/build/cache/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  🧠 10. Container Runtime
&lt;/h2&gt;

&lt;p&gt;A &lt;strong&gt;container runtime&lt;/strong&gt; is the low-level component that actually runs containers.&lt;/p&gt;

&lt;p&gt;Examples include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;containerd&lt;/li&gt;
&lt;li&gt;runc&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;They manage:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;container lifecycle&lt;/li&gt;
&lt;li&gt;process execution&lt;/li&gt;
&lt;li&gt;resource isolation&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Learn more: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/engine/" rel="noopener noreferrer"&gt;https://docs.docker.com/engine/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  🚀 Final Thoughts
&lt;/h2&gt;

&lt;p&gt;Docker has transformed how software is built and deployed by making&lt;br&gt;
applications portable and reproducible.&lt;/p&gt;

&lt;p&gt;Understanding these concepts gives developers a strong foundation for&lt;br&gt;
working with containerized applications.&lt;/p&gt;

&lt;p&gt;Official Docker resources:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.docker.com" rel="noopener noreferrer"&gt;https://www.docker.com&lt;/a&gt;&lt;br&gt;
&lt;a href="https://docs.docker.com" rel="noopener noreferrer"&gt;https://docs.docker.com&lt;/a&gt;&lt;/p&gt;

</description>
      <category>docker</category>
      <category>devops</category>
      <category>aiops</category>
      <category>softwareengineering</category>
    </item>
    <item>
      <title>🤖 Docker in AI Production: Why It Matters (and What Breaks Without It)</title>
      <dc:creator>Jasdeep Singh Bhalla</dc:creator>
      <pubDate>Fri, 06 Feb 2026 21:30:56 +0000</pubDate>
      <link>https://dev.to/jasdeepsinghbhalla/docker-in-ai-production-why-it-matters-and-what-breaks-without-it-oj6</link>
      <guid>https://dev.to/jasdeepsinghbhalla/docker-in-ai-production-why-it-matters-and-what-breaks-without-it-oj6</guid>
      <description>&lt;p&gt;Docker has become one of the most important tools in modern AI engineering.&lt;/p&gt;

&lt;p&gt;From model serving to agent execution, almost every AI platform today relies on containers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;LLM inference APIs
&lt;/li&gt;
&lt;li&gt;GPU-based training workloads
&lt;/li&gt;
&lt;li&gt;Retrieval-Augmented Generation (RAG) pipelines
&lt;/li&gt;
&lt;li&gt;Autonomous agents running tools
&lt;/li&gt;
&lt;li&gt;MCP server deployments
&lt;/li&gt;
&lt;li&gt;AI DevOps workflows
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But here’s the key point:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Docker is not the problem — Docker is what makes AI production possible.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This article explains why Docker is so valuable, and what kinds of AI failures teams face &lt;strong&gt;without containerization&lt;/strong&gt;, especially as MCP-powered agents become mainstream.&lt;/p&gt;




&lt;h2&gt;
  
  
  1. AI Systems Without Docker Are Hard to Reproduce
&lt;/h2&gt;

&lt;p&gt;Without Docker, teams run into:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;dependency mismatches
&lt;/li&gt;
&lt;li&gt;inconsistent Python environments
&lt;/li&gt;
&lt;li&gt;CUDA version conflicts
&lt;/li&gt;
&lt;li&gt;“works on my machine” model behavior
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;developer runs PyTorch 2.2
&lt;/li&gt;
&lt;li&gt;production server runs PyTorch 2.0
&lt;/li&gt;
&lt;li&gt;inference output changes subtly
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Docker solves this by packaging the runtime environment.&lt;/p&gt;




&lt;h2&gt;
  
  
  2. Model Serving Without Containers Becomes Deployment Chaos
&lt;/h2&gt;

&lt;p&gt;Deploying an LLM without Docker often means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;installing libraries manually on servers
&lt;/li&gt;
&lt;li&gt;configuring drivers by hand
&lt;/li&gt;
&lt;li&gt;repeating setup across environments
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With Docker, serving becomes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run &lt;span class="nt"&gt;--gpus&lt;/span&gt; all my-llm-server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Portable, repeatable, automated.&lt;/p&gt;




&lt;h2&gt;
  
  
  3. MCP Tool Servers Need Isolation
&lt;/h2&gt;

&lt;p&gt;The Model Context Protocol (MCP) enables AI agents to call tools:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;filesystem tools
&lt;/li&gt;
&lt;li&gt;cloud APIs
&lt;/li&gt;
&lt;li&gt;databases
&lt;/li&gt;
&lt;li&gt;CI/CD automation
&lt;/li&gt;
&lt;li&gt;internal governance systems
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But MCP introduces a new requirement:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Tool execution must be sandboxed.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Running MCP servers without Docker means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;tools run directly on host machines
&lt;/li&gt;
&lt;li&gt;agents may access sensitive files
&lt;/li&gt;
&lt;li&gt;prompt injection can trigger real commands
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Docker provides safe boundaries:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;isolated filesystem
&lt;/li&gt;
&lt;li&gt;controlled networking
&lt;/li&gt;
&lt;li&gt;least-privilege execution
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  4. AI Agents Without Docker Become a Security Risk
&lt;/h2&gt;

&lt;p&gt;Modern AI agents are not passive chatbots.&lt;/p&gt;

&lt;p&gt;They can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;run shell commands
&lt;/li&gt;
&lt;li&gt;modify repositories
&lt;/li&gt;
&lt;li&gt;deploy infrastructure
&lt;/li&gt;
&lt;li&gt;call external APIs
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without Docker sandboxing, this creates risks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;credential leaks
&lt;/li&gt;
&lt;li&gt;unintended host access
&lt;/li&gt;
&lt;li&gt;tool poisoning attacks
&lt;/li&gt;
&lt;li&gt;full host compromise from a single malicious tool call
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Docker Sandboxes and hardened images are now critical for safe agent execution.&lt;/p&gt;
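&lt;p&gt;One minimal sketch of that boundary, with a made-up image name and tool arguments: run each agent tool call as a locked-down container. The flags used below (&lt;code&gt;--rm&lt;/code&gt;, &lt;code&gt;--network none&lt;/code&gt;, &lt;code&gt;--read-only&lt;/code&gt;, &lt;code&gt;--cap-drop&lt;/code&gt;, &lt;code&gt;--memory&lt;/code&gt;) are standard &lt;code&gt;docker run&lt;/code&gt; options.&lt;/p&gt;

```python
# Build a locked-down `docker run` command for one agent tool call.
# The image name and tool arguments are placeholders for illustration.
def sandboxed_cmd(image, tool_args):
    return [
        "docker", "run", "--rm",
        "--network", "none",   # no network access from inside the tool
        "--read-only",         # immutable root filesystem
        "--cap-drop", "ALL",   # drop all Linux capabilities
        "--memory", "256m",    # bound resource usage
        image, *tool_args,
    ]

cmd = sandboxed_cmd("my-agent-tools:latest", ["search", "--query", "login issue"])
# Hand `cmd` to subprocess.run(...) to execute the tool in isolation.
```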




&lt;h2&gt;
  
  
  5. Scaling AI Workloads Without Docker Is Expensive and Slow
&lt;/h2&gt;

&lt;p&gt;Without containers, scaling means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;configuring new servers manually
&lt;/li&gt;
&lt;li&gt;inconsistent runtime setups
&lt;/li&gt;
&lt;li&gt;slow onboarding of new nodes
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With Docker + orchestration (Kubernetes/ECS):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;replicas spin up predictably
&lt;/li&gt;
&lt;li&gt;environments stay consistent
&lt;/li&gt;
&lt;li&gt;scaling becomes automated
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  6. RAG Pipelines Without Docker Become Unmanageable
&lt;/h2&gt;

&lt;p&gt;A real RAG system includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;LLM server
&lt;/li&gt;
&lt;li&gt;embedding model
&lt;/li&gt;
&lt;li&gt;vector database
&lt;/li&gt;
&lt;li&gt;retriever service
&lt;/li&gt;
&lt;li&gt;MCP tool servers
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without Docker Compose, deployment becomes messy.&lt;/p&gt;

&lt;p&gt;With Compose:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;services&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;llm&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;vectordb&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;retriever&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;tools&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;One command brings the stack up:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  7. Observability Without Containers Gets Worse
&lt;/h2&gt;

&lt;p&gt;AI systems require monitoring:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;token throughput
&lt;/li&gt;
&lt;li&gt;hallucination rates
&lt;/li&gt;
&lt;li&gt;retrieval quality
&lt;/li&gt;
&lt;li&gt;agent tool calls
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Docker provides consistent logging + metrics hooks that integrate with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Prometheus
&lt;/li&gt;
&lt;li&gt;OpenTelemetry
&lt;/li&gt;
&lt;li&gt;Grafana
&lt;/li&gt;
&lt;li&gt;cloud observability
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without Docker, monitoring becomes inconsistent across machines.&lt;/p&gt;




&lt;h2&gt;
  
  
  8. Supply Chain Security Improves With Docker
&lt;/h2&gt;

&lt;p&gt;AI workloads depend on massive open-source stacks.&lt;/p&gt;

&lt;p&gt;Docker helps teams:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;pin base images
&lt;/li&gt;
&lt;li&gt;scan for vulnerabilities
&lt;/li&gt;
&lt;li&gt;enforce hardened runtimes
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Tools like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Docker Scout
&lt;/li&gt;
&lt;li&gt;Docker Hardened Images
&lt;/li&gt;
&lt;li&gt;signed registries
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;are becoming mandatory for AI governance.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion: Docker Is the Foundation for Safe AI + MCP Deployment
&lt;/h2&gt;

&lt;p&gt;AI production introduces complexity:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;huge dependencies
&lt;/li&gt;
&lt;li&gt;GPU runtime requirements
&lt;/li&gt;
&lt;li&gt;agent tool execution
&lt;/li&gt;
&lt;li&gt;security threats like prompt injection
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Docker is what makes these systems:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;portable
&lt;/li&gt;
&lt;li&gt;reproducible
&lt;/li&gt;
&lt;li&gt;scalable
&lt;/li&gt;
&lt;li&gt;governable
&lt;/li&gt;
&lt;li&gt;secure
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And as MCP-powered AI agents become standard, Docker-style sandboxing will be non-negotiable.&lt;/p&gt;




&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Docker didn’t create AI production problems.&lt;/em&gt;&lt;br&gt;&lt;br&gt;
&lt;em&gt;Docker is what prevents AI production from collapsing into chaos.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>docker</category>
      <category>containers</category>
      <category>ai</category>
      <category>programming</category>
    </item>
    <item>
      <title>🚢 Docker Explained: Everything You Need to Know (Beginner to Pro)</title>
      <dc:creator>Jasdeep Singh Bhalla</dc:creator>
      <pubDate>Fri, 06 Feb 2026 21:21:52 +0000</pubDate>
      <link>https://dev.to/jasdeepsinghbhalla/docker-explained-everything-you-need-to-know-beginner-pro-2hc8</link>
      <guid>https://dev.to/jasdeepsinghbhalla/docker-explained-everything-you-need-to-know-beginner-pro-2hc8</guid>
      <description>&lt;p&gt;Docker has become one of the most important tools in modern software engineering.&lt;/p&gt;

&lt;p&gt;Whether you're building microservices, deploying AI agents, running cloud workloads, or just trying to avoid the classic:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“It works on my machine…”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Docker solves a huge class of problems by providing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;consistent environments
&lt;/li&gt;
&lt;li&gt;fast deployments
&lt;/li&gt;
&lt;li&gt;portable applications
&lt;/li&gt;
&lt;li&gt;scalable infrastructure
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  1. What is Docker?
&lt;/h2&gt;

&lt;p&gt;Docker is a platform that allows you to package applications into &lt;strong&gt;containers&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;A container includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;application code
&lt;/li&gt;
&lt;li&gt;runtime (Python, Node, Java, etc.)
&lt;/li&gt;
&lt;li&gt;dependencies
&lt;/li&gt;
&lt;li&gt;system libraries
&lt;/li&gt;
&lt;li&gt;configuration
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So the app runs the same everywhere:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;laptop
&lt;/li&gt;
&lt;li&gt;server
&lt;/li&gt;
&lt;li&gt;cloud
&lt;/li&gt;
&lt;li&gt;CI/CD pipeline
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  2. Docker vs Virtual Machines
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Feature&lt;/th&gt;
&lt;th&gt;Virtual Machine&lt;/th&gt;
&lt;th&gt;Docker Container&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Includes OS?&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;No (shares host kernel)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Startup time&lt;/td&gt;
&lt;td&gt;Minutes&lt;/td&gt;
&lt;td&gt;Seconds&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Resource usage&lt;/td&gt;
&lt;td&gt;Heavy&lt;/td&gt;
&lt;td&gt;Lightweight&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Best for&lt;/td&gt;
&lt;td&gt;Full OS virtualization&lt;/td&gt;
&lt;td&gt;App packaging&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  3. Key Docker Concepts
&lt;/h2&gt;

&lt;h3&gt;
  
  
  📦 Image
&lt;/h3&gt;

&lt;p&gt;An image is a blueprint/template.&lt;/p&gt;

&lt;h3&gt;
  
  
  🏗 Container
&lt;/h3&gt;

&lt;p&gt;A running instance of an image.&lt;/p&gt;

&lt;h3&gt;
  
  
  Dockerfile
&lt;/h3&gt;

&lt;p&gt;Defines how to build an image.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="k"&gt;FROM&lt;/span&gt;&lt;span class="s"&gt; python:3.11&lt;/span&gt;
&lt;span class="k"&gt;WORKDIR&lt;/span&gt;&lt;span class="s"&gt; /app&lt;/span&gt;
&lt;span class="k"&gt;COPY&lt;/span&gt;&lt;span class="s"&gt; . .&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;pip &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; requirements.txt
&lt;span class="k"&gt;CMD&lt;/span&gt;&lt;span class="s"&gt; ["python", "main.py"]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  4. Docker Compose
&lt;/h2&gt;

&lt;p&gt;Docker Compose helps run multi-container apps.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;3.9"&lt;/span&gt;

&lt;span class="na"&gt;services&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;web&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;build&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;.&lt;/span&gt;
    &lt;span class="na"&gt;ports&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;5000:5000"&lt;/span&gt;

  &lt;span class="na"&gt;redis&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;redis:latest&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  5. Docker Security Basics
&lt;/h2&gt;

&lt;p&gt;Best practices:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;use slim images
&lt;/li&gt;
&lt;li&gt;avoid running as root
&lt;/li&gt;
&lt;li&gt;scan images
&lt;/li&gt;
&lt;li&gt;manage secrets properly
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="k"&gt;USER&lt;/span&gt;&lt;span class="s"&gt; nobody&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  6. Docker + AI Agents + MCP Servers
&lt;/h2&gt;

&lt;p&gt;Docker is now powering AI agents safely:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Claude Code in sandboxes
&lt;/li&gt;
&lt;li&gt;MCP servers as containers
&lt;/li&gt;
&lt;li&gt;Secure tool execution
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Docker enables:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;portable apps
&lt;/li&gt;
&lt;li&gt;reproducible builds
&lt;/li&gt;
&lt;li&gt;scalable infra
&lt;/li&gt;
&lt;li&gt;safe AI execution
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you're learning cloud + AI + security, Docker is mandatory.&lt;/p&gt;

</description>
      <category>docker</category>
      <category>containers</category>
      <category>ai</category>
      <category>virtualmachine</category>
    </item>
    <item>
      <title>Docker Compose for AI Agents: From Local Prototype to Production in One Workflow</title>
      <dc:creator>Jasdeep Singh Bhalla</dc:creator>
      <pubDate>Fri, 06 Feb 2026 07:33:06 +0000</pubDate>
      <link>https://dev.to/jasdeepsinghbhalla/docker-compose-for-ai-agents-from-local-prototype-to-production-in-one-workflow-3a4m</link>
      <guid>https://dev.to/jasdeepsinghbhalla/docker-compose-for-ai-agents-from-local-prototype-to-production-in-one-workflow-3a4m</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7wkmm6b317azf93dbgr0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7wkmm6b317azf93dbgr0.png" alt=" " width="800" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;AI agents are quickly becoming the next major platform shift in software engineering.&lt;/p&gt;

&lt;p&gt;They are no longer limited to answering questions in a chat window. Today’s agentic applications can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;reason over tasks
&lt;/li&gt;
&lt;li&gt;call external tools
&lt;/li&gt;
&lt;li&gt;query APIs
&lt;/li&gt;
&lt;li&gt;orchestrate workflows
&lt;/li&gt;
&lt;li&gt;interact with cloud infrastructure
&lt;/li&gt;
&lt;li&gt;operate autonomously inside DevOps pipelines
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But with this new power comes a familiar engineering challenge:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;How do we build, run, and ship AI agents reliably—from laptop to production?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The answer is surprisingly simple:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;Docker Compose.&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Compose is evolving from a local developer convenience into the backbone of agentic application deployment.&lt;/p&gt;




&lt;h2&gt;
  
  
  AI Agents Are More Than Just Models
&lt;/h2&gt;

&lt;p&gt;An AI agent is not just an LLM.&lt;/p&gt;

&lt;p&gt;A production-grade agentic system typically includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the agent runtime (LangGraph, CrewAI, Semantic Kernel, etc.)&lt;/li&gt;
&lt;li&gt;one or more LLM backends (OpenAI, local Llama, Amazon Bedrock)&lt;/li&gt;
&lt;li&gt;tool integrations (MCP servers, APIs, databases)&lt;/li&gt;
&lt;li&gt;memory/state stores (Redis, Postgres, vector DBs)&lt;/li&gt;
&lt;li&gt;observability (logs, tracing, metrics)&lt;/li&gt;
&lt;li&gt;security boundaries (network + identity controls)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In other words:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Agents are distributed systems.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;And distributed systems need orchestration.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Docker Compose Fits Agents Perfectly
&lt;/h2&gt;

&lt;p&gt;Docker Compose has always been good at one thing:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Defining multi-service applications with a single declarative file.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Agentic apps are inherently multi-service, which makes Compose a natural match.&lt;/p&gt;

&lt;p&gt;Helpful references:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://docs.docker.com/compose/" rel="noopener noreferrer"&gt;Docker Compose Overview&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://compose-spec.io/" rel="noopener noreferrer"&gt;Compose Specification&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Compose gives you:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;reproducible local environments
&lt;/li&gt;
&lt;li&gt;consistent dependency wiring
&lt;/li&gt;
&lt;li&gt;portable deployment artifacts
&lt;/li&gt;
&lt;li&gt;scalable service definitions
&lt;/li&gt;
&lt;li&gt;security controls through isolation
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And most importantly:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;No new workflow.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  A Minimal &lt;code&gt;docker-compose.yml&lt;/code&gt; for an AI Agent
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;3.9"&lt;/span&gt;

&lt;span class="na"&gt;services&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;agent&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;build&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;./agent&lt;/span&gt;
    &lt;span class="na"&gt;container_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ai-agent&lt;/span&gt;
    &lt;span class="na"&gt;ports&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;8080:8080"&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;MODEL_PROVIDER=openai&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;MCP_SERVER_URL=http://mcp-server:9000&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;REDIS_HOST=redis&lt;/span&gt;
    &lt;span class="na"&gt;depends_on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;redis&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;mcp-server&lt;/span&gt;

  &lt;span class="na"&gt;redis&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;redis:7&lt;/span&gt;
    &lt;span class="na"&gt;container_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;agent-memory&lt;/span&gt;
    &lt;span class="na"&gt;ports&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;6379:6379"&lt;/span&gt;

  &lt;span class="na"&gt;mcp-server&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;myorg/mcp-tool-server:latest&lt;/span&gt;
    &lt;span class="na"&gt;container_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;mcp-gateway&lt;/span&gt;
    &lt;span class="na"&gt;ports&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;9000:9000"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run everything locally:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose up &lt;span class="nt"&gt;--build&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Adding Local Model Execution (Docker Model Runner)
&lt;/h2&gt;

&lt;p&gt;Docker now supports running open-source models locally through &lt;strong&gt;Docker Model Runner&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://docs.docker.com/ai/model-runner/" rel="noopener noreferrer"&gt;Docker Model Runner Documentation&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
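
&lt;p&gt;As a sketch of how a local model could plug into the Compose file above, recent Compose versions support a top-level &lt;code&gt;models&lt;/code&gt; element (the model name below is illustrative, and this requires Docker Model Runner to be enabled):&lt;/p&gt;

```yaml
# Sketch: attach a locally-run model to the agent service.
# Requires Docker Model Runner; the model name is illustrative.
services:
  agent:
    build: ./agent
    models:
      - llm

models:
  llm:
    model: ai/llama3.2
```

&lt;p&gt;Compose can then inject the model's endpoint details into the service environment, so the agent targets the local model the same way it would a hosted API.&lt;/p&gt;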

&lt;p&gt;This makes it easy to test agentic workflows without sending data to external providers.&lt;/p&gt;




&lt;h2&gt;
  
  
  Tool Integration with MCP Servers
&lt;/h2&gt;

&lt;p&gt;The &lt;strong&gt;Model Context Protocol (MCP)&lt;/strong&gt; is emerging as a standard way for AI agents to connect to tools and services.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://modelcontextprotocol.io/" rel="noopener noreferrer"&gt;Model Context Protocol (MCP) Official Site&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Docker is also building MCP-native infrastructure, including the MCP Gateway and an MCP server catalog on Docker Hub.&lt;/p&gt;




&lt;h2&gt;
  
  
  From Development to Production
&lt;/h2&gt;

&lt;p&gt;Docker is pushing Compose into production workflows through:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://www.docker.com/products/docker-offload/" rel="noopener noreferrer"&gt;Docker Offload&lt;/a&gt; (GPU-backed remote engines)
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.docker.com/products/docker-cloud/" rel="noopener noreferrer"&gt;Docker Cloud&lt;/a&gt; deployments
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.docker.com/cloud/" rel="noopener noreferrer"&gt;Docker Cloud Documentation&lt;/a&gt; for deploying Compose apps
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Now the same Compose file can support:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;laptop development
&lt;/li&gt;
&lt;li&gt;staging deployment
&lt;/li&gt;
&lt;li&gt;production rollout
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Production Considerations for AI Agents
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Secrets Management
&lt;/h3&gt;

&lt;p&gt;Never hardcode API keys.&lt;/p&gt;

&lt;p&gt;Use:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://docs.docker.com/engine/swarm/secrets/" rel="noopener noreferrer"&gt;Docker Secrets&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.aws.amazon.com/secretsmanager/" rel="noopener noreferrer"&gt;AWS Secrets Manager&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
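
&lt;p&gt;As a minimal sketch (the secret name and file path are illustrative), Compose can mount keys as secrets instead of baking them into images or environment variables:&lt;/p&gt;

```yaml
# Sketch: the agent reads the key from /run/secrets/openai_api_key
# at runtime; the secret name and file path are illustrative.
services:
  agent:
    build: ./agent
    secrets:
      - openai_api_key

secrets:
  openai_api_key:
    file: ./secrets/openai_api_key.txt
```

&lt;p&gt;Compose mounts each secret as a read-only file under &lt;code&gt;/run/secrets/&lt;/code&gt; inside the container, keeping it out of &lt;code&gt;docker inspect&lt;/code&gt; output and image layers.&lt;/p&gt;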




&lt;h3&gt;
  
  
  Network Isolation
&lt;/h3&gt;

&lt;p&gt;Agents should not have unrestricted outbound access.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://docs.docker.com/compose/networking/" rel="noopener noreferrer"&gt;Compose Networking Guide&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
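
&lt;p&gt;One way to enforce this in Compose (a sketch, with illustrative network names): keep the agent on an &lt;code&gt;internal&lt;/code&gt; network with no outbound routing, and let only the gateway join a routable network for controlled egress:&lt;/p&gt;

```yaml
# Sketch: the agent lives on an internal-only network; only the
# MCP gateway also joins a routable network for controlled egress.
services:
  agent:
    build: ./agent
    networks:
      - agent-net
  mcp-server:
    image: myorg/mcp-tool-server:latest
    networks:
      - agent-net
      - egress

networks:
  agent-net:
    internal: true   # no outbound routing from this network
  egress: {}
```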




&lt;h3&gt;
  
  
  Least Privilege Tool Access
&lt;/h3&gt;

&lt;p&gt;Your MCP gateway should enforce scoped permissions and audit logging.&lt;/p&gt;

&lt;p&gt;On AWS, follow:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html" rel="noopener noreferrer"&gt;IAM Security Best Practices&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  Observability Built In
&lt;/h3&gt;

&lt;p&gt;Agents are long-running systems.&lt;/p&gt;

&lt;p&gt;Add logging + tracing early:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://opentelemetry.io/" rel="noopener noreferrer"&gt;OpenTelemetry&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.aws.amazon.com/xray/" rel="noopener noreferrer"&gt;AWS X-Ray&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
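
&lt;p&gt;A minimal sketch of wiring this into Compose, using the standard OpenTelemetry environment variables (the collector image tag and endpoint values are illustrative):&lt;/p&gt;

```yaml
# Sketch: standard OpenTelemetry env vars point the agent's SDK
# at a local collector; service names and endpoints are illustrative.
services:
  agent:
    build: ./agent
    environment:
      - OTEL_SERVICE_NAME=ai-agent
      - OTEL_EXPORTER_OTLP_ENDPOINT=http://otel-collector:4317

  otel-collector:
    image: otel/opentelemetry-collector:latest
    ports:
      - "4317:4317"
```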




&lt;h2&gt;
  
  
  Secure Supply Chain for Agent Containers
&lt;/h2&gt;

&lt;p&gt;AI agents must run on trusted, secure base images.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://www.docker.com/products/docker-hardened-images/" rel="noopener noreferrer"&gt;Docker Hardened Images&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.docker.com/scout/" rel="noopener noreferrer"&gt;Docker Scout&lt;/a&gt; for vulnerability scanning
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Final Thought
&lt;/h2&gt;

&lt;p&gt;AI agents are becoming core application infrastructure.&lt;/p&gt;

&lt;p&gt;Docker Compose provides the missing workflow layer:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Compose. Build. Deploy. Agents, from dev to prod.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>docker</category>
      <category>ai</category>
      <category>agents</category>
      <category>containers</category>
    </item>
    <item>
      <title>Using a Docker Sandbox for a Coding Agent</title>
      <dc:creator>Jasdeep Singh Bhalla</dc:creator>
      <pubDate>Tue, 03 Feb 2026 23:57:09 +0000</pubDate>
      <link>https://dev.to/jasdeepsinghbhalla/using-a-docker-sandbox-for-a-coding-agent-3ccb</link>
      <guid>https://dev.to/jasdeepsinghbhalla/using-a-docker-sandbox-for-a-coding-agent-3ccb</guid>
      <description>&lt;p&gt;This guide shows a concrete, end-to-end example of how &lt;strong&gt;Docker Sandboxes&lt;/strong&gt; can be created and used to safely run autonomous coding agents that can install packages, modify files, and even run Docker — without touching your host machine.&lt;/p&gt;




&lt;h2&gt;
  
  
  Scenario
&lt;/h2&gt;

&lt;p&gt;You want to let a coding agent:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Modify a real codebase
&lt;/li&gt;
&lt;li&gt;Install system dependencies
&lt;/li&gt;
&lt;li&gt;Build and run containers
&lt;/li&gt;
&lt;li&gt;Run unattended with permissive flags
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;…but &lt;strong&gt;without risking your laptop or credentials&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Docker Sandboxes solve this by running the agent inside a disposable &lt;strong&gt;microVM&lt;/strong&gt; with only your project workspace mounted.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 1: Create a Sandbox
&lt;/h2&gt;

&lt;p&gt;Create a new sandbox using your local project directory as the workspace:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker sandbox create   &lt;span class="nt"&gt;--name&lt;/span&gt; agent-sandbox   &lt;span class="nt"&gt;--workspace&lt;/span&gt; ./my-project
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;What this does:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Creates a dedicated microVM
&lt;/li&gt;
&lt;li&gt;Mounts only &lt;code&gt;./my-project&lt;/code&gt; into the sandbox
&lt;/li&gt;
&lt;li&gt;Keeps your OS, home directory, and secrets isolated
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Step 2: Enter the Sandbox
&lt;/h2&gt;

&lt;p&gt;Start an interactive shell inside the sandbox:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker sandbox &lt;span class="nb"&gt;exec &lt;/span&gt;agent-sandbox bash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You are now &lt;strong&gt;inside the sandbox&lt;/strong&gt;, not your host machine.&lt;/p&gt;

&lt;p&gt;From this point on:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Any package installs
&lt;/li&gt;
&lt;li&gt;Any config changes
&lt;/li&gt;
&lt;li&gt;Any Docker commands
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;…are fully isolated.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 3: Run a Coding Agent (Unattended)
&lt;/h2&gt;

&lt;p&gt;Inside the sandbox, run your coding agent in permissive mode:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;claude-code run   &lt;span class="nt"&gt;--dangerously-skip-permissions&lt;/span&gt;   &lt;span class="nt"&gt;--project&lt;/span&gt; /workspace
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Why this is safe:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The agent runs inside a microVM
&lt;/li&gt;
&lt;li&gt;Only the project directory is writable
&lt;/li&gt;
&lt;li&gt;No access to your host OS, SSH keys, or credentials
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is the &lt;strong&gt;intended workflow&lt;/strong&gt; for Docker Sandboxes.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 4: Let the Agent Install Dependencies
&lt;/h2&gt;

&lt;p&gt;The agent can freely modify the environment:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;apt-get update
apt-get &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-y&lt;/span&gt; nodejs npm
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;No permission prompts.&lt;br&gt;&lt;br&gt;
No approval loops.&lt;br&gt;&lt;br&gt;
No risk to your machine.&lt;/p&gt;


&lt;h2&gt;
  
  
  Step 5: Let the Agent Use Docker
&lt;/h2&gt;

&lt;p&gt;Inside the sandbox, the agent can build and run containers:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker build &lt;span class="nt"&gt;-t&lt;/span&gt; my-app &lt;span class="nb"&gt;.&lt;/span&gt;
docker run &lt;span class="nt"&gt;-p&lt;/span&gt; 8080:8080 my-app
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Important notes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;This does &lt;strong&gt;not&lt;/strong&gt; use your host Docker daemon
&lt;/li&gt;
&lt;li&gt;Containers run entirely inside the sandbox microVM
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This capability is what makes Docker Sandboxes fundamentally different from regular containers.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 6: Review the Results on the Host
&lt;/h2&gt;

&lt;p&gt;Exit the sandbox and inspect the changes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git diff
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You’ll see only the &lt;strong&gt;intentional code changes&lt;/strong&gt; made by the agent.&lt;/p&gt;

&lt;p&gt;No stray system packages.&lt;br&gt;&lt;br&gt;
No modified OS files.&lt;br&gt;&lt;br&gt;
No lingering background processes.&lt;/p&gt;


&lt;h2&gt;
  
  
  Step 7: Delete the Sandbox
&lt;/h2&gt;

&lt;p&gt;When you’re done, delete the sandbox:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker sandbox delete agent-sandbox
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The microVM is destroyed immediately.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The environment is wiped
&lt;/li&gt;
&lt;li&gt;Nothing persists except your code changes
&lt;/li&gt;
&lt;li&gt;You start clean the next time
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Why This Pattern Works
&lt;/h2&gt;

&lt;p&gt;Docker Sandboxes give agents:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A real operating system
&lt;/li&gt;
&lt;li&gt;Package managers and system tools
&lt;/li&gt;
&lt;li&gt;Docker access
&lt;/li&gt;
&lt;li&gt;Full autonomy
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;While giving you:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Strong isolation via microVMs
&lt;/li&gt;
&lt;li&gt;Disposable environments
&lt;/li&gt;
&lt;li&gt;Zero host contamination
&lt;/li&gt;
&lt;li&gt;Confidence to use permissive agent modes
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  When to Use Docker Sandboxes
&lt;/h2&gt;

&lt;p&gt;This pattern is ideal when:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Running coding agents unattended
&lt;/li&gt;
&lt;li&gt;Using flags like &lt;code&gt;--dangerously-skip-permissions&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Allowing agents to install tools dynamically
&lt;/li&gt;
&lt;li&gt;Letting agents build and run containers
&lt;/li&gt;
&lt;li&gt;Experimenting aggressively without fear
&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;Agents need freedom.&lt;br&gt;&lt;br&gt;
Your machine doesn’t.&lt;/p&gt;

</description>
      <category>docker</category>
      <category>containers</category>
      <category>ai</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Skip the Cloud, Not the Control: Running AI Models Locally with Docker Model Runner</title>
      <dc:creator>Jasdeep Singh Bhalla</dc:creator>
      <pubDate>Tue, 03 Feb 2026 23:52:15 +0000</pubDate>
      <link>https://dev.to/jasdeepsinghbhalla/skip-the-cloud-not-the-control-running-ai-models-locally-with-docker-model-runner-aef</link>
      <guid>https://dev.to/jasdeepsinghbhalla/skip-the-cloud-not-the-control-running-ai-models-locally-with-docker-model-runner-aef</guid>
      <description>&lt;p&gt;AI development is moving fast—but for many teams, the default workflow still means shipping data to the cloud, managing tokens, and worrying about privacy, latency, and cost. What if you could &lt;strong&gt;run powerful AI models locally&lt;/strong&gt;, using the same Docker tools you already trust in production?&lt;/p&gt;

&lt;p&gt;That’s exactly what &lt;strong&gt;&lt;a href="https://www.docker.com/products/docker-model-runner/" rel="noopener noreferrer"&gt;Docker Model Runner&lt;/a&gt;&lt;/strong&gt; enables.&lt;/p&gt;

&lt;p&gt;In this post, we’ll walk through:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What Docker Model Runner is
&lt;/li&gt;
&lt;li&gt;Why running models locally matters
&lt;/li&gt;
&lt;li&gt;How to run AI models with a single Docker command
&lt;/li&gt;
&lt;li&gt;How it fits naturally into real production and CI/CD workflows
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Why Local-First AI Matters
&lt;/h2&gt;

&lt;p&gt;Cloud-based LLM APIs are convenient—but they come with tradeoffs:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;💸 &lt;strong&gt;Token costs add up quickly&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;🔒 &lt;strong&gt;Sensitive data leaves your machine&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;🌐 &lt;strong&gt;Latency and rate limits slow iteration&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;⚙️ &lt;strong&gt;Limited control over model behavior&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Running models locally flips that equation. You keep full ownership of your data, avoid per-request costs, and iterate faster—especially during development and testing.&lt;/p&gt;

&lt;p&gt;Docker Model Runner is designed to make that local-first approach simple.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Is Docker Model Runner?
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Docker Model Runner&lt;/strong&gt; lets you &lt;strong&gt;run AI models locally using familiar Docker CLI commands&lt;/strong&gt;. Models are packaged and distributed as &lt;strong&gt;OCI artifacts&lt;/strong&gt;, meaning they work seamlessly with existing Docker infrastructure like &lt;strong&gt;&lt;a href="https://hub.docker.com/" rel="noopener noreferrer"&gt;Docker Hub&lt;/a&gt;&lt;/strong&gt;, &lt;strong&gt;&lt;a href="https://docs.docker.com/compose/" rel="noopener noreferrer"&gt;Docker Compose&lt;/a&gt;&lt;/strong&gt;, and CI pipelines.&lt;/p&gt;

&lt;p&gt;It supports:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Any &lt;strong&gt;OCI-compliant registry&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Popular open-source LLMs&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OpenAI-compatible APIs&lt;/strong&gt; for easy app integration&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Native GPU acceleration&lt;/strong&gt; for high-performance inference&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All without reinventing your toolchain.&lt;/p&gt;




&lt;h2&gt;
  
  
  Running Your First Model
&lt;/h2&gt;

&lt;p&gt;If you already use Docker, you’re 90% of the way there.&lt;/p&gt;

&lt;p&gt;Running a model locally is as simple as:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker model run &amp;lt;model-name&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That’s it.&lt;/p&gt;

&lt;p&gt;Docker Model Runner pulls the model from an OCI registry, initializes it locally, and exposes an inference endpoint you can immediately start using.&lt;/p&gt;

&lt;p&gt;No Python environments.&lt;br&gt;&lt;br&gt;
No custom scripts.&lt;br&gt;&lt;br&gt;
No fragile dependencies.&lt;/p&gt;

&lt;p&gt;For a full walkthrough, see the &lt;strong&gt;&lt;a href="https://docs.docker.com/ai/model-runner/" rel="noopener noreferrer"&gt;Docker Model Runner Quick Start Guide&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  Models Ready to Go
&lt;/h2&gt;

&lt;p&gt;You can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Explore a &lt;strong&gt;curated catalog of open-source AI models&lt;/strong&gt; on &lt;strong&gt;&lt;a href="https://hub.docker.com/search?q=model&amp;amp;type=model" rel="noopener noreferrer"&gt;Docker Hub&lt;/a&gt;&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Pull models directly from &lt;strong&gt;&lt;a href="https://huggingface.co/models" rel="noopener noreferrer"&gt;Hugging Face&lt;/a&gt;&lt;/strong&gt; using OCI-compatible workflows
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Because models are OCI artifacts, they’re:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Versioned&lt;/li&gt;
&lt;li&gt;Portable&lt;/li&gt;
&lt;li&gt;Easy to share across teams&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This makes collaboration and reproducibility dramatically simpler.&lt;/p&gt;




&lt;h2&gt;
  
  
  Easy Integration with Your Apps
&lt;/h2&gt;

&lt;p&gt;Docker Model Runner supports &lt;strong&gt;OpenAI-compatible APIs&lt;/strong&gt;, which means many existing apps work out of the box.&lt;/p&gt;

&lt;p&gt;You can connect it to frameworks like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://spring.io/projects/spring-ai" rel="noopener noreferrer"&gt;Spring AI&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://www.langchain.com/" rel="noopener noreferrer"&gt;LangChain&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://github.com/open-webui/open-webui" rel="noopener noreferrer"&gt;OpenWebUI&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Your app talks to a local endpoint—but behaves as if it’s using a hosted API.&lt;/p&gt;

&lt;p&gt;This makes swapping between local development and production workflows painless.&lt;/p&gt;
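
&lt;p&gt;One way to wire this up (a sketch; the variable names and model are illustrative): recent Compose versions can bind a local model to a service and surface its endpoint through the environment variables an OpenAI-compatible client already reads:&lt;/p&gt;

```yaml
# Sketch: bind a local model to the app and expose its endpoint via
# the env vars an OpenAI-compatible client reads (names illustrative).
services:
  app:
    build: .
    models:
      llm:
        endpoint_var: OPENAI_BASE_URL
        model_var: OPENAI_MODEL

models:
  llm:
    model: ai/gemma3
```

&lt;p&gt;The app then points its OpenAI client at &lt;code&gt;OPENAI_BASE_URL&lt;/code&gt;; switching to a hosted provider later is just a change of environment variables.&lt;/p&gt;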




&lt;h2&gt;
  
  
  GPU Acceleration Without the Headaches
&lt;/h2&gt;

&lt;p&gt;For teams running on capable hardware, Docker Model Runner supports &lt;strong&gt;native GPU acceleration&lt;/strong&gt;, unlocking fast, efficient inference on your local machine.&lt;/p&gt;

&lt;p&gt;No manual CUDA setup.&lt;br&gt;&lt;br&gt;
No driver gymnastics.&lt;br&gt;&lt;br&gt;
Just Docker doing what it does best: abstracting complexity.&lt;/p&gt;

&lt;p&gt;Learn more about GPU support in &lt;strong&gt;&lt;a href="https://docs.docker.com/desktop/gpu/" rel="noopener noreferrer"&gt;Docker Desktop&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  Built for Real Production Workflows
&lt;/h2&gt;

&lt;p&gt;Docker Model Runner isn’t just a dev toy—it’s designed to scale across teams:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use &lt;strong&gt;&lt;a href="https://docs.docker.com/compose/" rel="noopener noreferrer"&gt;Docker Compose&lt;/a&gt;&lt;/strong&gt; for multi-service applications
&lt;/li&gt;
&lt;li&gt;Integrate with &lt;strong&gt;&lt;a href="https://testcontainers.com/" rel="noopener noreferrer"&gt;Testcontainers&lt;/a&gt;&lt;/strong&gt; for AI-powered testing
&lt;/li&gt;
&lt;li&gt;Package and publish models securely to &lt;strong&gt;&lt;a href="https://hub.docker.com/" rel="noopener noreferrer"&gt;Docker Hub&lt;/a&gt;&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Manage access and permissions for enterprise teams
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Because it’s Docker-native, it fits naturally into CI/CD pipelines and existing governance models.&lt;/p&gt;




&lt;h2&gt;
  
  
  When Should You Use Docker Model Runner?
&lt;/h2&gt;

&lt;p&gt;Docker Model Runner is ideal when you want to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Prototype AI features without cloud costs
&lt;/li&gt;
&lt;li&gt;Keep sensitive data fully local
&lt;/li&gt;
&lt;li&gt;Test models before production deployment
&lt;/li&gt;
&lt;li&gt;Standardize AI workflows across teams
&lt;/li&gt;
&lt;li&gt;Avoid vendor lock-in
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you already trust Docker in production, this is the missing piece for AI.&lt;/p&gt;




&lt;h2&gt;
  
  
  Get Started Today
&lt;/h2&gt;

&lt;p&gt;Local AI doesn’t have to be complicated.&lt;/p&gt;

&lt;p&gt;With Docker Model Runner, you can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run LLMs locally
&lt;/li&gt;
&lt;li&gt;Keep control of your data
&lt;/li&gt;
&lt;li&gt;Cut costs
&lt;/li&gt;
&lt;li&gt;Use the Docker tools you already know
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;👉 &lt;strong&gt;&lt;a href="https://www.docker.com/products/docker-model-runner/" rel="noopener noreferrer"&gt;Try Docker Model Runner&lt;/a&gt;&lt;/strong&gt; and bring AI development into your local workflow.&lt;/p&gt;

&lt;p&gt;Hassle-free local inference starts here 🚀&lt;/p&gt;

</description>
      <category>docker</category>
      <category>programming</category>
      <category>ai</category>
      <category>mcp</category>
    </item>
    <item>
      <title>Running Claude Code with Docker</title>
      <dc:creator>Jasdeep Singh Bhalla</dc:creator>
      <pubDate>Tue, 03 Feb 2026 23:42:27 +0000</pubDate>
      <link>https://dev.to/jasdeepsinghbhalla/running-claude-code-with-docker-7fj</link>
      <guid>https://dev.to/jasdeepsinghbhalla/running-claude-code-with-docker-7fj</guid>
      <description>&lt;p&gt;&lt;em&gt;&lt;em&gt;A beginner-friendly setup for private, on-device coding agents&lt;/em&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Modern AI coding tools are powerful—but many developers don’t want their source code leaving their machine. The good news is that Claude Code can run entirely &lt;strong&gt;locally&lt;/strong&gt;, powered by Docker.&lt;/p&gt;

&lt;p&gt;This guide shows how to connect &lt;a href="https://claude.ai/" rel="noopener noreferrer"&gt;Claude Code&lt;/a&gt; to &lt;a href="https://www.docker.com/products/model-runner/" rel="noopener noreferrer"&gt;Docker Model Runner&lt;/a&gt; so you can use a local language model for agentic coding, with minimal setup and full control over your data.&lt;/p&gt;




&lt;h2&gt;
  
  
  What This Setup Does
&lt;/h2&gt;

&lt;p&gt;Instead of sending prompts to a hosted API, Claude Code talks to a &lt;strong&gt;local model running inside Docker&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;High-level flow:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Claude Code (terminal)
        ↓
Docker Model Runner
        ↓
Local LLM container
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Your repository stays local. No API keys required.&lt;/p&gt;




&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Make sure you have:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Docker Desktop installed and running
&lt;/li&gt;
&lt;li&gt;Claude Code installed on your system
&lt;/li&gt;
&lt;li&gt;At least 16 GB of RAM (recommended)&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Step 1: Install Claude Code
&lt;/h2&gt;

&lt;h3&gt;
  
  
  macOS / Linux
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;-fsSL&lt;/span&gt; https://claude.ai/install.sh | bash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Windows (PowerShell)
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="n"&gt;irm&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;https://claude.ai/install.ps1&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;iex&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Verify installation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;claude &lt;span class="nt"&gt;--version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Step 2: Enable Docker Model Runner
&lt;/h2&gt;

&lt;p&gt;Docker Model Runner lets you pull, run, and serve large language models locally with Docker.&lt;/p&gt;

&lt;p&gt;If you’re using Docker Desktop, enable TCP access:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker desktop &lt;span class="nb"&gt;enable &lt;/span&gt;model-runner &lt;span class="nt"&gt;--tcp&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once enabled, the local API will be available at:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;http://localhost:12434
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You only need to do this once.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 3: Pull a Local Model
&lt;/h2&gt;

&lt;p&gt;Download a model using Docker’s model CLI. For example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker model pull gpt-oss
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;List available models:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker model &lt;span class="nb"&gt;ls&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Step 4: Increase the Context Window
&lt;/h2&gt;

&lt;p&gt;For real-world repositories, a larger context window helps significantly.&lt;/p&gt;

&lt;p&gt;Create a new model variant with a bigger context (example: 32K tokens):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker model package   &lt;span class="nt"&gt;--from&lt;/span&gt; ai/gpt-oss   &lt;span class="nt"&gt;--context-size&lt;/span&gt; 32000   gpt-oss:32k
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This creates a new tagged model without modifying the original.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 5: Connect Claude Code to Docker
&lt;/h2&gt;

&lt;p&gt;Claude Code supports custom API endpoints via an environment variable.&lt;/p&gt;

&lt;p&gt;Run Claude Code like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;ANTHROPIC_BASE_URL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:12434 claude &lt;span class="nt"&gt;--model&lt;/span&gt; gpt-oss:32k &lt;span class="s2"&gt;"Summarize this repository."&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Claude Code now sends all requests to your &lt;strong&gt;local Docker model&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 6: Make the Setup Persistent
&lt;/h2&gt;

&lt;p&gt;To avoid setting the environment variable every time, add it to your shell profile.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# ~/.bashrc or ~/.zshrc&lt;/span&gt;
&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;ANTHROPIC_BASE_URL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:12434
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now you can simply run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;claude &lt;span class="nt"&gt;--model&lt;/span&gt; gpt-oss:32k &lt;span class="s2"&gt;"Explain the main service logic."&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Optional: Inspect Claude Code Requests
&lt;/h2&gt;

&lt;p&gt;You can inspect the raw requests sent by Claude Code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker model requests &lt;span class="nt"&gt;--model&lt;/span&gt; gpt-oss:32k | jq &lt;span class="nb"&gt;.&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is useful for debugging, learning prompt structure, and understanding how agentic coding tools work.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Run Claude Code Locally?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Full control over your source code
&lt;/li&gt;
&lt;li&gt;No API keys or usage-based billing
&lt;/li&gt;
&lt;li&gt;Offline-friendly development
&lt;/li&gt;
&lt;li&gt;Reproducible setup with Docker
&lt;/li&gt;
&lt;li&gt;Custom models and context sizes
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Next Steps
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Try different local coding models
&lt;/li&gt;
&lt;li&gt;Tune context size for large repositories
&lt;/li&gt;
&lt;li&gt;Pair with Docker sandboxes for safe execution
&lt;/li&gt;
&lt;li&gt;Use this setup as a foundation for custom coding agents
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Happy hacking 🚀&lt;/p&gt;

</description>
      <category>docker</category>
      <category>claudecode</category>
      <category>programming</category>
      <category>ai</category>
    </item>
  </channel>
</rss>
