DEV Community

Sami Chibani

CI/CD in the Era of AI and Platform Engineering: A Deep Dive into Dagger CI (Part 1)

Part 1: The CI/CD Bottleneck Nobody Talks About

Platform Engineering matured how we provision infrastructure. It hasn't touched how we build and deploy code.

A decade ago, CI/CD was at the heart of the DevOps revolution. It deeply transformed the way we ship software by providing continuous automation workflows from code to production. It hasn't changed much since then. And like every tool in continuous use for that long, the longer it ages, the more obvious its flaws and frustrations become. The "traditional" way of building CI/CD today often feels cumbersome and rusty.

Among the recurring flaws that almost every DevOps engineer has witnessed:

  • Lack of development ergonomics. For most technologies, the only way to really test pipelines is by pushing the code to the server, waiting, and iterating. It's inefficient and time-consuming. The feedback loop for CI/CD code is worse than for any other code in your organization.

  • YAML sprawl. As soon as pipelines require more logic and complexity, you rapidly end up with thousands of YAML files to maintain across repositories, each slightly different, each owned by whoever touched it last.

  • Opaque failure investigation. When things go wrong in a pipeline, there is no intelligence. Developers sift through thousands of log lines to manually investigate, with no programmatic way to diagnose or recover.

Meanwhile, Infrastructure-as-Code solved provisioning. GitOps solved configuration drift. Internal Developer Platforms gave teams golden paths for spinning up services, databases, and environments. But open any repository in your organization and look at the CI/CD: it's still bespoke YAML. In a world where platform teams build abstractions for everything else, CI/CD remains the one piece of developer infrastructure that every team reinvents from scratch.

This series explores what a CI/CD pipeline could look like in 2026, one that addresses all three of these flaws, at the intersection of Platform Engineering and AI agents. To do that, we'll use one of the most exciting tools I've seen in the CI/CD landscape in a long time: Dagger. Dagger reimagines pipelines as typed functions written in real programming languages, running inside containers with deterministic caching, portable across any CI engine. But what makes it particularly relevant for this series is that it doesn't stop at build automation: it has native LLM integration, a composable module system, and an emerging agent ecosystem that lets you go from reusable pipeline modules to AI-powered CI/CD workflows. It's the right tool to explore what happens when Platform Engineering meets AI agents in the CI/CD space.

We'll build pipelines as real code, decouple them from the infrastructure they run on, package them as reusable modules, and eventually compose AI agents that handle review, build, and deployment from a single natural language command.

The Series at a Glance

This is a 4-part series. Each part builds on the previous one, progressively transforming how we think about CI/CD:

  1. Part 1 (this article): Pipelines as real code. We replace YAML with typed, testable, locally-runnable pipeline functions using Dagger, and build a complete CI/CD pipeline for an Angular + FastAPI stack deployed to Google Cloud.

  2. Part 2: Decoupling Pipelines from Infrastructure: Same pipeline code, different runners. We explore three options for scaling: GitHub Actions (free tier), managed runners with Depot.dev, and self-hosted Kubernetes runners with ARC, including a Terraform module for production deployments.

  3. Part 3: From Scripts to a Platform: Your CI/CD Module Library: Modules as the unit of CI/CD reuse. We build a two-layer architecture: public daggerverse modules for generic operations (GCP Auth, Cloud Run, Firebase) wrapped by private organizational modules that enforce compliance, security, and conventions.

  4. Part 4: The AI-Native CI/CD Stack: Agents, Modules, and Spec-Driven Development: We leverage the agentic capabilities of Dagger to enhance the developer experience, automate CI failure remediation, and explore what an Internal Developer Factory looks like when built on Spec-Driven Development principles. The pipeline itself stays fast and deterministic, with no LLM in the hot path.

But first: let's look at the root cause.


The YAML Illusion

YAML-based CI/CD looks simple. A few lines, a run: step, and you have a pipeline. But that simplicity is an illusion. The moment a real-world project needs conditional logic, secret management, reusable workflows, matrix builds, or cross-repository dependencies, every CI platform reaches for its own layer of abstractions on top of YAML.

GitHub Actions has reusable workflows, composite actions, and a marketplace of third-party actions you import by reference. GitLab CI has include:, extends:, and multi-project pipelines. CircleCI has orbs. Azure DevOps has templates and task groups. Each platform reinvented inheritance, composition, and parameterization (poorly) on top of a configuration format that was never designed for it.

The result is a system that is declarative in appearance but imperative in practice. A developer looking at a .github/workflows/deploy.yml that imports three reusable workflows, each with their own if: conditions and secret references, is not reading a simple configuration file. They're reading a distributed program written in a language with no type system, no IDE support, no debugger, and no way to run it locally.

This is the fundamental problem: CI/CD, as it exists today, is not part of the software factory. It sits outside the development workflow. It can't be run on a developer's machine. It can't be unit tested. It can't be refactored with confidence. It can't be debugged step by step. It doesn't benefit from the tooling, the practices, or the discipline that every other piece of software in the organization does.

And yet it's the most critical automation in your delivery chain, the one that decides whether code reaches production.


CI/CD as Software

The fix isn't a better YAML syntax. It's a paradigm shift: treat CI/CD as an integral part of the software factory.

That means CI/CD primitives (building a container, running tests, publishing an artifact, deploying a service) should be programmatic abstractions. Written in real programming languages. With types, functions, imports, error handling. Runnable on a developer's laptop. Testable. Debuggable. Versioned alongside the application code. Reviewable in the same pull request.

This is what Dagger does. It decomposes CI/CD operations into typed functions, available through SDKs in Python, Go, and TypeScript. A pipeline is not a configuration file interpreted by a remote server. It's code that runs inside containers, with content-addressed caching at every step.

A note on language choice — Throughout this series, we'll use Python for all our examples. But everything we write here works identically in Go, TypeScript, or Java. Dagger's SDKs expose the same API surface and the same types across all supported languages — the pipeline behavior is determined by the engine, not the SDK. Pick the language your team is most comfortable with.

from typing import Annotated

import dagger
from dagger import Doc, dag, function, object_type


@object_type
class MyPipeline:
    @function
    async def build(
        self,
        source: Annotated[dagger.Directory, Doc("Application source code")],
    ) -> dagger.Container:
        """Build the application container."""
        return (
            dag.container()
            .from_("python:3.12-slim")
            .with_directory("/app", source)
            .with_workdir("/app")
            .with_exec(["pip", "install", "-r", "requirements.txt"])
            .with_entrypoint(["python", "main.py"])
        )

This is a typed, composable function that:

  • Runs on your laptop with dagger call build --source=.
  • Runs in GitHub Actions with the exact same command
  • Runs in GitLab CI with the exact same command
  • Can be debugged, unit tested, and refactored like any other code

No YAML translation. No platform-specific quirks. The same code, the same command, everywhere.

The shift is subtle but significant. CI/CD is no longer a separate discipline with its own language, its own debugging workflow, and its own specialists. It's software. Developers can read it, understand it, modify it, and run it, the same way they do with application code. That's the self-service promise of Platform Engineering applied to the one area it hadn't reached yet.


Three Properties That Make It Work

When CI/CD becomes software, three properties emerge that YAML pipelines fundamentally cannot provide.

1. Container-Based Execution

Every operation happens inside a container. When you write:

dag.container().from_("python:3.12-slim").with_exec(["pip", "install", "pytest"])

You're building a container execution graph. The engine takes this graph and executes it efficiently. Your pipeline is reproducible (same inputs, same outputs), isolated (no "works on my machine"), and portable (containers run anywhere).

This is what enables local execution. The same container graph that runs on a CI runner runs on your laptop. No emulation, no mocking, just the real thing. A developer can iterate on a pipeline the same way they iterate on application code: change, run, observe, repeat. The feedback loop goes from "push and wait 10 minutes" to "run and see in seconds."

2. Content-Addressed Caching

Every operation is cached based on the hash of its inputs:

@function
async def test(
    self,
    source: Annotated[dagger.Directory, Doc("Application source code")],
) -> str:
    """Run tests with intelligent caching."""
    return await (
        dag.container()
        .from_("python:3.12-slim")
        .with_directory("/app", source)       # Cached if source unchanged
        .with_exec(["pip", "install", "-r", "requirements.txt"])  # Cached if requirements unchanged
        .with_exec(["pytest"])                 # Only re-runs if dependencies changed
        .stdout()
    )

Change a test file? Only the test execution re-runs. Change requirements.txt? Dependencies reinstall, but the base image stays cached. This is content-addressed: Dagger hashes the actual content, not timestamps.

Because it's deterministic, caching works identically on your laptop and in CI. A developer running dagger call test --source=. after changing one test file gets the same instant feedback a CI runner would, with no cold start and no full rebuild.
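The principle can be illustrated with a few lines of plain Python. This is a toy model, not Dagger's actual implementation: each step's cache key is derived from the hash of its inputs, so a step is skipped exactly when the content it depends on is unchanged.

```python
import hashlib

# Toy content-addressed cache (illustrative only — Dagger's engine
# hashes real container graph nodes, not strings).
_cache: dict[str, str] = {}

def content_hash(*inputs: str) -> str:
    """Hash the actual content of the inputs, not timestamps or paths."""
    h = hashlib.sha256()
    for item in inputs:
        h.update(item.encode())
        h.update(b"\x00")  # separator, so ("a", "b") differs from ("ab",)
    return h.hexdigest()

def cached_step(name: str, *inputs: str) -> tuple[str, bool]:
    """Run a step unless its input hash is already cached.

    Returns (result, was_cache_hit).
    """
    key = content_hash(name, *inputs)
    if key in _cache:
        return _cache[key], True
    result = f"{name}:{key[:8]}"  # stand-in for doing real work
    _cache[key] = result
    return result, False

requirements = "fastapi==0.115\npytest==8.0"
test_file = "def test_ok(): assert True"

# First run: the step executes
_, hit = cached_step("pip-install", requirements)
assert hit is False

# Same inputs again: cache hit, nothing re-runs
_, hit = cached_step("pip-install", requirements)
assert hit is True

# A changed test file only invalidates the steps that depend on it
_, hit = cached_step("pytest", requirements, test_file)
assert hit is False
```

The same key derivation works wherever the cache lives, which is why a laptop and a CI runner can agree on what's already been built.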

3. A Module System

Modules are reusable pipeline components that can be published and shared:

# Use a published module
dagger call -m github.com/purpleclay/daggerverse/golang@v0.5.0 build --source=.

# Or reference your own
dagger call -m ./.dagger build --source=.

Modules have typed interfaces, documentation, and versioning. They compose:

@function
async def deploy(
    self,
    source: Annotated[dagger.Directory, Doc("Application source")],
) -> str:
    # Authenticate to GCP
    gcp = dag.gcp_auth().from_service_account_key(key=sa_key, project=project)

    # Publish to Artifact Registry
    image_uri = await dag.gcp_artifact_registry(gcp).publish(
        container=await self.build(source),
        repository="my-repo", image="my-app", tag="latest",
    )

    # Deploy to Cloud Run
    return await dag.gcp_cloud_run(gcp).service().deploy(
        name="my-service", image=image_uri, region="us-central1",
    )

This is the Platform Engineering pattern applied to CI/CD. A platform team builds and maintains modules (gcp-auth, artifact-registry, cloud-run) the same way they'd maintain Terraform modules or Helm charts. Developers consume them as typed dependencies. They don't need to know how GCP authentication works internally; they call dag.gcp_auth() and get an authenticated container. Self-service, with guardrails.


Putting It Into Practice

Enough theory. For the rest of this series, we're going to simulate, as closely as we can, a production deployment. Not a toy example with a single endpoint and a hello world container, but a realistic two-component product with a frontend, a backend, authentication between them, and deployment to a real cloud provider.

The goal is a deep dive: from writing the first pipeline function on your laptop, all the way to a multi-agent CI/CD system that reviews, builds, and deploys both apps from a single natural language command.

Prerequisites

To follow along, you'll need three things set up:

  1. Dagger CLI: Install the Dagger engine and CLI on your machine. This is what runs your pipelines locally and in CI.
    Install Dagger

  2. A GitHub account: We'll use GitHub as our code server and CI runner (via GitHub Actions). The companion repository with all example code is public.
    Create a GitHub account if you don't have one

  3. A Google Cloud account with a project: GCP is our deployment target throughout the series. You'll need a project with billing enabled (the free tier is sufficient for everything we do here).
    Create a GCP account
    Create a GCP project

Required Google APIs — Enable these APIs on your GCP project before deploying. You can do it from the API Library or via gcloud:

gcloud services enable \
  run.googleapis.com \
  artifactregistry.googleapis.com \
  firebasehosting.googleapis.com \
  firebase.googleapis.com \
  iam.googleapis.com \
  cloudresourcemanager.googleapis.com

  • Cloud Run API (run.googleapis.com) — deploy the FastAPI backend
  • Artifact Registry API (artifactregistry.googleapis.com) — store container images
  • Firebase Hosting API (firebasehosting.googleapis.com) — host the Angular frontend
  • Firebase API (firebase.googleapis.com) — Firebase Anonymous Auth for the frontend
  • IAM API (iam.googleapis.com) — manage service accounts and OIDC federation
  • Cloud Resource Manager API (cloudresourcemanager.googleapis.com) — project-level operations

You'll also want Docker installed locally (Dagger uses it under the hood), and Python 3.12+ and Node.js 20+ for the example apps themselves.

The Product: Angular Frontend + FastAPI Backend

Our product is two apps that talk to each other:

  • Angular 21 SPA on Firebase Hosting: displays items from the API, authenticates via Firebase Anonymous Auth
  • FastAPI REST API on Cloud Run: serves items, validates Firebase ID tokens from the frontend

The frontend authenticates anonymously with Firebase, gets an ID token, and sends it as a Bearer header to the backend. The backend validates the token using google-auth. This is a realistic pattern for any Firebase + Cloud Run stack.

Initializing the Module

Clone the companion repository:

git clone https://github.com/telchak/dagger-ci-demo.git
cd dagger-ci-demo

We have a monorepo with backend/ and frontend/. Rather than creating a separate Dagger module for each app, we'll initialize a single module at the repository root. This gives us one place to define pipeline functions for the entire product:

dagger init --sdk=python --name=dagger-ci-demo

This creates a .dagger/ directory (Dagger's convention, like .github/ for GitHub Actions) containing a dagger.json manifest, a pyproject.toml, and a source scaffold where we'll write our pipeline functions.

Key Concepts: Types, Functions, and Chaining

Before looking at the code, let's clarify three core Dagger concepts (docs):

Types. The Dagger API provides specialized types that represent CI/CD primitives. The ones we'll use most:

  • Container: an OCI container image with its filesystem and configuration
  • Directory: a filesystem tree (your source code, build output, etc.)
  • File: a single file artifact
  • Secret: sensitive data (API keys, credentials) that Dagger never logs or caches
  • Service: a running network service (for integration testing, local dev, etc.)

These types are the building blocks. They replace the implicit, untyped operations of YAML pipelines with explicit, typed abstractions.


Image generated with Google's Gemini Imagen "Nano Banana Pro"

Functions. A Dagger module exposes functions, decorated with @function, that accept these types as parameters and return them as outputs. Functions are the unit of work. Each one is a self-contained operation: build a container, run tests, publish an image.

Chaining. Each Dagger type comes with its own methods, and these methods return the same (or related) types. This enables chaining: you pass the output of one operation directly into the next, building up a pipeline step by step. When you write dag.container().from_("python:3.12-slim").with_exec(["pip", "install", "..."]), you're chaining three operations on a Container type. Dagger evaluates this chain lazily: nothing executes until a terminal operation (like .stdout() or .publish()) triggers the full graph.
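A toy model in plain Python makes the lazy-chaining idea concrete (purely illustrative — Dagger's engine does this with a real container graph): each chained call returns a new immutable object carrying the accumulated operations, and only the terminal method resolves them.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LazyContainer:
    """Toy stand-in for dagger's Container: records operations, runs nothing."""
    ops: tuple[str, ...] = ()

    def from_(self, image: str) -> "LazyContainer":
        # Each call returns a NEW object — the original chain is untouched
        return LazyContainer(self.ops + (f"FROM {image}",))

    def with_exec(self, args: list[str]) -> "LazyContainer":
        return LazyContainer(self.ops + (f"EXEC {' '.join(args)}",))

    def stdout(self) -> str:
        # Terminal operation: only here does the accumulated chain "execute"
        return "\n".join(self.ops)

chain = LazyContainer().from_("python:3.12-slim").with_exec(["pytest"])
# Building the chain executed nothing; stdout() resolves the whole graph
print(chain.stdout())
```

The immutability matters: because every `.with_*()` returns a fresh value, two pipelines can safely branch off the same base chain, and the engine can hash each node independently for caching.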

The Pipeline: Backend and Frontend

Here's what we'll write at .dagger/src/dagger_ci_demo/main.py, a single class with functions for both apps:

"""CI/CD pipeline for the dagger-ci-demo product."""

from typing import Annotated

import dagger
from dagger import Doc, dag, function, object_type


# @object_type marks this class as a Dagger module.
# Its public methods become callable pipeline functions.
@object_type
class DaggerCiDemo:
    """Pipeline for the Angular + FastAPI product."""

    # --- Backend (FastAPI) ---

    @function
    async def build_backend(
        self,
        source: Annotated[dagger.Directory, Doc("Backend source directory")],
    ) -> dagger.Container:
        """Build the FastAPI backend container.

        Returns a Container — the core Dagger type representing an OCI image.
        The caller can publish it, export it, or chain further operations on it.
        """
        return (
            dag.container()
            # Start from a base image — returns a Container
            .from_("python:3.12-slim")
            # Set the working directory inside the container
            .with_workdir("/app")
            # Copy only requirements.txt first — this layer is cached
            # as long as requirements.txt doesn't change, even if
            # source code changes. This is the Docker cache optimization
            # pattern, expressed as a function chain.
            .with_file("/app/requirements.txt", source.file("requirements.txt"))
            # Install dependencies — cached if requirements.txt unchanged
            .with_exec(["pip", "install", "--no-cache-dir", "-r", "requirements.txt"])
            # Copy application source — only this layer and below re-run
            # when source code changes
            .with_directory("/app/src", source.directory("src"))
            # Configure the container for Cloud Run
            .with_env_variable("PORT", "8080")
            .with_exposed_port(8080)
            .with_entrypoint([
                "uvicorn", "src.main:app",
                "--host", "0.0.0.0", "--port", "8080", "--workers", "2",
            ])
        )

    @function
    async def test_backend(
        self,
        source: Annotated[dagger.Directory, Doc("Backend source directory")],
    ) -> str:
        """Run the backend test suite.

        Returns a string (stdout) — a terminal type that triggers execution.
        Everything before .stdout() is lazily evaluated; Dagger only runs
        the container when it needs the output.
        """
        return await (
            dag.container()
            .from_("python:3.12-slim")
            .with_workdir("/app")
            # Mount the full source directory (tests need everything)
            .with_directory("/app", source)
            .with_exec(["pip", "install", "--no-cache-dir", "-r", "requirements.txt"])
            # Run pytest — if this command fails, the function raises an error
            .with_exec(["pytest", "-v", "tests/"])
            # .stdout() is the terminal operation: it triggers execution
            # of the entire chain and returns the command's output as a string
            .stdout()
        )

    # --- Frontend (Angular) ---

    @function
    async def build_frontend(
        self,
        source: Annotated[dagger.Directory, Doc("Frontend source directory")],
    ) -> dagger.Directory:
        """Build the Angular frontend for production.

        Returns a Directory — the build output (dist/) that can be
        deployed to Firebase Hosting, exported to the host filesystem,
        or passed to another function.
        """
        return (
            dag.container()
            .from_("node:20-slim")
            .with_workdir("/app")
            # Copy package files first for dependency caching
            .with_file("/app/package.json", source.file("package.json"))
            .with_file("/app/package-lock.json", source.file("package-lock.json"))
            # npm ci installs exact versions from lockfile — cached if
            # package files unchanged
            .with_exec(["npm", "ci"])
            # Copy the rest of the source
            .with_directory("/app/src", source.directory("src"))
            .with_file("/app/angular.json", source.file("angular.json"))
            .with_file("/app/tsconfig.json", source.file("tsconfig.json"))
            .with_file("/app/tsconfig.app.json", source.file("tsconfig.app.json"))
            # Build for production — Angular CLI outputs to dist/
            .with_exec(["npx", "ng", "build", "--configuration=production"])
            # Extract just the build output directory.
            # .directory() returns a Directory type — stripping away the
            # container and keeping only the filesystem artifact we need.
            .directory("/app/dist/angular-frontend/browser")
        )

    @function
    async def test_frontend(
        self,
        source: Annotated[dagger.Directory, Doc("Frontend source directory")],
    ) -> str:
        """Run the Angular test suite."""
        return await (
            dag.container()
            .from_("node:20-slim")
            .with_workdir("/app")
            .with_directory("/app", source)
            .with_exec(["npm", "ci"])
            .with_exec(["npx", "ng", "test"])
            .stdout()
        )

Let's unpack what's going on here.

The class DaggerCiDemo is a module, a collection of functions packaged together. The @object_type decorator registers it with the Dagger engine, and every @function method becomes a callable operation, both from the CLI and from other modules.

Each function takes a Directory as input, Dagger's type for a filesystem tree. When you pass --source=./backend from the CLI, Dagger packages that directory and hands it to the function as a typed Directory object. No string paths, no glob patterns, just a real, content-addressed filesystem reference.

The function bodies are chains of operations on Dagger types. dag.container().from_("python:3.12-slim") creates a Container from a base image. Each .with_*() call returns a new Container with the modification applied. Nothing executes yet. Dagger builds a directed acyclic graph (DAG) of operations. Execution only happens when a terminal operation like .stdout() or .publish() forces the engine to resolve the graph.

This is why caching works so well. Dagger hashes each node in the graph based on its inputs. If requirements.txt hasn't changed, the pip install step is a cache hit, instantly. If only a source file changed, only the layers after .with_directory("/app/src", ...) re-execute. Same content-addressed logic, on your laptop and in CI.

Notice how build_backend returns a Container while build_frontend returns a Directory. The return type determines what the caller can do next. A Container can be published to a registry, started as a service, or chained further. A Directory can be deployed to static hosting, exported, or mounted into another container. Types guide composition. They make it clear what flows where.

Run It Locally

With Dagger installed (see prerequisites above), let's explore the module from the repository root.

Discovering Functions

Before running anything, you can inspect what the module exposes:

dagger functions
Name             Description
build-backend    Build the FastAPI backend container.
build-frontend   Build the Angular frontend for production.
test-backend     Run the backend test suite.
test-frontend    Run the Angular test suite.

This is auto-generated. Dagger reads the @function decorators, their docstrings, and the Doc() annotations on parameters, and produces CLI documentation from them, in any SDK language. The same metadata that describes a function in Python generates the same CLI interface as it would in Go or TypeScript.

You can drill into any function with --help:

dagger call build-backend --help
Build the FastAPI backend container.

Returns a Container — the core Dagger type representing an OCI image.
The caller can publish it, export it, or chain further operations on it.

USAGE
  dagger call build-backend [arguments]

ARGUMENTS
  --source Directory   Backend source directory [required]

Everything here (the description, the argument name, the type, the [required] marker) comes directly from your code. The Doc("Backend source directory") annotation you wrote in the Annotated[dagger.Directory, Doc("...")] parameter becomes the argument's help text. The function's docstring becomes the command's description. There's no separate documentation to write or maintain. The code is the documentation, and Dagger keeps it in sync for both the CLI and the SDK.

This is what makes self-service work. A developer who has never seen this module before can run dagger functions, pick the operation they need, check its arguments with --help, and run it, without reading source code or asking the platform team.
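The mechanism behind this is ordinary Python introspection. Here's a simplified stand-in (not Dagger's real code — `Doc` below is a hypothetical placeholder for `dagger.Doc`) showing how `Annotated` metadata and docstrings are plain runtime objects a tool can read to render help text:

```python
import inspect
from dataclasses import dataclass
from typing import Annotated, get_args, get_type_hints

@dataclass
class Doc:  # stand-in for dagger.Doc
    text: str

def build_backend(
    source: Annotated[str, Doc("Backend source directory")],
) -> str:
    """Build the FastAPI backend container."""
    return source

def describe(fn) -> dict[str, str]:
    """Extract the help text a CLI could show for each parameter."""
    # include_extras=True keeps the Annotated metadata instead of
    # collapsing the hint down to its base type
    hints = get_type_hints(fn, include_extras=True)
    out = {}
    for name, hint in hints.items():
        if name == "return":
            continue
        for meta in get_args(hint)[1:]:  # everything after the base type
            if isinstance(meta, Doc):
                out[name] = meta.text
    return out

print(inspect.getdoc(build_backend))  # the command description
print(describe(build_backend))        # {'source': 'Backend source directory'}
```

Because the metadata lives in the code itself, the help text can never drift out of sync with the function signature.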

Running Functions

# Test the backend
dagger call test-backend --source=./backend

# Build the backend container
dagger call build-backend --source=./backend

# Test the frontend
dagger call test-frontend --source=./frontend

# Build the frontend (returns the dist/ directory)
dagger call build-frontend --source=./frontend

A few things to notice about the CLI:

  • dagger call is the universal entry point for running any function. It works the same way whether you're calling a local module or a remote one (dagger call -m github.com/...).
  • Function names are converted from Python's snake_case to CLI kebab-case automatically (build_backend → build-backend).
  • --source is not a built-in Dagger flag. It's the parameter we defined with Annotated[dagger.Directory, Doc("...")]. Dagger turns every function parameter into a CLI flag, typed accordingly. A dagger.Directory parameter accepts a local path; a dagger.Secret would accept env:MY_SECRET or file:./key.json.
  • Return types determine what the CLI shows. test_backend returns a str, so Dagger prints the stdout. build_backend returns a Container, so Dagger shows the image digest. build_frontend returns a Directory, which you can export with dagger call build-frontend --source=./frontend export --path=./dist.

Same functions, same commands, for both apps. A developer debugging a build failure doesn't push a commit and wait. They run the same command on their laptop, inspect the output, and fix the issue with the same tools they use for application code. CI/CD is part of the software factory.

From Laptop to CI: The Same Commands in GitHub Actions

We've been running everything locally. Now let's prove the claim: the same code runs in CI without any changes. Create .github/workflows/ci.yml:

name: CI

on:
  push:
    branches: [main]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v6

      - name: Test Backend
        uses: dagger/dagger-for-github@v8.4.1
        with:
          version: "0.20.3"
          verb: call
          args: test-backend --source=./backend

      - name: Test Frontend
        uses: dagger/dagger-for-github@v8.4.1
        with:
          version: "0.20.3"
          verb: call
          args: test-frontend --source=./frontend

That's it. No Dockerfile to maintain. No shell scripts gluing steps together. No CI-specific logic. The dagger/dagger-for-github action installs the Dagger CLI and engine, and runs the exact same dagger call commands we just ran locally. The pipeline code doesn't know, or care, where it's running.

This is the minimal, working CI setup. Push it, open a pull request, and watch the same tests you just ran on your laptop execute in GitHub Actions. Same functions, same caching, same results.


Why Now

Three trends make this approach timely:

Platform Engineering is maturing. The core promise of Platform Engineering is developer self-service: give teams golden paths so they can ship without waiting for specialists. CI/CD modules are the missing piece in that story: reusable, versioned, self-documenting abstractions that reduce cognitive load the same way a Terraform module does. When CI/CD is software, it fits naturally into the Internal Developer Platform.

AI agents need programmatic interfaces. When your pipeline is real code with typed functions, AI agents can interact with it through tool use. YAML pipelines are opaque to agents. Typed Dagger functions are not. (We'll explore this in Part 4.)

Multi-cloud is the norm. A pipeline that deploys to GCP today can deploy to AWS tomorrow with a module swap. No vendor lock-in at the CI layer. The platform team maintains the modules; developers maintain their code.


What's Next

We now have a working CI pipeline, real code running both locally and in GitHub Actions with the same commands. But this minimal setup has limitations: every job starts with a cold cache, builds run on GitHub's shared 2-vCPU runners, and there's no persistent state between runs.

In Part 2, we'll tackle the infrastructure side: caching strategies that cut build times by 5x, managed runners with persistent NVMe storage, and self-hosted Kubernetes runners with shared Dagger engines. Three approaches, each with different tradeoffs between simplicity and control.

The pipeline code won't change. Only where it runs will.


Up Next: Part 2: Decoupling Pipelines from Infrastructure


This is Part 1 of a 4-part series. Follow for updates.

Tags: #cicd #dagger #platform-engineering #python #angular
