CloudFlare D1 DevOps from .NET: Test Seeding, Data Sync, and Backups

TL;DR: The Cloudflare.NET SDK lets you manage D1 databases from C# for DevOps workflows: seed test data, sync reference tables, and back up to R2 - all without installing Node.js or Wrangler.


D1 is Cloudflare's serverless SQLite database, designed to run at the edge alongside Workers. The typical workflow involves the Wrangler CLI for database management - but Wrangler requires Node.js.

If you're a .NET shop, you probably don't want Node.js in your CI/CD pipelines just to run database migrations.

The Cloudflare.NET SDK provides a D1 client that lets you manage databases directly from C#. This article covers three practical DevOps scenarios where this is genuinely useful.


Why a Typed SDK?

A typed SDK gives you:

  • IntelliSense everywhere - Method signatures, parameter types, and return types are discoverable in your IDE
  • Compile-time safety - Catch typos and type mismatches before runtime
  • Comprehensive XML documentation - Every method, parameter, and model is documented inline
  • Online API reference - Searchable documentation with examples

Compare this to shelling out to wrangler d1 execute and parsing JSON output.
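For a taste of the difference, here's roughly what a query looks like through the typed client (cf, databaseId, and the Order model are placeholders here; client construction and configuration are covered later in the article):

var results = await cf.Accounts.D1.QueryAsync<Order>(databaseId,
    "SELECT id, total, status FROM orders WHERE status = ?",
    @params: ["pending"]);

// Strongly typed rows instead of hand-parsed CLI output
foreach (var order in results[0].Results)
{
    Console.WriteLine($"Order {order.Id}: {order.Total} ({order.Status})");
}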


When to Use the D1 HTTP API

Let's be clear about what the D1 HTTP API is designed for. From Cloudflare's documentation:

The D1 HTTP API is best suited for administrative use, such as running migrations, importing data, or managing database schema.

The HTTP API is not for real-time queries from your backend. Latency would be unacceptable. Your Workers should query D1 directly using the native bindings.

However, there are legitimate scenarios where calling D1 from outside Cloudflare makes sense:

  • Test data seeding - Runs once before tests, latency doesn't matter
  • Reference data sync - Scheduled job pushing authoritative data to the edge
  • Database backups - Periodic export, latency irrelevant
  • CI/CD migrations - Schema changes during deployment
  • Bulk data import - Initial load or periodic refresh

All of these are infrequent operations where the HTTP round-trip is acceptable.


Scenario 1: Test Data Seeding

The Problem

You have integration tests that verify your Workers interact correctly with D1. Before each test run, you need to:

  1. Reset the database to a known state
  2. Insert specific test fixtures
  3. Run tests against real D1 (not mocks)

With Wrangler, you'd shell out to npx wrangler d1 execute - which requires Node.js in your test environment.

The Solution

Seed D1 directly from your xUnit test setup using the D1 API:

public class OrdersWorkerTests : IAsyncLifetime
{
    private readonly ICloudflareApiClient _cf;
    private readonly string _databaseId;

    // HTTP client for the deployed Worker under test; the WORKER_BASE_URL
    // environment variable is an assumption for this example
    private readonly HttpClient _httpClient;

    public OrdersWorkerTests()
    {
        _cf = CreateCloudflareClient();
        _databaseId = Environment.GetEnvironmentVariable("D1_TEST_DATABASE_ID")!;
        _httpClient = new HttpClient
        {
            BaseAddress = new Uri(Environment.GetEnvironmentVariable("WORKER_BASE_URL")!)
        };
    }

    public async Task InitializeAsync()
    {
        // Reset to known state
        await _cf.Accounts.D1.QueryAsync(_databaseId, "DELETE FROM orders");
        await _cf.Accounts.D1.QueryAsync(_databaseId, "DELETE FROM customers");

        // Seed test fixtures
        await _cf.Accounts.D1.QueryAsync(_databaseId, """
            INSERT INTO customers (id, name, email) VALUES
            (1, 'Test Customer', 'test@example.com'),
            (2, 'Another Customer', 'another@example.com')
            """);

        await _cf.Accounts.D1.QueryAsync(_databaseId, """
            INSERT INTO orders (id, customer_id, total, status) VALUES
            (100, 1, 99.99, 'pending'),
            (101, 1, 149.99, 'shipped'),
            (102, 2, 29.99, 'delivered')
            """);
    }

    public Task DisposeAsync() => Task.CompletedTask;

    [Fact]
    public async Task GetPendingOrders_ReturnsCorrectCount()
    {
        // Call your Worker endpoint that queries D1
        var response = await _httpClient.GetAsync("/api/orders?status=pending");

        var orders = await response.Content.ReadFromJsonAsync<Order[]>();
        Assert.Single(orders);
        Assert.Equal(100, orders[0].Id);
    }
}
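The CreateCloudflareClient() helper isn't shown above. A minimal sketch, assuming the same AddCloudflareApiClient registration used later in this article and credentials supplied via the Cloudflare__ApiToken and Cloudflare__AccountId environment variables (requires the Microsoft.Extensions.Configuration and Microsoft.Extensions.DependencyInjection packages):

private static ICloudflareApiClient CreateCloudflareClient()
{
    // Build configuration from environment variables so CI can inject
    // Cloudflare__ApiToken and Cloudflare__AccountId as secrets
    var configuration = new ConfigurationBuilder()
        .AddEnvironmentVariables()
        .Build();

    // Reuse the SDK's DI extension so tests construct the client the same
    // way the application does
    var services = new ServiceCollection();
    services.AddCloudflareApiClient(configuration);

    return services.BuildServiceProvider()
        .GetRequiredService<ICloudflareApiClient>();
}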

Parameterized Queries

For dynamic test data, use parameterized queries to prevent SQL injection:

await _cf.Accounts.D1.QueryAsync(_databaseId,
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    @params: [customerId, orderTotal]);

Typed Query Results

Query and deserialize results directly:

var results = await _cf.Accounts.D1.QueryAsync<Order>(_databaseId,
    "SELECT id, customer_id, total, status FROM orders WHERE status = ?",
    @params: ["pending"]);

foreach (var order in results[0].Results)
{
    Console.WriteLine($"Order {order.Id}: ${order.Total}");
}
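The Order model used in these tests isn't shown in the article. Here's a minimal sketch matching the queried columns; how the SDK maps snake_case columns to properties is an assumption, and the System.Text.Json attributes are used purely for illustration:

using System.Text.Json.Serialization;

// Hypothetical model for the orders table used in these examples
public record Order
{
    [JsonPropertyName("id")]
    public long Id { get; init; }

    [JsonPropertyName("customer_id")]
    public long CustomerId { get; init; }

    [JsonPropertyName("total")]
    public decimal Total { get; init; }

    [JsonPropertyName("status")]
    public string Status { get; init; } = "";
}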

Scenario 2: Reference Data Synchronization

The Problem

Your D1 database contains lookup tables that are mastered in your .NET backend's SQL Server:

  • Countries and regions
  • Product categories
  • Tax rates by jurisdiction
  • Feature flags

You need to periodically sync this authoritative data from your backend to D1. The data flows from your backend to the edge - the opposite direction of a typical read query.

The Solution

A scheduled job that pushes reference data to D1 using parameterized queries:

public class D1SyncService(
    ICloudflareApiClient cloudflare,
    IDbConnection sqlServer,
    ILogger<D1SyncService> logger)
{
    private readonly string _d1DatabaseId = "your-d1-database-id";

    public async Task SyncCountriesAsync()
    {
        // Fetch authoritative data from SQL Server (Dapper extension method)
        var countries = (await sqlServer.QueryAsync<Country>(
            "SELECT code, name, tax_rate FROM countries WHERE active = 1")).ToList();

        logger.LogInformation("Syncing {Count} countries to D1", countries.Count);

        // Clear and repopulate D1
        await cloudflare.Accounts.D1.QueryAsync(_d1DatabaseId,
            "DELETE FROM countries");

        foreach (var batch in countries.Chunk(100))
        {
            // Build one multi-row INSERT with placeholders so values are
            // bound as parameters instead of concatenated into the SQL
            var placeholders = string.Join(",", batch.Select(_ => "(?, ?, ?)"));
            var values = batch
                .SelectMany(c => new object[] { c.Code, c.Name, c.TaxRate })
                .ToArray();

            await cloudflare.Accounts.D1.QueryAsync(_d1DatabaseId,
                $"INSERT INTO countries (code, name, tax_rate) VALUES {placeholders}",
                @params: values);
        }

        logger.LogInformation("Country sync complete");
    }
}

Register as a Hosted Service

Run the sync on a schedule:

public class D1SyncBackgroundService(
    D1SyncService syncService,
    ILogger<D1SyncBackgroundService> logger) : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            try
            {
                await syncService.SyncCountriesAsync();
                await syncService.SyncCategoriesAsync();
                await syncService.SyncFeatureFlagsAsync();
            }
            catch (Exception ex)
            {
                logger.LogError(ex, "D1 sync failed");
            }

            // Run every hour
            await Task.Delay(TimeSpan.FromHours(1), stoppingToken);
        }
    }
}

Scenario 3: Backup D1 to R2

The Problem

You want automated backups of your D1 databases stored in R2, using your existing .NET infrastructure. No Node.js required.

For a deep dive on the R2 client, see Cloudflare R2 in .NET Without the AWS SDK Headaches.

The Solution

Export D1 using the export API and upload to R2:

public class D1BackupService(
    ICloudflareApiClient cloudflare,
    IR2Client r2,
    ILogger<D1BackupService> logger)
{
    public async Task BackupDatabaseAsync(string databaseId, string databaseName)
    {
        logger.LogInformation("Starting backup of {Database}", databaseName);

        // Start the export
        var exportResult = await cloudflare.Accounts.D1.StartExportAsync(databaseId);

        // Poll until complete
        D1ExportResponse status;
        do
        {
            await Task.Delay(TimeSpan.FromSeconds(2));
            status = await cloudflare.Accounts.D1.PollExportAsync(
                databaseId,
                exportResult.AtBookmark!);
        }
        while (status.Status == "active");

        if (status.Status != "complete")
        {
            throw new InvalidOperationException($"Export failed: {status.Error}");
        }

        // Download the SQL dump
        using var httpClient = new HttpClient();
        var sqlDump = await httpClient.GetByteArrayAsync(status.Result!.SignedUrl!);

        // Upload to R2 with timestamp
        var backupKey = $"backups/{databaseName}/{DateTime.UtcNow:yyyy-MM-dd-HHmmss}.sql";

        using var stream = new MemoryStream(sqlDump);
        await r2.UploadAsync("db-backups", backupKey, stream);

        logger.LogInformation(
            "Backup complete: {Key} ({Size:N0} bytes)",
            backupKey,
            sqlDump.Length);
    }
}

Backup All Databases

Enumerate and back up all D1 databases in your account:

public async Task BackupAllDatabasesAsync()
{
    await foreach (var database in cloudflare.Accounts.D1.ListAllAsync())
    {
        try
        {
            await BackupDatabaseAsync(database.Uuid, database.Name);
        }
        catch (Exception ex)
        {
            logger.LogError(ex, "Failed to backup {Database}", database.Name);
        }
    }
}

Retention Policy

Combine with R2 lifecycle rules to automatically expire old backups:

// Set retention policy on the backup bucket
await cloudflare.Accounts.Buckets.SetLifecycleAsync("db-backups", new BucketLifecyclePolicy(
    Rules:
    [
        new LifecycleRule(
            Id: "expire-old-backups",
            Enabled: true,
            DeleteObjectsTransition: new DeleteObjectsTransition(
                Condition: LifecycleCondition.AfterDays(30)
            )
        )
    ]
));

Installation

dotnet add package Cloudflare.NET.Api
dotnet add package Cloudflare.NET.R2  # Only needed for R2 backup scenario

Configuration

See the Getting Started guide for full setup instructions.

appsettings.json

{
  "Cloudflare": {
    "ApiToken": "your-api-token",
    "AccountId": "your-account-id"
  },
  "R2": {
    "AccessKeyId": "your-r2-access-key",
    "SecretAccessKey": "your-r2-secret-key"
  }
}

Required Permissions

Your API token needs the D1 permission at Account level:

  • List/Get databases - D1: Read
  • Create/Delete databases - D1: Write
  • Execute queries - D1: Write
  • Export/Import - D1: Write

Dependency Injection Setup

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddCloudflareApiClient(builder.Configuration);
builder.Services.AddCloudflareR2Client(builder.Configuration); // For backup scenario

builder.Services.AddScoped<D1SyncService>();
builder.Services.AddScoped<D1BackupService>(); // For backup scenario
builder.Services.AddHostedService<D1SyncBackgroundService>();

CI/CD Integration

GitHub Actions Example

Run D1 migrations from your .NET CI pipeline:

name: Deploy

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '9.0.x'

      - name: Run D1 Migrations
        env:
          Cloudflare__ApiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
          Cloudflare__AccountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
        run: dotnet run --project tools/D1Migrations

      # No Node.js required!

Migration Tool Example

A simple console app for running migrations:

var builder = Host.CreateApplicationBuilder(args);
builder.Services.AddCloudflareApiClient(builder.Configuration);

var host = builder.Build();
var cf = host.Services.GetRequiredService<ICloudflareApiClient>();

var databaseId = Environment.GetEnvironmentVariable("D1_DATABASE_ID")!;

// Run migrations in order
var migrations = Directory.GetFiles("migrations", "*.sql").Order();

foreach (var migration in migrations)
{
    Console.WriteLine($"Running {Path.GetFileName(migration)}...");
    var sql = await File.ReadAllTextAsync(migration);
    await cf.Accounts.D1.QueryAsync(databaseId, sql);
}

Console.WriteLine("Migrations complete.");
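As written, the loop re-runs every script on each deploy. A common refinement - sketched here with a hypothetical schema_migrations table, and assuming the SDK maps the name column to the record's Name property - is to record applied file names and skip them on subsequent runs:

// Hypothetical bookkeeping table; name and shape are assumptions
await cf.Accounts.D1.QueryAsync(databaseId,
    "CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)");

var appliedRows = await cf.Accounts.D1.QueryAsync<AppliedMigration>(databaseId,
    "SELECT name FROM schema_migrations");
var applied = appliedRows[0].Results.Select(m => m.Name).ToHashSet();

foreach (var migration in Directory.GetFiles("migrations", "*.sql").Order())
{
    var name = Path.GetFileName(migration);
    if (applied.Contains(name))
    {
        continue; // already applied on a previous deploy
    }

    Console.WriteLine($"Running {name}...");
    await cf.Accounts.D1.QueryAsync(databaseId, await File.ReadAllTextAsync(migration));

    // Record the migration so it is skipped next time
    await cf.Accounts.D1.QueryAsync(databaseId,
        "INSERT INTO schema_migrations (name) VALUES (?)",
        @params: [name]);
}

public record AppliedMigration(string Name);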

Error Handling

The SDK provides structured exceptions for D1 operations:

try
{
    await cf.Accounts.D1.QueryAsync(databaseId, "SELECT * FROM nonexistent");
}
catch (CloudflareApiException ex) when (ex.Errors.Any(e => e.Code == 7500))
{
    // D1 query error
    Console.WriteLine($"Query failed: {ex.Errors.First().Message}");
}
catch (CloudflareApiException ex)
{
    // Other API errors
    foreach (var error in ex.Errors)
    {
        Console.WriteLine($"[{error.Code}] {error.Message}");
    }
}

Summary

The D1 HTTP API isn't for real-time queries - use native Worker bindings for that. But for DevOps workflows, the Cloudflare.NET SDK provides a clean way to manage D1 from C#:

  • Test Data Seeding - Reset and populate D1 from xUnit setup
  • Reference Data Sync - Push authoritative data from your backend to the edge
  • Automated Backups - Export D1 to R2 without Node.js
  • CI/CD Migrations - Run schema changes from .NET pipelines

The key benefit: no Node.js or Wrangler required. If your team is .NET-first, you can manage D1 using your existing toolchain.

Get started:

dotnet add package Cloudflare.NET.Api


Cloudflare.NET is an open-source, community-maintained SDK. Contributions are welcome!
