<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Jerrett Davis</title>
    <description>The latest articles on DEV Community by Jerrett Davis (@jerrettdavis).</description>
    <link>https://dev.to/jerrettdavis</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1277980%2F199adce9-2e22-4af1-b425-49361c94cb56.png</url>
      <title>DEV Community: Jerrett Davis</title>
      <link>https://dev.to/jerrettdavis</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/jerrettdavis"/>
    <language>en</language>
    <item>
      <title>Announcing TinyBDD: Fluent, Executable Scenarios for .NET</title>
      <dc:creator>Jerrett Davis</dc:creator>
      <pubDate>Sat, 27 Dec 2025 19:29:16 +0000</pubDate>
      <link>https://dev.to/jerrettdavis/announcing-tinybdd-fluent-executable-scenarios-for-net-21k4</link>
      <guid>https://dev.to/jerrettdavis/announcing-tinybdd-fluent-executable-scenarios-for-net-21k4</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;This article was originally published on &lt;a href="https://jerrettdavis.com/blog/posts/tinybdd" rel="noopener noreferrer"&gt;my blog&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h1&gt;
  
  
  Announcing TinyBDD: Fluent, Executable Scenarios for .NET
&lt;/h1&gt;

&lt;p&gt;👉 &lt;a href="https://github.com/jerrettdavis/tinybdd" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; · &lt;a href="https://www.nuget.org/packages/TinyBDD" rel="noopener noreferrer"&gt;NuGet&lt;/a&gt; · &lt;a href="https://jerrettdavis.github.io/TinyBDD/" rel="noopener noreferrer"&gt;Docs&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;What if the shortest path from "we need this" to "it works in prod" was just a single fluent line of code?&lt;/p&gt;

&lt;p&gt;TinyBDD is my attempt to make that path real. It's a lightweight .NET library that lets you write tests in a fluent, Gherkin-ish style—tests that read like acceptance criteria but execute like unit tests. The goal is not ceremony, but clarity: a shared, human-parsable DSL that can span from domain rules to browser automation without losing intent.  &lt;/p&gt;

&lt;p&gt;This post is the practical follow-up to my earlier &lt;a href="https://jerrettdavis.com/blog/posts/making-the-business-write-your-tests-with-bdd" rel="noopener noreferrer"&gt;essay on BDD&lt;/a&gt;. There, I dug into the &lt;em&gt;why&lt;/em&gt;. Here, we'll focus on the &lt;em&gt;how&lt;/em&gt;: how to go from acceptance criteria to running tests in minutes, how to use Given/When/Then to model even the smallest units, how to orchestrate full end-to-end flows with Playwright, and how writing this way naturally nudges your architecture toward SOLID and composable design.&lt;/p&gt;




&lt;h2&gt;
  
  
  From acceptance criteria to running tests in minutes
&lt;/h2&gt;

&lt;p&gt;Every team has seen a story like this written in Jira or Confluence:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
Scenario: Gold member gets free shipping
Given the customer is a "gold" member
And they have a cart totaling $12.00
When they checkout with standard shipping
Then the shipping cost is $0.00
And the order total is $12.00

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With TinyBDD, you don't need separate &lt;code&gt;.feature&lt;/code&gt; files unless you want them. You can capture that same intent directly in your test framework, keeping the semantics without the tooling overhead:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;Given&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"a gold customer with a $12 cart"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;Cart&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Customer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Gold&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="m"&gt;12.00m&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
     &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;When&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"they choose standard shipping"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cart&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;cart&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Checkout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Shipping&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Standard&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
     &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"shipping is free"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;order&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;order&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ShippingTotal&lt;/span&gt; &lt;span class="p"&gt;==&lt;/span&gt; &lt;span class="m"&gt;0.00m&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;             &lt;span class="c1"&gt;// Pass/Fail with booleans&lt;/span&gt;
     &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;And&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"order total is $12.00"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;order&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;Expect&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;For&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;order&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Total&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;ToBe&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="m"&gt;12.00m&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="c1"&gt;// Or Assertions&lt;/span&gt;
     &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AssertPassed&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;The keywords map one-to-one with the business story. Each step is explicit and composable, and the whole chain is easy to read—even for someone outside the dev team. Because the language matches what stakeholders already use, the test itself becomes a living contract.&lt;/p&gt;




&lt;h2&gt;
  
  
  Unit tests that read like behavior
&lt;/h2&gt;

&lt;p&gt;Behavior-driven style isn't just for top-level acceptance tests. It works equally well for small pieces of logic: a pure function, a discount rule, a transformer. By expressing them as Given/When/Then, you get readability—tiny scenarios that explain the intent before diving into implementation detail—and design pressure, because the format gently encourages pure, composable functions.&lt;/p&gt;

&lt;p&gt;Example: a simple discount calculation.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;await Given("a silver customer with $100 cart", () =&amp;gt; (Tier: "silver", Amount: 100m))
     .When("discount is applied", x =&amp;gt; Discounts.Apply(x.Tier, x.Amount))
     .Then("result is $95", result =&amp;gt; result == 95m)
     .AssertPassed();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Even at this scale, the benefits are obvious. You isolate the decision logic, assert outcomes in plain language, and end up with code that composes neatly into bigger flows later.&lt;/p&gt;




&lt;h2&gt;
  
  
  End-to-end UI tests with Playwright
&lt;/h2&gt;

&lt;p&gt;TinyBDD also works at the other end of the spectrum: full-stack, end-to-end tests. Here, the key is keeping steps thin and expressive while pushing implementation detail into helpers like page objects or service wrappers. That way, the scenario text stays stable even if the UI shifts underneath.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;await Given("a new browser and logged-in gold user", async () =&amp;gt;
{
    var pw = await PlaywrightFactory.LaunchAsync();
    var page = await pw.NewPageAsync();
    await AuthSteps.LoginAsGoldAsync(page);
    return page;
})
.When("user adds a $12 item to cart", async page =&amp;gt;
{
    await CatalogSteps.AddItemAsync(page, "SKU-123", 12.00m);
    return page;
})
.And("proceeds to checkout with standard shipping", CheckoutSteps.StandardAsync)
.Then("shipping is free", async page =&amp;gt;
{
    var shipping = await CartSteps.ReadShippingAsync(page);
    return shipping == 0.00m;
})
.And("order total is $12.00", async page =&amp;gt;
{
    var total = await CartSteps.ReadTotalAsync(page);
    return total == 12.00m;
})
.AssertPassed();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Scenarios like this are readable enough for a stakeholder to skim, while still giving engineers the control they need under the hood. Stable wording, deterministic helpers, and tagging (&lt;code&gt;smoke&lt;/code&gt;, &lt;code&gt;ui&lt;/code&gt;, &lt;code&gt;checkout&lt;/code&gt;) all contribute to making suites like this maintainable in real CI pipelines.&lt;/p&gt;




&lt;h3&gt;
  
  
  Patterns that keep this maintainable
&lt;/h3&gt;

&lt;p&gt;The trick to making end-to-end scenarios sustainable is resisting the temptation to let your steps do all the heavy lifting. The step chain should stay thin and intention-revealing, while the real mechanics live in helpers: page objects, domain services, or test utilities. This keeps the scenario text stable even as the implementation evolves. A good rule of thumb is that a non-technical stakeholder should be able to scan the steps and nod along without ever seeing the helper code. Deterministic helpers—free from hidden global state—are key to repeatable results. And once you have a handful of scenarios, you'll want to tag them (&lt;code&gt;smoke&lt;/code&gt;, &lt;code&gt;ui&lt;/code&gt;, &lt;code&gt;checkout&lt;/code&gt;, etc.) so that CI pipelines can run fast slices for quick feedback and broader sweeps when confidence matters most.&lt;/p&gt;
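&lt;p&gt;To make the "thin steps, fat helpers" split concrete, here's a minimal sketch. The helper shapes below are hypothetical (they stand in for whatever page objects or services your suite uses); the point is that each helper is deterministic and owns the mechanics, so the step text never has to change when the implementation does:&lt;/p&gt;

```csharp
using System;
using System.Linq;

// Hypothetical helper layer: the scenario names intent in one line,
// and these methods absorb implementation churn underneath it.
public static class CheckoutSteps
{
    // Deterministic: same cart and tier always yield the same shipping,
    // with no hidden global state to make runs flaky.
    public static decimal ShippingFor(decimal[] itemPrices, string tier) =>
        tier == "gold" ? 0.00m : 4.99m;

    // Composes the other helper instead of duplicating the rule.
    public static decimal TotalFor(decimal[] itemPrices, string tier) =>
        itemPrices.Sum() + ShippingFor(itemPrices, tier);
}
```

&lt;p&gt;A step then reads &lt;code&gt;.Then("shipping is free", cart =&amp;gt; CheckoutSteps.ShippingFor(cart.Items, cart.Tier) == 0.00m)&lt;/code&gt;, and a UI rewrite only touches the helper body.&lt;/p&gt;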




&lt;h3&gt;
  
  
  Let tests guide your design
&lt;/h3&gt;

&lt;p&gt;When you write tests in a behavior-first style, architectural friction surfaces quickly. A step that requires half a dozen parameters is rarely a coincidence—it usually means your modules are too tightly coupled. Repeating the same tedious setup across multiple scenarios suggests the absence of a proper abstraction. And if you struggle to phrase a step cleanly, the problem may not be the test at all, but the clarity of your domain language.&lt;/p&gt;

&lt;p&gt;These moments of friction are signals. Often, the fix is to extract a pure function from a messy edge, create a port or adapter to decouple infrastructure from business rules, or split a workflow into smaller seams that deserve their own scenarios. In other words: the pressure you feel in writing the test is your design telling you what it wants to become.&lt;/p&gt;
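&lt;p&gt;As a sketch of that refactor (all names here are hypothetical, invented for illustration): a rule that was tangled up with the system clock becomes a pure function plus a port, and a fixed-clock adapter makes the Given step trivial to set up:&lt;/p&gt;

```csharp
using System;

// The port: the messy edge (reading "now") is decoupled from the rule.
public interface IClock
{
    DateTime UtcNow { get; }
}

public static class ShippingRules
{
    // The extracted pure function: callable from a When step with any instant.
    // Orders placed before 14:00 UTC qualify for same-day shipping.
    public static bool QualifiesForSameDay(DateTime nowUtc) =>
        !(nowUtc.Hour >= 14);
}

// The test adapter: deterministic, so scenarios never depend on wall time.
public sealed class FixedClock : IClock
{
    private readonly DateTime _now;
    public FixedClock(DateTime now) => _now = now;
    public DateTime UtcNow => _now;
}
```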




&lt;h3&gt;
  
  
  BDD and the drift toward SOLID and functional design
&lt;/h3&gt;

&lt;p&gt;Consistently writing scenarios has a shaping effect on code. Steps that do one clear thing align with the Single Responsibility Principle. The ability to add new scenarios without editing existing ones echoes the Open/Closed Principle. And abstractions that are narrow, well-defined, and swappable make substituting fakes and stubs trivial, pushing you toward Liskov, ISP, and DIP almost by default.&lt;/p&gt;

&lt;p&gt;The same is true for functional composition. Pure functions naturally slide into Given/When/Then flows. Side effects are easiest to reason about when pushed to the edges—fetching in a Given, transforming in a When, and observing in a Then. And when steps are small and named, they read like a pipeline instead of a mess of conditionals. By following the test style, you often find yourself following the design style too.&lt;/p&gt;
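&lt;p&gt;A tiny illustration of that pipeline shape, with hypothetical names: each stage is a pure function, so the stages compose in the same order the scenario reads:&lt;/p&gt;

```csharp
using System;

// Hypothetical pure stages mirroring Given/When/Then: produce input at
// the edge, transform in the middle, observe at the end.
public static class PricingPipeline
{
    // "Given": in a real test, the only (thin) impure edge.
    public static decimal Subtotal() => 120m;

    // "When": pure transformation, trivially composable.
    public static decimal ApplyGoldDiscount(decimal subtotal) => subtotal * 0.90m;

    // "Then": pure observation of the outcome.
    public static bool IsExpected(decimal price) => price == 108m;
}
```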




&lt;h3&gt;
  
  
  A practical way to start tomorrow
&lt;/h3&gt;

&lt;p&gt;If this feels overwhelming, don't boil the ocean. Start with one slice of functionality that everyone values and recognizes—maybe a login path or a simple checkout. Write just two or three scenarios, and make sure the wording mirrors how the business describes the flow. Delegate the mechanics to helpers, not the scenario text. Keep your domain logic in pure functions wherever possible so it's trivial to call from a &lt;code&gt;When&lt;/code&gt; step. And once you've got a couple of green runs, wire in some tags so you can choose between smoke tests, integration runs, or the full suite depending on your CI needs.&lt;/p&gt;

&lt;p&gt;As you go, pay attention to the words. If step text feels clumsy, it probably means your ubiquitous language is clumsy too. Refining that wording in collaboration with stakeholders isn't overhead—it's the work. And when naming friction crops up, it's often a smell that your design needs another seam or abstraction.&lt;/p&gt;




&lt;h3&gt;
  
  
  Fluent examples you can copy-paste
&lt;/h3&gt;

&lt;p&gt;Unit-level:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;await Given("subtotal is $120 and tier is gold", () =&amp;gt; (Subtotal: 120m, Tier: "gold"))
     .When("finalize price", x =&amp;gt; Pricing.Finalize(x.Tier, x.Subtotal))
     .Then("applies 10% discount", price =&amp;gt; price == 108m)
     .AssertPassed();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;API-level:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;await Given("a seeded test tenant", TestData.SeedTenantAsync)
     .When("posting to /invoices", async _ =&amp;gt; await Api.PostAsync("/invoices", new { amount = 250 }))
     .Then("returns 201", r =&amp;gt; r.StatusCode == 201)
     .And("body contains invoice id", r =&amp;gt; r.Json.Value&amp;lt;string&amp;gt;("id") is { Length: &amp;gt; 0 })
     .AssertPassed();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;UI-level:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;await Given("a logged-in admin", BrowserSteps.LoginAsAdminAsync)
     .When("they create a user named Dana", page =&amp;gt; AdminUsers.CreateAsync(page, "Dana"))
     .Then("Dana appears in the grid", page =&amp;gt; AdminUsers.ExistsAsync(page, "Dana"))
     .AssertPassed();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;




&lt;h3&gt;
  
  
  Readable Gherkin-style output
&lt;/h3&gt;

&lt;p&gt;One of the nicest touches in TinyBDD is how your scenarios &lt;em&gt;report themselves&lt;/em&gt; when you run the tests. Pair your scenarios with the appropriate base class (&lt;code&gt;TinyBddXunitBase&lt;/code&gt;, &lt;code&gt;TinyBddXunitV3Base&lt;/code&gt;, &lt;code&gt;TinyBddNUnitBase&lt;/code&gt;, or &lt;code&gt;TinyBddMSTestBase&lt;/code&gt;), and the test runner will print structured Gherkin output alongside normal results.  &lt;/p&gt;

&lt;p&gt;That means the Given/When/Then flow you wrote doesn't just execute—it shows up exactly as you'd expect, step by step, with timings and pass/fail indicators. It turns your test logs into living specifications.&lt;/p&gt;

&lt;p&gt;For example, here's the output from a mediator scenario:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Feature: Behavioral - Mediator (commands, notifications, streaming, behaviors)
Scenario: Send: command handler runs through behaviors and returns value
Given a mediator with pre/post/whole behaviors and a Ping-&amp;gt;Pong handler [OK] 2 ms
When sending Ping(5) [OK] 4 ms
Then result is pong:5 [OK] 2 ms
And behaviors logged pre, whole before/after, and post [OK] 0 ms
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Instead of squinting at assertions in code, you see a natural-language story of what happened. That's invaluable when sharing results with stakeholders or debugging failures in CI. And because the feature and scenario titles come from your test class and attributes, the logs stay consistent with the language you use in code reviews, planning, and conversations.&lt;/p&gt;




&lt;h3&gt;
  
  
  Avoiding common anti-patterns
&lt;/h3&gt;

&lt;p&gt;Every test framework accumulates bad habits if left unchecked, and TinyBDD is no exception. The most obvious trap is clever wording: steps like "When magic happens" don't help anyone and fail to serve as documentation. Instead, the wording should describe an intention that a stakeholder would immediately recognize, such as "When the admin disables the account". Another trap is letting a single step conceal multiple actions or checks. Keep your flow honest: &lt;code&gt;When&lt;/code&gt; should drive effects, and &lt;code&gt;Then&lt;/code&gt; or &lt;code&gt;And&lt;/code&gt; should assert results.&lt;/p&gt;

&lt;p&gt;Setup is another danger zone. If your &lt;code&gt;Given&lt;/code&gt; steps are littered with manual wiring of objects, it's time for factories or builders to take over. And at the UI layer, brittle selectors quickly make tests flaky; encapsulating them in page objects and using explicit test IDs pays off many times over. Avoiding these pitfalls keeps your suite readable, stable, and genuinely valuable.&lt;/p&gt;
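&lt;p&gt;One way to tame that wiring, sketched here with hypothetical names, is a small test-data builder, so a &lt;code&gt;Given&lt;/code&gt; reads as a single expressive call instead of a page of object construction:&lt;/p&gt;

```csharp
using System;

// Hypothetical builder: defaults are sensible, and each method names a
// business-meaningful variation rather than a constructor argument.
public sealed class CustomerBuilder
{
    private string _tier = "standard";
    private decimal _cartTotal = 0m;

    public CustomerBuilder AsGold() { _tier = "gold"; return this; }
    public CustomerBuilder WithCartTotal(decimal total) { _cartTotal = total; return this; }

    // A Given step can now read:
    //   Given("a gold customer with a $12 cart",
    //         () => new CustomerBuilder().AsGold().WithCartTotal(12.00m).Build())
    public (string Tier, decimal CartTotal) Build() => (_tier, _cartTotal);
}
```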




&lt;h3&gt;
  
  
  The payoff
&lt;/h3&gt;

&lt;p&gt;When scenarios read like the business and execute like the code, something special happens. Your tests stop being just a safety net and start becoming living documentation. They never go stale because they're executable. They provide immediate feedback on drift, so change becomes safer. They subtly nudge your codebase toward SOLID principles and functional seams. And for new developers, they become the best possible onboarding guide: open the suite, read the stories, and understand how the system behaves.&lt;/p&gt;

&lt;p&gt;You don't need to retool your world to reach this point. Start with one scenario. Make it pass. Share it with your team. Repeat. In a few sprints, you'll have a suite of stories that stack from units to workflows, and a codebase that's easier to evolve because the behaviors are crystal clear.&lt;/p&gt;




&lt;h3&gt;
  
  
  Appendix — A quick PR lens
&lt;/h3&gt;

&lt;p&gt;As you review changes, ask yourself: does this PR add or update scenarios that the business would recognize? Do the steps read like natural English, each mapping to a single intent? Are the domain rules isolated in pure functions rather than tangled in infrastructure? Did we create or clarify a port instead of hard-coding dependencies? Can we tag and run this slice of scenarios independently in CI?&lt;/p&gt;

&lt;p&gt;If you can answer "yes" to most of those, you're not just writing tests—you're building shared understanding, guiding design, and accelerating delivery. That's the real promise of TinyBDD.&lt;/p&gt;




&lt;p&gt;👉 &lt;a href="https://github.com/jerrettdavis/tinybdd" rel="noopener noreferrer"&gt;Get TinyBDD on GitHub&lt;/a&gt; · &lt;a href="https://www.nuget.org/packages/TinyBDD" rel="noopener noreferrer"&gt;NuGet&lt;/a&gt; · &lt;a href="https://jerrettdavis.github.io/TinyBDD/" rel="noopener noreferrer"&gt;Docs&lt;/a&gt;&lt;/p&gt;

</description>
      <category>bdd</category>
      <category>tdd</category>
      <category>testing</category>
      <category>architecture</category>
    </item>
    <item>
      <title>JD.Efcpt.Build: Build‑Time EF Core Scaffolding to Keep Database‑First Models in Sync</title>
      <dc:creator>Jerrett Davis</dc:creator>
      <pubDate>Mon, 22 Dec 2025 05:43:45 +0000</pubDate>
      <link>https://dev.to/jerrettdavis/jdefcptbuild-build-time-ef-core-scaffolding-to-keep-database-first-models-in-sync-5b5n</link>
      <guid>https://dev.to/jerrettdavis/jdefcptbuild-build-time-ef-core-scaffolding-to-keep-database-first-models-in-sync-5b5n</guid>
      <description>&lt;h1&gt;
  
  
  "Where Did Database First Go?"
&lt;/h1&gt;

&lt;p&gt;If you were using Entity Framework when EF Core first dropped, you probably remember the moment you went looking for database-first support and found... nothing.&lt;/p&gt;

&lt;p&gt;EF Core launched as a code-first framework. The Reverse Engineer tooling that EF6 developers relied on (the right-click, point at a database, generate your models workflow) wasn't there. Microsoft's position was essentially "migrations are the future, figure it out." And if your team had an existing database, or a DBA who actually owned the schema, or compliance requirements that meant the database was the source of truth... well, good luck with that.&lt;/p&gt;

&lt;p&gt;The community's response was immediate and loud. "Where did database first go?" became a recurring theme in GitHub issues, Stack Overflow questions, and the quiet frustration of developers who just wanted to talk to their database without hand-writing a hundred entity classes.&lt;/p&gt;

&lt;p&gt;Eventually, tooling caught up. EF Core Power Tools emerged as the community answer: a Visual Studio extension that brought back the reverse engineering workflow. You could point it at a database or a DACPAC, configure some options, and generate your models. Problem solved, mostly.&lt;/p&gt;

&lt;p&gt;But here's the thing about manual processes: they work fine right up until they don't.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Problem That Keeps Happening
&lt;/h2&gt;

&lt;p&gt;I've spent enough time in codebases with legacy data layers to recognize a pattern. It goes something like this:&lt;/p&gt;

&lt;p&gt;A project starts with good intentions. Someone sets up EF Core Power Tools, generates the initial models, commits everything, and documents the process. "When the schema changes, regenerate the models using this tool with these settings." Clear enough.&lt;/p&gt;

&lt;p&gt;Then time passes.&lt;/p&gt;

&lt;p&gt;The developer who set it up leaves. The documentation gets stale. Someone regenerates with slightly different settings and commits the result. Someone else forgets to regenerate entirely after a schema change. The models drift. The configuration drifts. Nobody's quite sure what the "correct" regeneration process is anymore, so people just... stop doing it consistently.&lt;/p&gt;

&lt;p&gt;This isn't a dramatic failure. It's a slow erosion. The kind of problem that doesn't announce itself until you're debugging a production issue and realize the entity class doesn't have a column that's been in the database for six months.&lt;/p&gt;

&lt;p&gt;If you've worked in a codebase long enough, you've probably seen some version of this. Maybe you've been the person who discovered the drift. Maybe you've been the person who caused it. (No judgment. We've all been there.)&lt;/p&gt;

&lt;p&gt;The frustrating part is that the fix is always the same: regenerate the models, commit the changes, remind everyone to regenerate after schema changes. And then six months later, you're having the same conversation again.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Manual Regeneration Fails
&lt;/h2&gt;

&lt;p&gt;Let's be specific about what goes wrong, because understanding the failure modes is the first step toward fixing them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The ownership problem.&lt;/strong&gt; Whose job is it to regenerate after a schema change? The person who changed the schema? The person who owns the data layer? The tech lead? Nobody has a clear answer, which means sometimes everyone does it (chaos) and sometimes nobody does it (drift).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The configuration problem.&lt;/strong&gt; EF Core Power Tools stores settings in JSON files. Namespaces, nullable reference types, navigation property generation, renaming rules. There are dozens of options. If developers regenerate with different configurations, you get inconsistent output. Same database, different generated code.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The tooling problem.&lt;/strong&gt; Regeneration requires Visual Studio with the extension installed. CI servers don't have Visual Studio. New developers might not have the extension. Remote development setups might not support it. The process that works on one machine doesn't necessarily work on another.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The noise problem.&lt;/strong&gt; Regeneration often produces massive diffs. Property reordering, whitespace changes, attribute additions. Stuff that doesn't represent actual schema changes but clutters up the commit. Developers learn to distrust regeneration diffs, which makes them reluctant to regenerate, which makes the problem worse.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The timing problem.&lt;/strong&gt; Even when everyone knows the process, there's no enforcement. You can commit code that references a column the models don't have, and the build might still pass if nothing actually uses that code path yet. The error surfaces later, in a different context, when the connection to the original schema change is long forgotten.&lt;/p&gt;

&lt;p&gt;None of these are individually catastrophic. Together, they add up to a process that works in theory but fails in practice.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Idea
&lt;/h2&gt;

&lt;p&gt;Here's the thought that eventually became this project: if model generation can be invoked from the command line (and it can, via EF Core Power Tools CLI), then model generation can be part of the build.&lt;/p&gt;

&lt;p&gt;Not a separate step you remember to run. Not a manual process with unclear ownership. Just part of what happens when you run &lt;code&gt;dotnet build&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The build already knows how to compile your code. It already knows how to restore packages, run analyzers, produce artifacts. Adding "generate EF Core models from the schema" to that list isn't conceptually different from any other build-time code generation.&lt;/p&gt;

&lt;p&gt;If the build handles it, the ownership question disappears. The build owns it. If the build handles it with consistent configuration, the drift disappears. Everyone gets the same output. If the build handles it on every machine, the tooling problem disappears. No special extensions required.&lt;/p&gt;

&lt;p&gt;This is JD.Efcpt.Build: an MSBuild integration that makes EF Core model generation automatic.&lt;/p&gt;




&lt;h2&gt;
  
  
  How It Actually Works
&lt;/h2&gt;

&lt;p&gt;The package hooks into your build through MSBuild targets that run before compilation. When you build, it:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Finds your schema source.&lt;/strong&gt; Either a SQL Server Database Project (&lt;code&gt;.sqlproj&lt;/code&gt;) that gets compiled to a DACPAC, or a connection string pointing to a live database.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Computes a fingerprint.&lt;/strong&gt; A hash of all the inputs: the DACPAC or schema metadata, the configuration file, the renaming rules, any custom templates. This fingerprint represents "the current state of everything that affects generation."&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Compares to the previous fingerprint.&lt;/strong&gt; If they match, nothing changed, and generation is skipped. If they differ, something changed, and generation runs.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Generates models.&lt;/strong&gt; Using EF Core Power Tools CLI, same as you'd run manually, but automated. Output goes to &lt;code&gt;obj/efcpt/Generated/&lt;/code&gt; with a &lt;code&gt;.g.cs&lt;/code&gt; extension.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Adds generated files to compilation.&lt;/strong&gt; Automatically. You don't edit your project file or manage includes.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The fingerprinting is what makes this practical. You don't want generation running on every build. That would be slow and developers would hate it. The fingerprint check is fast (XxHash64, designed for exactly this kind of content comparison), so incremental builds have essentially zero overhead. Generation only runs when inputs actually change.&lt;/p&gt;
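&lt;p&gt;The shape of that check can be sketched in a few lines. This illustration uses SHA-256 from the standard library purely to stay self-contained (the package itself uses XxHash64, and its actual input gathering and cache storage are internal details):&lt;/p&gt;

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Illustrative-only fingerprint: hash every input that affects
// generation (schema, config, renaming rules, templates) into one
// value, and regenerate only when that value changes.
public static class Fingerprint
{
    public static string Compute(params string[] inputContents)
    {
        var combined = string.Join("\n", inputContents);
        var bytes = SHA256.HashData(Encoding.UTF8.GetBytes(combined));
        return Convert.ToHexString(bytes);
    }

    // If the stored fingerprint from the last build matches, skip work.
    public static bool ShouldRegenerate(string previous, params string[] inputs) =>
        Compute(inputs) != previous;
}
```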




&lt;h2&gt;
  
  
  Two Ways to Get Your Schema
&lt;/h2&gt;

&lt;p&gt;Different teams manage database schemas differently, so the package supports two modes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DACPAC Mode&lt;/strong&gt; is for teams with SQL Server Database Projects. You have a &lt;code&gt;.sqlproj&lt;/code&gt; that defines your schema in version-controlled SQL files. The package builds this project to produce a DACPAC, then generates models from that DACPAC.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;PropertyGroup&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;EfcptSqlProj&amp;gt;&lt;/span&gt;..\Database\MyDatabase.sqlproj&lt;span class="nt"&gt;&amp;lt;/EfcptSqlProj&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/PropertyGroup&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is nice because your schema is code. It lives in source control. Changes go through pull requests. The DACPAC is a build artifact, and models are derived from that artifact deterministically.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Connection String Mode&lt;/strong&gt; is for teams without database projects. Maybe you apply migrations to a dev database and want to scaffold from that. Maybe you're working against a cloud database. Maybe you just don't want to deal with DACPACs.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;PropertyGroup&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;EfcptConnectionString&amp;gt;&lt;/span&gt;$(DB_CONNECTION_STRING)&lt;span class="nt"&gt;&amp;lt;/EfcptConnectionString&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/PropertyGroup&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The package connects, queries system tables to understand the schema, and generates from that. The fingerprint is computed from the schema metadata, so incremental builds still work. If the schema hasn't changed, generation is skipped.&lt;/p&gt;

&lt;p&gt;Both modes use the same configuration files and produce the same kind of output. They just differ in where the schema comes from.&lt;/p&gt;




&lt;h2&gt;
  
  
  Setting It Up
&lt;/h2&gt;

&lt;p&gt;The minimum setup is almost trivial:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;ItemGroup&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;PackageReference&lt;/span&gt; &lt;span class="na"&gt;Include=&lt;/span&gt;&lt;span class="s"&gt;"JD.Efcpt.Build"&lt;/span&gt; &lt;span class="na"&gt;Version=&lt;/span&gt;&lt;span class="s"&gt;"1.0.0"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/ItemGroup&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you have a &lt;code&gt;.sqlproj&lt;/code&gt; in your solution and an &lt;code&gt;efcpt-config.json&lt;/code&gt; in your project directory, that's it. Run &lt;code&gt;dotnet build&lt;/code&gt; and models appear.&lt;/p&gt;

&lt;p&gt;For more control, you add configuration. The &lt;code&gt;efcpt-config.json&lt;/code&gt; controls generation behavior:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"names"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"root-namespace"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"MyApp.Data"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"dbcontext-name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ApplicationDbContext"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"code-generation"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"use-nullable-reference-types"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"enable-on-configuring"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Setting &lt;code&gt;enable-on-configuring&lt;/code&gt; to &lt;code&gt;false&lt;/code&gt; means your DbContext won't have a hardcoded connection string. You configure the connection in your DI container, where it belongs.&lt;/p&gt;
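&lt;p&gt;With that disabled, wiring the connection up at startup is a few lines at most. A minimal sketch, assuming ASP.NET Core, the SQL Server provider, and a hypothetical &lt;code&gt;"DefaultConnection"&lt;/code&gt; key in &lt;code&gt;appsettings.json&lt;/code&gt;:&lt;/p&gt;

```csharp
// Program.cs — registering the generated context through DI.
// "DefaultConnection" and the SQL Server provider are assumptions;
// substitute whatever your configuration and database actually use.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDbContext<ApplicationDbContext>(options =>
    options.UseSqlServer(
        builder.Configuration.GetConnectionString("DefaultConnection")));

var app = builder.Build();
app.Run();
```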

&lt;p&gt;If your database uses naming conventions that don't map cleanly to C#, you add renaming rules:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"SchemaName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"dbo"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Tables"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"Name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"tbl_Users"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"NewName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"User"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"Columns"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"Name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"user_id"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"NewName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Id"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now &lt;code&gt;tbl_Users.user_id&lt;/code&gt; becomes &lt;code&gt;User.Id&lt;/code&gt;. The database can keep its conventions, your C# code can have its conventions, and the mapping is explicit and version-controlled.&lt;/p&gt;




&lt;h2&gt;
  
  
  What About Custom Code?
&lt;/h2&gt;

&lt;p&gt;A reasonable concern: "I have computed properties and validation methods on my entities. Won't regeneration overwrite those?"&lt;/p&gt;

&lt;p&gt;This is what partial classes are for.&lt;/p&gt;

&lt;p&gt;The generated entity is one half:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// obj/efcpt/Generated/User.g.cs&lt;/span&gt;
&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;partial&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;User&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;Id&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;get&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;set&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;Email&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;get&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;set&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;FirstName&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;get&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;set&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;LastName&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;get&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;set&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Your custom logic is the other half:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Models/User.cs&lt;/span&gt;
&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;partial&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;User&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;FullName&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s"&gt;$"&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;FirstName&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;LastName&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="kt"&gt;bool&lt;/span&gt; &lt;span class="n"&gt;HasValidEmail&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;Email&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nf"&gt;Contains&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"@"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;??&lt;/span&gt; &lt;span class="k"&gt;false&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Both compile into a single class. The generated half gets regenerated every build. Your custom half stays exactly as you wrote it.&lt;/p&gt;

&lt;p&gt;This separation is actually cleaner than mixing generated and custom code in the same file. You know at a glance what's generated (&lt;code&gt;.g.cs&lt;/code&gt; in &lt;code&gt;obj/&lt;/code&gt;) and what's yours (everything else).&lt;/p&gt;




&lt;h2&gt;
  
  
  CI/CD Without Special Steps
&lt;/h2&gt;

&lt;p&gt;One of the pain points I mentioned earlier was CI/CD. Manual regeneration doesn't work in automated pipelines. You're stuck either committing generated code (merge conflicts) or maintaining custom regeneration scripts (fragile).&lt;/p&gt;

&lt;p&gt;With the build handling generation, CI just works:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v4&lt;/span&gt;

  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Setup .NET&lt;/span&gt;
    &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-dotnet@v4&lt;/span&gt;
    &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;dotnet-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;10.0.x'&lt;/span&gt;

  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Build&lt;/span&gt;
    &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dotnet build --configuration Release&lt;/span&gt;
    &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;DB_CONNECTION_STRING&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.DB_CONNECTION_STRING }}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;No special steps for EF Core generation. The build handles it. On .NET 10+, the package uses &lt;code&gt;dotnet dnx&lt;/code&gt; to execute the tool directly from the package feed without requiring installation. On older versions, it uses tool manifests or global tools.&lt;/p&gt;

&lt;p&gt;Pull requests that include schema changes automatically include the corresponding model changes, because both happen during the build. Schema and code are validated together.&lt;/p&gt;




&lt;h2&gt;
  
  
  When Things Go Wrong
&lt;/h2&gt;

&lt;p&gt;Things will go wrong. Here's how you figure out what happened.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Enable verbose logging:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;PropertyGroup&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;EfcptLogVerbosity&amp;gt;&lt;/span&gt;detailed&lt;span class="nt"&gt;&amp;lt;/EfcptLogVerbosity&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/PropertyGroup&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Build output now includes exactly what's happening: which inputs were found, what fingerprint was computed, whether generation ran or was skipped.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Check the resolved inputs:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;After a build, look at &lt;code&gt;obj/efcpt/resolved-inputs.json&lt;/code&gt;. This shows exactly what the package found for each input. If something's wrong, you'll see it here.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Inspect the fingerprint:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The fingerprint is stored at &lt;code&gt;obj/efcpt/fingerprint.txt&lt;/code&gt;. If generation is running unexpectedly (or not running when it should), the fingerprint tells you whether inputs changed from the package's perspective.&lt;/p&gt;




&lt;h2&gt;
  
  
  Who This Is For
&lt;/h2&gt;

&lt;p&gt;I want to be honest about fit.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;This is probably for you if:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You're doing database-first development and you've experienced the regeneration coordination problem. The "did someone regenerate?" question has come up, and the answer wasn't always clear.&lt;/p&gt;

&lt;p&gt;Your schema changes regularly. If you're shipping schema changes weekly, manual regeneration becomes friction.&lt;/p&gt;

&lt;p&gt;You want builds that work identically everywhere. Local machines, CI servers, new developer laptops. Everyone should get the same generated code from the same inputs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;This probably isn't for you if:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Your schema is essentially static. If schema changes are rare, manual regeneration isn't that painful.&lt;/p&gt;

&lt;p&gt;You're using code-first migrations. If migrations are your source of truth, you're solving a different problem.&lt;/p&gt;

&lt;p&gt;You're not using EF Core Power Tools already. This package automates EF Core Power Tools; if you're using a different generation approach, this doesn't apply.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Groan, Addressed
&lt;/h2&gt;

&lt;p&gt;"Where did database first go?"&lt;/p&gt;

&lt;p&gt;EF Core launched without reverse engineering years ago, but the tooling has long since caught up. EF Core Power Tools exists. The CLI exists. The capability is there.&lt;/p&gt;

&lt;p&gt;But capability isn't the same as workflow. Having the tools isn't the same as having a process that works reliably across a team, across time, across environments.&lt;/p&gt;

&lt;p&gt;JD.Efcpt.Build is an attempt to close that gap. To take the capability that exists and make it automatic. To make the build the owner of model generation, so humans don't have to remember to do it.&lt;/p&gt;

&lt;p&gt;Your database schema is the source of truth. This package just makes sure your code reflects that truth, every time you build, without manual intervention.&lt;/p&gt;

&lt;p&gt;One less thing to coordinate. One less thing to forget. One less thing to go wrong in production because a manual step got skipped.&lt;/p&gt;

&lt;p&gt;That's the pitch. Give it a try if it fits your situation.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;JD.Efcpt.Build is &lt;a href="https://github.com/JerrettDavis/JD.Efcpt.Build" rel="noopener noreferrer"&gt;open source&lt;/a&gt; and available on &lt;a href="https://www.nuget.org/packages/JD.Efcpt.Build" rel="noopener noreferrer"&gt;NuGet&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>database</category>
      <category>automation</category>
      <category>programming</category>
    </item>
    <item>
      <title>You Don't Hate Abstractions</title>
      <dc:creator>Jerrett Davis</dc:creator>
      <pubDate>Mon, 08 Dec 2025 01:08:42 +0000</pubDate>
      <link>https://dev.to/jerrettdavis/you-dont-hate-abstractions-3gl8</link>
      <guid>https://dev.to/jerrettdavis/you-dont-hate-abstractions-3gl8</guid>
      <description>&lt;p&gt;It’s an hour until you’re free for the weekend, and you’re trying to knock out one&lt;br&gt;
last ticket before you escape into whatever assuredly action-packed plans await you.&lt;br&gt;
You spot a seemingly harmless task: "Add Middle Initial to User Name Display."&lt;/p&gt;

&lt;p&gt;You chuckle. Easy. A palate cleanser. A victory lap.&lt;br&gt;&lt;br&gt;
You assign the ticket, flip it from &lt;code&gt;New&lt;/code&gt; to &lt;code&gt;Active&lt;/code&gt;, and let your IDE warm up&lt;br&gt;&lt;br&gt;
while you drift into a pleasant daydream about not being here.&lt;/p&gt;

&lt;p&gt;But then the search results begin to appear.&lt;br&gt;&lt;br&gt;
Slowly. Line by line.&lt;/p&gt;

&lt;p&gt;And your reverie begins to rot.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;code&gt;IUserNameStrategy&lt;/code&gt;, &lt;code&gt;UserNameContext&lt;/code&gt;, &lt;code&gt;UserNameDisplayStrategyFactory&lt;/code&gt;,&lt;br&gt;&lt;br&gt;
&lt;code&gt;StandardUserNameDisplayStrategy&lt;/code&gt;, &lt;code&gt;FormalUserNameDisplayStrategy&lt;/code&gt;,&lt;br&gt;&lt;br&gt;
&lt;code&gt;InformalUserNameDisplayStrategy&lt;/code&gt;, &lt;code&gt;UserNameDisplayModule&lt;/code&gt;, …&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Incredulous, your chuckle hardens into a throat-scraping noise somewhere&lt;br&gt;
between a laugh and a cry.&lt;/p&gt;

&lt;p&gt;"What in the Gang-of-Fuck is happening here," you think, feeling your pulse tick up.&lt;br&gt;
"Either someone read &lt;a href="https://refactoring.guru" rel="noopener noreferrer"&gt;Refactoring.Guru&lt;/a&gt; like it was scripture and decided to&lt;br&gt;
baptize the codebase in every pattern they learned, or a grizzled enterprise&lt;br&gt;
veteran escaped some Netflix-adjacent monolith and is trying to upskill in&lt;br&gt;
TypeScript. Because surely no sane developer would build this much redirection&lt;br&gt;
for such a trivial feature… right?"&lt;/p&gt;

&lt;p&gt;That tiny, spiraling task is a perfect microcosm of a continuous debate across engineering circles: when does abstraction help, and when does it become a hindrance?&lt;/p&gt;



&lt;p&gt;I recently stumbled across the humorous article &lt;a href="https://dev.to/adamthedeveloper/youre-not-building-netflix-stop-coding-like-you-are-1707?ref=dailydev"&gt;You're Not Building Netflix: Stop Coding Like You Are&lt;/a&gt; by Adam — The Developer. But though it resonates in many ways, its broader critique is ultimately misdirected.&lt;/p&gt;

&lt;p&gt;Adam opens with a more complete version of the code mocked in my prologue, and uses the verbosity and obscurity of that abstraction pile as a springboard for a near-blanket rebuke of enterprise patterns. While the author does allow for some abstractions, they're limited in application and scope.&lt;/p&gt;

&lt;p&gt;The problem isn’t that the complaint is wrong. It’s that it points the finger at the wrong culprit, missing the forest for the trees.&lt;/p&gt;

&lt;p&gt;Abstractions are fundamental and essential. They are the elementary particles of software, the quarks and leptons that bind into the subatomic structures that become the atoms our earliest techno-wizards fused into molecules. Today we combine those same basic elements into the compounds and contraptions made for use by millions. Without abstractions, we are left helpless in the ever-increasing parallel streams of pulsating electrical currents, rushing through specialized, intricately forged rocks that artisan wizards once trapped lightning inside and somehow convinced to think.&lt;/p&gt;

&lt;p&gt;But even with all that power at our disposal, the way we use these building blocks matters. Chemistry offers a fitting parallel. Food chemists, for example, have spent decades learning how to repurpose industrial byproducts into stabilizers, textures, preservatives, and anything else that can be quietly slipped into a recipe. Much of this work is impressive and innovative, but some of it is little more than creative waste disposal disguised as convenience: a brilliant hack in the short term and a lingering problem in the long one.&lt;/p&gt;

&lt;p&gt;Developers can fall into the same pattern. We learn a new technique or pattern or clever trick and then spend the next year pouring it into every beaker we can find. We are not always discovering better processes. Sometimes we are just repackaging the same product and calling it progress. When that happens, we are not solving problems. We are manufacturing new ones that future maintainers will curse our names over.&lt;/p&gt;

&lt;p&gt;A developer must be architect, engineer, mechanic, and driver all at once. It is fine to know how to fix a specific issue, but once that problem is solved, that knowledge should become a building block for solving the next one. If we keep returning to maintain the same solution day after day, then what we built was never a solution at all. It was a slow-burning maintenance burden that we misfiled as "done."&lt;/p&gt;

&lt;p&gt;Abstractions exist to reduce complexity, not to multiply it. Their purpose is to lighten the cognitive load, to lift the details off your desk so you can see the shape of the problem in front of you. Terse, repetitive, wire-on-the-floor code that looks like it tumbled out of a &lt;a href="https://youtu.be/MvEXkd3O2ow" rel="noopener noreferrer"&gt;flickering green CRT from 1999 may impress the authors who have stared at machine code long enough to discern hair color from a data stream&lt;/a&gt;, but it does not serve the broader team or the system that outlives them. Abstractions only do their job when they are aligned with the actual problem being solved, and that brings us to the part many developers skip entirely: modeling your software after the problem you are solving.&lt;/p&gt;
&lt;h2&gt;
  
  
  Seeing the Problem Before Solving It
&lt;/h2&gt;

&lt;p&gt;When you build a system, any system, even a disposable script, the first responsibility is understanding why it exists. What problem does it address. Has that problem been solved before. If so, what makes the existing solution insufficient for you now. Understanding that difference is the foundation that everything else must sit on.&lt;/p&gt;

&lt;p&gt;I learned this the hard way as a homeowner. My house is old enough to have grounded me if I talked back to it as a teenager. A couple of years ago we went through a near-total remodel. We did some work before and shortly after our daughter was born, but within a year new problems started surfacing. We brought in a structural engineer. The slab foundation was heaving. After some exploration we discovered the culprit: the original cast iron sewage line had split along both the top and bottom, creating pressure changes and settling issues throughout the house.&lt;/p&gt;

&lt;p&gt;The fix was not small. We pulled up every inch of flooring. Replaced baseboards. Repaired drywall. Fixed the broken line. Repainted entire sections. Redid trim. Installed piers. Pumped in foundation foam. Cashed in favors. Lost many weekends. And yet, even with all that, it still cost far less than buying an equivalent house in the current market at the current rates.&lt;/p&gt;

&lt;p&gt;The lesson is simple. Things are rarely a total loss. Even when a structure looks hopeless, even when someone has effectively set fire to best practices, even when regulations or markets or technologies have shifted dramatically, there are almost always assets worth salvaging inside the wreckage. You should not bulldoze unless you know you have truly exhausted the alternatives.&lt;/p&gt;

&lt;p&gt;Before throwing away any system and starting another from scratch, assess what you already have. Understand what is broken, what is sound, and what simply needs reinforcement. Software, like houses, tends to rot in specific places for specific reasons. Understanding those reasons is what separates renovation from reinvention.&lt;/p&gt;
&lt;h2&gt;
  
  
  The Nightstand Problem
&lt;/h2&gt;

&lt;p&gt;The same principle applies at a smaller scale. You may already own a perfectly functional house with perfectly functional furniture, yet still want a nightstand you do not currently possess. Your choices are straightforward. You can hope someone has decided to let go of one that happens to meet your criteria. That is the open source gamble. You can buy one, constrained only by budget and whatever definition of quality the manufacturer is committed to that week. Or you can build one yourself, limited only by your skills, imagination, and tolerance for sawdust.&lt;/p&gt;

&lt;p&gt;If your goal is personal satisfaction or experimentation, then by all means build the nightstand. But if your goal is to sell or support a product that helps make money, you are no longer just hobby-carpenting. You are operating in the domain of enterprise software.&lt;/p&gt;

&lt;p&gt;And when you are building enterprise software, you must view the system from the top down while designing from the bottom up. From the top down, you think about every consumer of your system. In academic terms these are actors. Any system intended to be used, whether by humans or machines, is defined by the interactions between its actors and its responsibilities. Even an autonomous system is both an actor and the architected environment it operates within.&lt;/p&gt;

&lt;p&gt;This perspective matters because it forces your abstractions to model the real world rather than some internal taxonomy of clever names. Good abstractions emerge from an understanding of the domain. Bad abstractions emerge from an understanding of a design pattern book.&lt;/p&gt;

&lt;p&gt;And if you want maintainability, clarity, and longevity, you always want the first.&lt;/p&gt;
&lt;h2&gt;
  
  
  Building from Both Directions
&lt;/h2&gt;

&lt;p&gt;Designing software means working from two directions at once. On one hand, you must understand the behavior your system must exhibit. On the other hand, you must understand the shape of the world it lives in. Systems are not invented out of whole cloth; they crystallize out of the interactions between intentions and constraints. If you ignore either direction, you end up with something brittle, confused, overbuilt, or perpetually unfinished.&lt;/p&gt;

&lt;p&gt;There is nothing sacred about any particular architectural style. Pick Domain-Driven, Clean, Vertical Slice, Hexagonal, Layered, or something entirely different. The choice matters far less than your consistency and your commitment to encapsulating concerns properly. Different problems require different arrangements of the same conceptual ingredients. From high altitude, two domains may look identical. Once you descend toward the details, you often discover that one is a bird and the other is an airplane. The trick is knowing when to zoom out and when to zoom in.&lt;/p&gt;

&lt;p&gt;Plenty of developers jump immediately into code, but the outside of the system is always the real beginning. What is it supposed to do. Who uses it. Who does it talk to. Who builds it. Who runs it. Who deploys it. Who monitors it. How do you prove it works. These questions define the problem space, and the problem space determines the boundaries and responsibilities your abstractions must reflect.&lt;/p&gt;

&lt;p&gt;Even something as small as a script must obey this reality.&lt;/p&gt;

&lt;p&gt;Consider a simple provisioning script. First it reads a certificate from the local filesystem so it can authenticate with a remote host. Next it opens an SFTP connection to a distribution server and retrieves a zip file. Then it extracts the archive to a temporary directory provided by the operating system. Finally it executes whatever installers or configuration commands the archive contains.&lt;/p&gt;

&lt;p&gt;On the surface this is straightforward, yet every step is shaped by the environment in which it operates. Tools differ between platforms. Available executables change. File paths and separators vary. Temporary directory locations vary. Even the existence or reliability of SFTP clients varies. None of this means we must implement every possible alternative upfront, but it does mean we should acknowledge the existence of alternatives and avoid designing ourselves into a corner where adding support later requires rewriting the entire script.&lt;/p&gt;
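&lt;p&gt;The steps above can be sketched with the environment-sensitive piece behind a small seam. This is a hypothetical illustration: &lt;code&gt;ISftpClient&lt;/code&gt; and &lt;code&gt;install.exe&lt;/code&gt; are stand-ins, not a real library:&lt;/p&gt;

```csharp
using System.Diagnostics;
using System.IO;
using System.IO.Compression;

// The one genuinely environment-specific step lives behind a seam.
// In practice, the certificate would be handed to whatever client
// implements this on your platform.
public interface ISftpClient
{
    void Download(string remotePath, string localPath);
}

public static class Provisioner
{
    public static void Run(ISftpClient sftp, string certPath, string remoteZip)
    {
        // 1. Read the certificate used to authenticate with the host.
        byte[] cert = File.ReadAllBytes(certPath);

        // 2. Fetch the archive from the distribution server.
        string zipPath = Path.Combine(Path.GetTempPath(), "payload.zip");
        sftp.Download(remoteZip, zipPath);

        // 3. Extract to an OS-provided temporary directory.
        string workDir = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        ZipFile.ExtractToDirectory(zipPath, workDir);

        // 4. Execute whatever installer the archive contains.
        Process.Start(Path.Combine(workDir, "install.exe"))?.WaitForExit();
    }
}
```

&lt;p&gt;Swapping SFTP implementations, or pointing step 4 at a platform-appropriate installer, now touches one seam instead of the whole script.&lt;/p&gt;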

&lt;p&gt;This principle scales upward. You may choose to place your application data inside a database, but scattering SQL statements across your codebase is an anti-pattern in nearly any architecture not explicitly about database engines or ORM internals. Unless you are writing an RDBMS, data access is rarely the star of the show. The real substance lives in the application logic that interprets, transforms, regulates, or composes that data. Mixing data access concerns directly into that logic creates friction. Separating them reduces friction, which improves maintainability, which improves confidence, which improves speed.&lt;/p&gt;
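&lt;p&gt;A minimal sketch of that separation, with hypothetical names: the business logic depends on a boundary, and only the boundary's implementation knows the database exists:&lt;/p&gt;

```csharp
using System;

// The boundary. Its implementation can use EF Core, raw SQL, or a
// flat file — the application logic neither knows nor cares.
public interface IOrderStore
{
    Order? FindById(int id);
    void Save(Order order);
}

public sealed class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public sealed class DiscountService
{
    private readonly IOrderStore _orders;
    public DiscountService(IOrderStore orders) => _orders = orders;

    // The business logic reads cleanly: no connection strings, no SQL.
    public void ApplyLoyaltyDiscount(int orderId, decimal rate)
    {
        var order = _orders.FindById(orderId)
            ?? throw new InvalidOperationException("Order not found.");
        order.Total -= order.Total * rate;
        _orders.Save(order);
    }
}
```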

&lt;p&gt;The guiding question is always the same: does this choice help my system model the problem more clearly, or does it merely model my current implementation?&lt;br&gt;
If it is the former, great. If it is the latter, you are accumulating technical debt even if the code looks clean.&lt;/p&gt;

&lt;p&gt;Abstractions aligned with the domain allow your system to grow gracefully. But abstractions aligned with your tooling force your system to grow awkwardly and inconsistently.&lt;/p&gt;

&lt;p&gt;This is the difference between designing from both directions and designing from just one.&lt;/p&gt;
&lt;h2&gt;
  
  
  Behavior as the Backbone of Architecture
&lt;/h2&gt;

&lt;p&gt;At some point in every software project, the discussion inevitably turns to architecture. Engineers debate whether they should adopt Domain-Driven Design or Clean Architecture, whether their services ought to be hexagonal, layered, vertical-sliced, modular, or some other fashionable geometric configuration, and whether interfaces belong everywhere or nowhere at all. These conversations are interesting, even entertaining, but they often drift into abstraction for abstraction’s sake. The problem is rarely the patterns themselves; rather, it is that these debates frequently occur in a vacuum, disconnected from the actual behaviors the system must exhibit. Humans love patterns, but software only cares about whether it does the right thing.&lt;/p&gt;

&lt;p&gt;The most reliable way to design a system, therefore, is to begin with its behavior. A system exists to do something, and if we do not articulate that something clearly, everything downstream becomes guesswork and improvisation. This is precisely where behavior-driven development demonstrates its value. I explore this more deeply in &lt;a href="https://jerrettdavis.com/blog/posts/making-the-business-write-your-tests-with-bdd" rel="noopener noreferrer"&gt;BDD: Make the Business Write Your Tests&lt;/a&gt;, but in short, BDD forces us to express the responsibilities of the system in language that is precise, verifiable, and shared by both technical and nontechnical stakeholders. A behavior becomes a specification, a test, a boundary, and a contract all at once.&lt;/p&gt;

&lt;p&gt;From an architectural perspective, this shift in thinking is transformative. When we model the largest and most meaningful behaviors first and place an abstraction around them, we create an outer shell that defines the system at a conceptual level. From there, we move inward, breaking behaviors down iteratively into smaller and more specific responsibilities. Each division suggests a natural abstraction, but these abstractions are not arbitrary. They emerge directly from the behavior they represent. They are shaped not by the developer’s preferred patterns but by the needs of the domain itself. This recursive approach ensures that abstractions mirror intent rather than implementation details.&lt;/p&gt;

&lt;p&gt;Importantly, this recursion is not fractal. We are not attempting to subdivide reality endlessly. Rather, we refine behaviors only until they are sufficiently well understood to be implemented cleanly. Much as one does not explain quantum chromodynamics to teach someone how to scramble an egg, we do not decompose software beyond what clarity and accuracy require. And while many languages encourage the use of interfaces as the primary mechanism for abstraction, the interface is not the abstraction itself. It is merely a convenient way to enforce a contract. The real abstraction is the conceptual boundary it represents. Whether that boundary is expressed as an interface, a type, a configuration object, or a module is irrelevant as long as the contract is clear and consistent.&lt;/p&gt;

&lt;p&gt;This is why starting with abstractions like an &lt;code&gt;IHost&lt;/code&gt; that orchestrates an &lt;code&gt;IApplication&lt;/code&gt; works so well. These constructs mirror the system’s highest-level behaviors. Once defined, they allow us to drill inward, step by step, carving out responsibilities until the domain takes shape as a set of interlocking, behavior-aligned components. When abstractions are created this way, they tend to be stable. They align with the problem domain rather than the transient needs of a particular implementation, and therefore they seldom need to change unless the underlying behavior changes.&lt;/p&gt;

&lt;p&gt;Frequent modification of an abstraction is a warning sign. A well-formed abstraction typically changes only under three conditions: the business behavior has evolved, an overlooked edge case has surfaced, or the original abstraction contained a conceptual flaw. Outside of those circumstances, the need to repeatedly modify an abstraction usually indicates that its boundaries were drawn incorrectly. When adjusting one behavior forces changes across multiple components, the issue is rarely "too many" or "too few" abstractions in an abstract sense. Instead, it is a failure of alignment. The abstraction does not adequately contain the concerns it is supposed to model, and complexity is leaking out of its container and into the rest of the codebase.&lt;/p&gt;

&lt;p&gt;Modern tooling makes this problem even more evident. With the availability of source generators, analyzers, expressive type systems, code scaffolding, and dynamic configuration pipelines, there is increasingly little justification for sprawling boilerplate or repetitive structural code. Boilerplate is not a mark of engineering rigor. It is simply untested and uninteresting glue repeated dozens of times because someone did not take steps to automate it. Good abstractions, by contrast, elevate meaning. They allow the domain to be expressed directly without forcing the developer to wade through noise.&lt;/p&gt;

&lt;p&gt;This leads naturally to what I consider the ideal state of modern development: a system that is entirely automated from the moment code touches a repository until the moment it reaches a production-like environment. Compilation, testing, packaging, deployment, orchestration, and infrastructure provisioning should not require human involvement. The only manual step should be expressing intent in the form of new or updated behaviors. Every piece of functionality within the system should originate as a behavior-driven specification capable of running the entire application inside a controlled test environment, complete with containerized dependencies and UI automation tools such as Playwright. Those same tests should also be able to stub dependencies so the scenarios can run in isolation. When the system itself is treated as the first unit under test, orchestration becomes a priority rather than an afterthought.&lt;/p&gt;

&lt;p&gt;Achieving this level of automation depends on stability, and that stability depends on disciplined abstraction. Any element that may vary across environments, including configuration values, credentials, infrastructure, connection details, and policies, must be isolated behind settings and contracts that the application can consume without knowing anything about the environment it runs in. Once this encapsulation is in place, behavior-driven specifications can operate confidently, verifying the correctness of the system from the outside in even while its internal components remain free to evolve.&lt;/p&gt;
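&lt;p&gt;As a minimal sketch of that encapsulation (every type and member name below is invented for illustration), the application depends on a settings contract and never inspects its environment directly:&lt;/p&gt;

```csharp
using System;

// Invented names for illustration: the application consumes ISettingsSource,
// never the environment itself.
public sealed record StorageSettings(string ConnectionString, int TimeoutSeconds);

public interface ISettingsSource
{
    StorageSettings Load();
}

// Production might bind from environment variables, a vault, or a config file.
public sealed class EnvironmentSettingsSource : ISettingsSource
{
    public StorageSettings Load() => new(
        Environment.GetEnvironmentVariable("STORAGE_CONNECTION") ?? "in-memory",
        int.TryParse(Environment.GetEnvironmentVariable("STORAGE_TIMEOUT"), out var t) ? t : 30);
}

// Tests supply fixed values; the consuming code cannot tell the difference.
public sealed class FixedSettingsSource : ISettingsSource
{
    private readonly StorageSettings _settings;

    public FixedSettingsSource(StorageSettings settings) => _settings = settings;

    public StorageSettings Load() => _settings;
}
```

&lt;p&gt;Because both sources satisfy the same contract, a behavior-driven scenario can swap one for the other without the system under test ever knowing which environment it is running in.&lt;/p&gt;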

&lt;p&gt;Finally, it is worth stating explicitly that hand-writing the repetitive boilerplate of a CRUD-heavy application, such as repositories, controllers, mappers, DTOs, validators, or entire edge-to-edge layers, is not admirable craftsmanship. It is busywork. If you have twenty entities with identical structural behavior and you are manually writing twenty sets of nearly identical files, the issue is not insufficient discipline. It is insufficient automation. Whether through source generators, templates, reflection-based pipelines, or dynamic modules, these problems can and should be solved generically. Engineers should focus their manual effort on the places where meaning lives: the domain, the behavior, and the boundaries.&lt;/p&gt;
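&lt;p&gt;A small sketch of what "solving the general case" can look like, with invented type names: one generic in-memory repository stands in for twenty near-identical hand-written ones.&lt;/p&gt;

```csharp
using System;
using System.Collections.Generic;

// Invented types for illustration. Any entity that exposes an Id
// gets repository behavior from a single generic implementation.
public interface IEntity
{
    Guid Id { get; }
}

public interface IRepository<T> where T : IEntity
{
    void Add(T entity);
    T? Get(Guid id);
    IReadOnlyCollection<T> All();
}

public sealed class InMemoryRepository<T> : IRepository<T> where T : IEntity
{
    private readonly Dictionary<Guid, T> _store = new();

    public void Add(T entity) => _store[entity.Id] = entity;

    public T? Get(Guid id) => _store.TryGetValue(id, out var entity) ? entity : default;

    public IReadOnlyCollection<T> All() => _store.Values;
}

// Example entity: no per-entity repository file needed.
public sealed record Widget(Guid Id, string Name) : IEntity;
```

&lt;p&gt;Swap the dictionary for EF Core, Dapper, or a generated adapter and the twenty entities still share one implementation; the variation, if any, lives where the domain genuinely differs.&lt;/p&gt;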

&lt;p&gt;Good abstractions do not eliminate complexity; they contain it. Bad abstractions distribute it. And behavior-driven, problem-first design is how we tell the difference.&lt;/p&gt;
&lt;h2&gt;
  
  
  From Story to Spec: Describing Behavior First
&lt;/h2&gt;

&lt;p&gt;To make this concrete, return to our original "Add Middle Initial to User Name Display" ticket. Most teams would handle this with a couple of unit tests directly against whatever &lt;code&gt;UserNameService&lt;/code&gt; or &lt;code&gt;UserNameFormatter&lt;/code&gt; happens to exist. The tests would exercise a particular class, call a particular method, and assert on a particular string. That can work, but it starts at the implementation, not at the behavior.&lt;/p&gt;

&lt;p&gt;If instead we begin with behavior, the specification sounds more like this:&lt;/p&gt;

&lt;p&gt;When a user has a middle name, show the middle initial between the first and last name.&lt;br&gt;
 When a user does not have a middle name, omit the gap entirely.&lt;br&gt;
 When a display style changes (for example, "formal" versus "informal"), the rules about how the middle initial appears should still hold.&lt;/p&gt;

&lt;p&gt;That is the contract. It does not mention classes, factories, or strategies. It talks about what the system must do from the outside.&lt;/p&gt;

&lt;p&gt;With something like my project &lt;a href="https://github.com/jerrettdavis/TinyBDD" rel="noopener noreferrer"&gt;TinyBDD&lt;/a&gt;, that kind of behavior becomes executable in a fairly direct way. Using the xUnit adapter, a scenario might look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;TinyBDD.Xunit&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;Xunit&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nf"&gt;Feature&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"User name display"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;
&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;UserNameDisplayScenarios&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;TinyBddXunitBase&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nf"&gt;Scenario&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Standard display includes middle initial when present"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Fact&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt; &lt;span class="nf"&gt;MiddleInitialIsRenderedWhenPresent&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;Given&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"a user with first, middle, and last name"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
                &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;UserName&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Ada"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"M"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"Lovelace"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;When&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"formatting the user name for standard display"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;user&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
                &lt;span class="n"&gt;UserNameDisplay&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Standard&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;user&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"the result places the middle initial between first and last"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;formatted&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
                &lt;span class="n"&gt;formatted&lt;/span&gt; &lt;span class="p"&gt;==&lt;/span&gt; &lt;span class="s"&gt;"Ada M. Lovelace"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AssertPassed&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nf"&gt;Scenario&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Standard display omits missing middle initial"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Fact&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt; &lt;span class="nf"&gt;NoMiddleInitialWhenMissing&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;Given&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"a user with only first and last name"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
                &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;UserName&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Ada"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"Lovelace"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;When&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"formatting the user name for standard display"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;user&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
                &lt;span class="n"&gt;UserNameDisplay&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Standard&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;user&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"no dangling spaces or periods appear"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;formatted&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
                &lt;span class="n"&gt;formatted&lt;/span&gt; &lt;span class="p"&gt;==&lt;/span&gt; &lt;span class="s"&gt;"Ada Lovelace"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AssertPassed&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In these scenarios, the behavior is the first-class citizen. The test does not care whether you use a &lt;code&gt;UserNameDisplayStrategyFactory&lt;/code&gt;, a dependency-injected &lt;code&gt;IUserNameFormatter&lt;/code&gt;, or a static helper hidden in a dusty corner of your codebase. It cares that given a user, when you format their name, you get the right string.&lt;/p&gt;

&lt;p&gt;The abstractions are already visible in the code, but only as a side effect of expressing behavior:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;UserName&lt;/code&gt; represents the domain concept of a person’s name, not a UI or persistence model.&lt;br&gt;
 &lt;code&gt;UserNameDisplay.Standard&lt;/code&gt; represents a particular display style that the business cares about.&lt;br&gt;
 The behavior is encoded in the transition from &lt;code&gt;UserName&lt;/code&gt; to the formatted string, not in a particular class hierarchy.&lt;/p&gt;

&lt;p&gt;Notice what is not present: we do not have separate strategies for every permutation of name structure, locale, and display preference. We have a single coherent abstraction around "displaying a user name in the standard way," and the test drives the rules we actually need.&lt;/p&gt;
&lt;h2&gt;
  
  
  Letting Abstractions Fall Out of the Domain
&lt;/h2&gt;

&lt;p&gt;Once you have a behavior-focused spec, the abstractions almost draw themselves. One reasonable implementation might look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;sealed&lt;/span&gt; &lt;span class="k"&gt;record&lt;/span&gt; &lt;span class="nc"&gt;UserName&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;First&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="n"&gt;Middle&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;Last&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;interface&lt;/span&gt; &lt;span class="nc"&gt;IUserNameDisplay&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="nf"&gt;Format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;UserName&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;sealed&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;StandardUserNameDisplay&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;IUserNameDisplay&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="nf"&gt;Format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;UserName&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(!&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;IsNullOrWhiteSpace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Middle&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="s"&gt;$"&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;First&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Middle&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="m"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]}&lt;/span&gt;&lt;span class="s"&gt;. &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Last&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;

        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="s"&gt;$"&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;First&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Last&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;UserNameDisplay&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="k"&gt;readonly&lt;/span&gt; &lt;span class="n"&gt;IUserNameDisplay&lt;/span&gt; &lt;span class="n"&gt;Standard&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;StandardUserNameDisplay&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is not an argument that every trivial formatting problem deserves an interface and a concrete class. You could inline this logic in a static helper and your tests above would still pass. The point is that the abstraction here is small, meaningful, and directly aligned with the behavior we care about. If later the domain grows to include multiple display styles, cultures, or localization concerns, there is already a clear seam to extend. You can introduce additional &lt;code&gt;IUserNameDisplay&lt;/code&gt; implementations where and when they are genuinely needed, not because a pattern catalog declared that every noun deserves a factory.&lt;/p&gt;
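&lt;p&gt;To make that seam concrete, here is one hypothetical extension (&lt;code&gt;InformalUserNameDisplay&lt;/code&gt; is an invented name, and &lt;code&gt;UserName&lt;/code&gt;/&lt;code&gt;IUserNameDisplay&lt;/code&gt; are repeated from above for self-containment, with &lt;code&gt;Standard&lt;/code&gt; omitted for brevity). A new style slots in beside the existing one, and nothing already written needs to change:&lt;/p&gt;

```csharp
// Repeated from the article so this sketch compiles on its own.
public sealed record UserName(string First, string? Middle, string Last);

public interface IUserNameDisplay
{
    string Format(UserName name);
}

// Hypothetical second style: the informal display uses the first name only.
// Existing scenarios keep passing because the contract is untouched.
public sealed class InformalUserNameDisplay : IUserNameDisplay
{
    public string Format(UserName name) => name.First;
}

public static class UserNameDisplay
{
    // In the full version this class also exposes Standard.
    public static readonly IUserNameDisplay Informal = new InformalUserNameDisplay();
}
```

&lt;p&gt;The behavior spec for the informal style would be written first, in the same Given/When/Then shape as the earlier scenarios, and this implementation would exist only to satisfy it.&lt;/p&gt;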

&lt;p&gt;If, however, you discover that adding a new behavior requires touching half the classes in the system, that is a sign you have modeled implementation variants rather than domain concepts. The behavior spec remains constant; the code churn reveals where your abstractions are misaligned.&lt;/p&gt;

&lt;h2&gt;
  
  
  Scaling the Same Idea Up to the System Level
&lt;/h2&gt;

&lt;p&gt;So far this is all very local. A name goes in, a formatted string comes out. Real systems have much more interesting behaviors: accepting traffic, orchestrating workflows, integrating with external services, healing from transient failures, deploying safely, and so on.&lt;/p&gt;

&lt;p&gt;The same discipline still applies. You can treat the application itself as the unit under test and express its behavior with the same style of specification. A high-level scenario might read something like this:&lt;/p&gt;

&lt;p&gt;Given a configured application host and its dependencies&lt;br&gt;
 When the host starts&lt;br&gt;
 Then the public API responds to a health probe&lt;br&gt;
 And all critical services report healthy&lt;br&gt;
 And any failing dependency is surfaced clearly rather than silently ignored&lt;/p&gt;

&lt;p&gt;As an executable TinyBDD scenario, that might look like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;TinyBDD.Xunit&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;Xunit&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nf"&gt;Feature&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Application startup and health"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;
&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ApplicationHealthScenarios&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;TinyBddXunitBase&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nf"&gt;Scenario&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Host starts and exposes a healthy API surface"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Fact&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt; &lt;span class="nf"&gt;HostStartsAndReportsHealthy&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;Given&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"a test host with default configuration"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
                &lt;span class="n"&gt;TestApplicationHost&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;CreateDefault&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;When&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"the host is started"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;host&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;host&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;StartAsync&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
                &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;host&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
            &lt;span class="p"&gt;})&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"the health endpoint returns OK"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;host&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
                &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;AssertHealthEndpointOk&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;host&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"/health"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;And&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"all critical health checks pass"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;host&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
                &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;AssertCriticalChecksPass&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;host&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AssertPassed&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;private&lt;/span&gt; &lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt; &lt;span class="nf"&gt;AssertHealthEndpointOk&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;TestApplicationHost&lt;/span&gt; &lt;span class="n"&gt;host&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;path&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="c1"&gt;// This could exercise a real HTTP endpoint against a TestServer or containerized instance.&lt;/span&gt;
        &lt;span class="c1"&gt;// The assertion lives here, but the behavior is defined in the scenario above.&lt;/span&gt;
        &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;NotImplementedException&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;private&lt;/span&gt; &lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt; &lt;span class="nf"&gt;AssertCriticalChecksPass&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;TestApplicationHost&lt;/span&gt; &lt;span class="n"&gt;host&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="c1"&gt;// Could query IHealthCheckPublisher, metrics, logs, or an in-memory probe endpoint.&lt;/span&gt;
        &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;NotImplementedException&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The implementation details behind &lt;code&gt;TestApplicationHost&lt;/code&gt; are intentionally omitted here, because they are not the main point. What matters is that at the boundary, we are still describing behavior: the host starts, the API responds, health checks pass. Internally, &lt;code&gt;TestApplicationHost&lt;/code&gt; can wrap an &lt;code&gt;IHost&lt;/code&gt;, use Testcontainers, spin up a &lt;code&gt;WebApplicationFactory&lt;/code&gt;, or compose a full stack in Docker. The abstraction exists to let the behavior remain stable while infrastructure details evolve.&lt;/p&gt;
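&lt;p&gt;For readers who want a starting point, here is one possible shape for such a host. This is a sketch, not the implementation: every member name is an assumption, and the stubbed wiring stands in for whatever &lt;code&gt;WebApplicationFactory&lt;/code&gt;, Testcontainers, or &lt;code&gt;IHost&lt;/code&gt; plumbing a real project would use.&lt;/p&gt;

```csharp
using System;
using System.Threading.Tasks;

// A thin facade: the scenarios depend on these members, and the internals
// (WebApplicationFactory, Testcontainers, a plain IHost) can change freely.
// All names here are assumptions for illustration.
public sealed class TestApplicationHost : IAsyncDisposable
{
    private readonly Func<Task> _start;
    private readonly Func<string, Task<bool>> _healthProbe;

    private TestApplicationHost(Func<Task> start, Func<string, Task<bool>> healthProbe)
    {
        _start = start;
        _healthProbe = healthProbe;
    }

    // Stubbed wiring for the sketch; a real implementation would boot the
    // application and probe an actual HTTP endpoint.
    public static TestApplicationHost CreateDefault() =>
        new(() => Task.CompletedTask, _ => Task.FromResult(true));

    public Task StartAsync() => _start();

    public Task<bool> IsHealthyAsync(string path) => _healthProbe(path);

    public ValueTask DisposeAsync() => ValueTask.CompletedTask;
}
```

&lt;p&gt;The delegate-based constructor is deliberate: it keeps the facade honest about being a boundary, since any concrete startup and probing strategy can be injected without the scenarios noticing.&lt;/p&gt;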

&lt;p&gt;This is the same pattern you used on the small scale with &lt;code&gt;UserNameDisplay&lt;/code&gt;, only now it operates at the level of the entire application. The outermost abstraction represents the system as it is experienced from the outside. Everything underneath exists to satisfy that experience.&lt;/p&gt;

&lt;h2&gt;
  
  
  Declarative Core, Automated Edge
&lt;/h2&gt;

&lt;p&gt;Once specifications like these exist, they become the backbone of your automation. The natural end state is a development flow where:&lt;/p&gt;

&lt;p&gt;A new behavior is introduced as a TinyBDD scenario.&lt;br&gt;
 That scenario boots the application in a realistic but controlled environment.&lt;br&gt;
 The rest of the stack compiles, configures, and deploys itself into a test harness without manual intervention.&lt;br&gt;
 The same scenarios run in local, CI, and pre-production environments, with only configuration differing between them.&lt;/p&gt;

&lt;p&gt;The actual application code can then remain highly declarative. Controllers or handlers describe what should happen when a request arrives. Domain services express rules and policies in terms of value objects and aggregates. Infrastructure concerns hide behind interfaces and adapters. Source generators or templates can remove boilerplate around repetitive CRUD or mapping concerns. The tests remain focused on behavior: does the system do what we said it would do?&lt;/p&gt;

&lt;p&gt;Abstractions in this world are not ornamental. They are the scaffolding that holds the behavior in place while the infrastructure and implementation details shift around it. As long as the specifications stay clear and the boundaries remain aligned with the domain, you can move quickly without losing correctness. And if you ever find yourself adding yet another &lt;code&gt;UserNameDisplayStrategyFactoryFactory&lt;/code&gt; just to keep a scenario passing, you will at least have a clear, behavior-centric lens through which to recognize that something has gone wrong.&lt;/p&gt;

&lt;h2&gt;
  
  
  Design Patterns, Declarative Thinking, and Solving the General Case
&lt;/h2&gt;

&lt;p&gt;Before closing, there is one more point worth addressing, because it tends to resurface in every conversation about abstraction: the role of design patterns. Patterns are often misunderstood in practice. For some teams they become a dictionary of shapes to copy. For others they become a superstition, something to be avoided because they "feel enterprise." In reality, design patterns are nothing more than reusable expressions of relationships that occur frequently in software. They only become harmful when applied without context.&lt;/p&gt;

&lt;p&gt;Used well, patterns are a form of declarative modeling. They describe how things relate, not how many classes must be introduced to satisfy a template. This distinction is one reason I created &lt;a href="https://github.com/jerrettdavis/patternkit" rel="noopener noreferrer"&gt;PatternKit&lt;/a&gt;, which contains fluent implementations of every GoF pattern. The aim is not to celebrate patterns for their own sake, but to show that they can be expressed clearly and idiomatically, without the ceremony that has accumulated around them over the past few decades. A fluent strategy or builder is readable because it conveys meaning, not because it adheres to a UML diagram. A properly shaped composite or decorator is useful because it matches the problem at hand, not because the catalog says "now is the time."&lt;/p&gt;
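&lt;p&gt;To illustrate what "pattern as declaration" can look like, here is a small fluent strategy. To be clear, this is not PatternKit's actual API; it is a self-contained sketch in the same spirit, with invented names throughout.&lt;/p&gt;

```csharp
using System;
using System.Collections.Generic;

// A fluent strategy: the call site reads like the rule it encodes,
// rather than like a class hierarchy.
public sealed class Strategy<TIn, TOut>
{
    private readonly List<(Func<TIn, bool> When, Func<TIn, TOut> Then)> _cases = new();
    private Func<TIn, TOut>? _fallback;

    public Strategy<TIn, TOut> When(Func<TIn, bool> predicate, Func<TIn, TOut> handler)
    {
        _cases.Add((predicate, handler));
        return this;
    }

    public Strategy<TIn, TOut> Default(Func<TIn, TOut> handler)
    {
        _fallback = handler;
        return this;
    }

    public TOut Execute(TIn input)
    {
        // First matching case wins; the fallback catches everything else.
        foreach (var (when, then) in _cases)
            if (when(input))
                return then(input);

        return _fallback is not null
            ? _fallback(input)
            : throw new InvalidOperationException("No case matched and no default was provided.");
    }
}
```

&lt;p&gt;A caller builds the rule declaratively, for example &lt;code&gt;new Strategy&amp;lt;int, string&amp;gt;().When(n =&amp;gt; n &amp;lt; 0, _ =&amp;gt; "negative").Default(_ =&amp;gt; "non-negative")&lt;/code&gt;, and the relationship between cases is visible at a glance instead of being spread across subclasses.&lt;/p&gt;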

&lt;p&gt;Patterns at their best are accelerants for thought. They give structure to complex behavior. They reveal seams in the domain. They help us express intent without prescribing a particular class arrangement. When applied declaratively, patterns become lightweight tools that reinforce clarity rather than obstacles that obscure it.&lt;/p&gt;

&lt;p&gt;This is the same principle that guides good abstractions. We should always aim to solve the general case when appropriate, rather than re-solving the same narrow problem in twenty different places. Shared operations belong in helpers or extensions, not because we want fewer lines of code, but because meaning belongs in one place rather than scattered across many. Wrapping behavior in a well-designed abstraction is not indulgence; it is about shaping the domain so the rest of the system can grow without friction. Once the domain is sufficiently modeled, higher-order helpers, pipelines, or policy objects can provide a unified vocabulary for orchestrating that domain. These are the moments when patterns shine: when they articulate a common structure behind several similar problems and offer a clean way to express the variation between them.&lt;/p&gt;

&lt;p&gt;Patterns should not be forced; they should emerge. If you find yourself retrofitting the domain to suit a pattern, the pattern is wrong. If a pattern clarifies the domain, it is the right one. When in doubt, your behaviors and your domain model will tell you the truth.&lt;/p&gt;

&lt;h2&gt;
  
  
  Closing Thoughts
&lt;/h2&gt;

&lt;p&gt;At every scale, from a one-line helper to a distributed system with dozens of services, the same principles hold. Begin with behavior. Shape abstractions around the real problems rather than your implementation preferences. Allow patterns to emerge naturally when they clarify meaning. Lean on automation and declarative structures to eliminate noise. Let stability arise from good boundaries rather than rigid frameworks. And above all, keep re-examining the domain as it evolves. Systems live for years; code is rewritten many times. The only reliable compass is the domain itself.&lt;/p&gt;

&lt;p&gt;You do not hate abstractions. What you hate are the wrong ones: misplaced layers, premature hierarchies, needless ceremony, half-understood patterns applied because someone did not want to think. But abstractions that arise from behavior and domain, shaped carefully and used intentionally, are not burdens. They are the tools that let us build systems that last.&lt;/p&gt;

&lt;p&gt;And if you stay anchored to this style of approach, you won’t end up navigating a small cavern system of strategies and factories and wondering how a middle initial turned into a guided tour of every pattern someone ever took a little too seriously. You won’t need to fear abstraction at all. Instead, you will wield it where it belongs.&lt;/p&gt;

</description>
      <category>architecture</category>
      <category>programming</category>
      <category>design</category>
      <category>development</category>
    </item>
    <item>
      <title>BDD: Make the Business Write Your Tests</title>
      <dc:creator>Jerrett Davis</dc:creator>
      <pubDate>Mon, 01 Sep 2025 22:48:53 +0000</pubDate>
      <link>https://dev.to/jerrettdavis/bdd-make-the-business-write-your-tests-27c9</link>
      <guid>https://dev.to/jerrettdavis/bdd-make-the-business-write-your-tests-27c9</guid>
      <description>&lt;p&gt;What if the business wrote the tests, and developers just made them pass?  &lt;/p&gt;

&lt;p&gt;That's the promise of behavior-driven development (BDD). Instead of developers guessing at requirements, chasing &lt;br&gt;
Slack threads, and interpreting vague Jira tickets, we let the people who know the &lt;em&gt;why&lt;/em&gt; express it in a form &lt;br&gt;
that's precise enough to execute.  &lt;/p&gt;

&lt;p&gt;Throughout my career, I've worked with teams of all shapes and sizes, and one pattern is universal:&lt;br&gt;
nobody &lt;em&gt;loves&lt;/em&gt; writing tests. Most developers grudgingly agree they're important, but tests are often seen &lt;br&gt;
as a tax—time spent writing code that doesn't "ship features."  &lt;/p&gt;

&lt;p&gt;The result is predictable: coverage gaps, brittle suites, and requirements that live in Confluence but &lt;br&gt;
never make it into code. Testing becomes a burden instead of a superpower.&lt;/p&gt;

&lt;p&gt;BDD flips that dynamic. Instead of treating tests as a chore, it turns them into a shared language between &lt;br&gt;
developers, QA, and the business. Suddenly, everyone is speaking the same language about features and &lt;br&gt;
scenarios, and tests become the living documentation of what the system is supposed to do.  &lt;/p&gt;

&lt;p&gt;Before we dive into how this works, let's establish a few "Laws of Testing." These aren't divine truths, &lt;br&gt;
but they're a solid starting point—guidelines that help ensure tests actually drive design instead of &lt;br&gt;
just describing what already exists.&lt;/p&gt;
&lt;h3&gt;
  
  
  The Laws of Testing
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;No application code will be written until tests are defined.
&lt;/li&gt;
&lt;li&gt;Systems should be tested as they are meant to be used.
&lt;/li&gt;
&lt;li&gt;Automate tests to the maximum extent technically and financially feasible.
&lt;/li&gt;
&lt;li&gt;Every business requirement must have a corresponding test. If the requirement changes, the test must change.
&lt;/li&gt;
&lt;li&gt;Any code not covered by realistic, automated tests should be treated as magic—and magic should not be trusted.
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Notice that I'm not prescribing a specific tool, framework, or level of granularity. These laws simply give you &lt;br&gt;
a pragmatic foundation to make testing a first-class part of building software, not an afterthought.  &lt;/p&gt;
&lt;h2&gt;
  
  
  Behavior-Driven Development
&lt;/h2&gt;

&lt;p&gt;Now for the fun part: putting this into practice.  &lt;/p&gt;

&lt;p&gt;Behavior-Driven Development (&lt;em&gt;BDD&lt;/em&gt;) is designed to bridge the gap between business stakeholders, users, developers, &lt;br&gt;
QA, and everyone else involved in a project. There are many flavors of BDD, but at their core, they all focus on telling &lt;br&gt;
cohesive, testable stories using a shared Domain-Specific Language (DSL).  &lt;/p&gt;

&lt;p&gt;There's a long-standing joke in IT—still true even in the post-ChatGPT era—that it's easier to teach a subject-matter &lt;br&gt;
expert to code than it is to get them to clearly articulate what they actually want built.  &lt;/p&gt;

&lt;p&gt;Over the years, countless tools have promised to help non-technical folks bring their ideas to life, but theory and &lt;br&gt;
practice rarely align. &lt;em&gt;Everything works in theory.&lt;/em&gt; We can keep building better tools to bridge the gap between &lt;br&gt;
decision-makers and developers, but no tool can prevent the inevitable: &lt;strong&gt;requirements change&lt;/strong&gt;.  &lt;/p&gt;

&lt;p&gt;Business needs evolve constantly. Even in a perfectly stable market, a company's technology stack would still have to &lt;br&gt;
adapt—whether to security updates, API deprecations, or new compliance requirements. That's why business and technology &lt;br&gt;
teams need a living contract: a shared, human-readable specification of what's supposed to exist, expressed in language &lt;br&gt;
that everyone can understand.  &lt;/p&gt;

&lt;p&gt;BDD provides exactly that. It takes the "what" from the business and turns it into executable specifications for developers, &lt;br&gt;
ensuring the system stays aligned with today's needs—not just the assumptions written down last quarter.&lt;/p&gt;
&lt;h2&gt;
  
  
  What BDD Actually Is
&lt;/h2&gt;

&lt;p&gt;Behavior-Driven Development (BDD) is essentially turning your acceptance criteria into executable code.  &lt;/p&gt;

&lt;p&gt;If your company already writes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Acceptance Criteria in Jira or Azure DevOps&lt;/li&gt;
&lt;li&gt;Business Requirement Documents (BRDs) in Word or Confluence&lt;/li&gt;
&lt;li&gt;Test Scripts for QA teams to follow manually&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;…you're already halfway there. BDD simply takes those artifacts, expresses them in a precise, human-readable format, &lt;br&gt;
and wires them into your test suite so they can be run automatically.  &lt;/p&gt;

&lt;p&gt;Instead of existing only as text that humans must interpret, your requirements become living specifications that &lt;br&gt;
are always in sync with what the system actually does. If the system drifts, the tests fail — letting you catch gaps &lt;br&gt;
before production users do.&lt;/p&gt;

&lt;p&gt;Key ideas:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Ubiquitous language&lt;/strong&gt;: agree on domain terms and reuse them in requirements, tests, and code.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Outside-in&lt;/strong&gt;: start from observable outcomes (what users and the business care about) and let those drive your implementation.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Executable documentation&lt;/strong&gt;: your feature files become a source of truth that stays up to date because failing tests 
force the team to update them when the business changes its mind.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Why It Clicks with the Business
&lt;/h2&gt;

&lt;p&gt;BDD speaks the same language the business already uses.  &lt;/p&gt;

&lt;p&gt;Instead of:  &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"Verify that gold members receive free shipping on orders over $10." &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;buried in a PDF or Confluence page, we can write:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight gherkin"&gt;&lt;code&gt;&lt;span class="kn"&gt;Scenario&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; Gold member gets free shipping
  &lt;span class="nf"&gt;Given &lt;/span&gt;the customer is a &lt;span class="s"&gt;"gold"&lt;/span&gt; member
  &lt;span class="nf"&gt;And &lt;/span&gt;they have a cart totaling $12.00
  &lt;span class="nf"&gt;When &lt;/span&gt;they checkout with standard shipping
  &lt;span class="nf"&gt;Then &lt;/span&gt;the shipping cost is $0.00
  &lt;span class="nf"&gt;And &lt;/span&gt;the order total is $12.00
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Stakeholders can read and confirm this just like they'd review an acceptance test — but now this doubles as a&lt;br&gt;
machine-executable test that can be run in CI.&lt;/p&gt;
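&lt;p&gt;To show there is nothing magical between the scenario and runnable code, here is a minimal Python sketch of the same scenario wired to plain test code. The &lt;code&gt;Cart&lt;/code&gt; class, the flat standard rate, and the $10 threshold are assumptions invented for this illustration, not part of any real codebase.&lt;/p&gt;

```python
# Illustrative mapping of the Gherkin scenario to executable code.
# The Cart model, the $4.99 standard rate, and the $10 free-shipping
# threshold are assumptions for this sketch.

class Cart:
    def __init__(self, membership, subtotal):
        self.membership = membership
        self.subtotal = subtotal
        self.shipping = None

    def checkout(self, shipping_method="standard"):
        # Business rule under test: gold members ship free on orders over $10.
        if self.membership == "gold" and self.subtotal > 10.00:
            self.shipping = 0.00
        else:
            self.shipping = 4.99  # assumed flat standard rate
        return self.subtotal + self.shipping


def test_gold_member_gets_free_shipping():
    # Given the customer is a "gold" member with a cart totaling $12.00
    cart = Cart(membership="gold", subtotal=12.00)
    # When they checkout with standard shipping
    total = cart.checkout("standard")
    # Then the shipping cost is $0.00 and the order total is $12.00
    assert cart.shipping == 0.00
    assert total == 12.00


test_gold_member_gets_free_shipping()
```

&lt;p&gt;Frameworks such as Cucumber or SpecFlow/Reqnroll generate this mapping from the feature file for you; the sketch just makes the correspondence between Given/When/Then and code visible.&lt;/p&gt;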
&lt;h2&gt;
  
  
  How to Start Without Overhauling Everything
&lt;/h2&gt;

&lt;p&gt;You don't need to replace all your documentation overnight. Try this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Take an existing acceptance criterion for a small feature.&lt;/li&gt;
&lt;li&gt;Rewrite it as a Gherkin scenario.&lt;/li&gt;
&lt;li&gt;Wire up just enough step definitions to make it run.&lt;/li&gt;
&lt;li&gt;Let it fail (red).&lt;/li&gt;
&lt;li&gt;Write or adjust code until it passes (green).&lt;/li&gt;
&lt;li&gt;Refactor code &lt;em&gt;and&lt;/em&gt; scenario until both are clear and maintainable.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;That's it — you've just done outside-in development. Over time, you can replace more of your static BRDs and&lt;br&gt;
test scripts with living specs that run automatically and stay current.&lt;/p&gt;



&lt;p&gt;By framing BDD as an evolution of what teams already do, it becomes less intimidating: you're not adding "one more thing,"&lt;br&gt;
you're making what you already write executable, consistent, and always up to date.&lt;/p&gt;
&lt;h2&gt;
  
  
  Starting from Existing Code (The Developer + QA Perspective)
&lt;/h2&gt;

&lt;p&gt;Of course, not every team has pristine requirements or well-maintained acceptance criteria.&lt;br&gt;
Sometimes the &lt;em&gt;only&lt;/em&gt; source of truth is the code itself, or a set of outdated manual test scripts in a shared folder.&lt;br&gt;
That's okay — you can still apply BDD principles to what you already have.  &lt;/p&gt;
&lt;h3&gt;
  
  
  Step 1: Surface What the Code Already Does
&lt;/h3&gt;

&lt;p&gt;Start by exploring the system from the outside in, not by reading the code first.  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Walk through the application like a user would: click buttons, submit forms, run API calls.
&lt;/li&gt;
&lt;li&gt;Write down what you observe in plain language, almost as if you were writing a support guide.
&lt;/li&gt;
&lt;li&gt;Identify key flows: logins, purchases, data exports, admin tasks, error handling.
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You're essentially reverse-engineering the behavior that already exists. The goal isn't to perfectly model the internals, but to describe what happens when X occurs.&lt;/p&gt;
&lt;h3&gt;
  
  
  Step 2: Capture Behavior as Scenarios
&lt;/h3&gt;

&lt;p&gt;Turn those observations into scenarios, even if they're manual at first.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight gherkin"&gt;&lt;code&gt;&lt;span class="kn"&gt;Scenario&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; User can reset password
  &lt;span class="nf"&gt;Given &lt;/span&gt;I am on the login page
  &lt;span class="nf"&gt;When &lt;/span&gt;I click &lt;span class="s"&gt;"Forgot password"&lt;/span&gt;
  &lt;span class="nf"&gt;And &lt;/span&gt;I submit my email address
  &lt;span class="nf"&gt;Then &lt;/span&gt;I receive a password reset link by email
  &lt;span class="nf"&gt;And &lt;/span&gt;I can set a new password
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;This becomes your backfill documentation and a checklist you can use to validate future changes.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Choose a Testing Library
&lt;/h3&gt;

&lt;p&gt;Once you have a handful of scenarios, pick a suitable tool that matches your tech stack.&lt;br&gt;
It doesn't have to be fancy — choose something that can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Parse Gherkin or similar DSL (&lt;a href="https://cucumber.io/docs/installation/" rel="noopener noreferrer"&gt;Cucumber&lt;/a&gt;, SpecFlow/&lt;a href="https://reqnroll.net/" rel="noopener noreferrer"&gt;Reqnroll&lt;/a&gt;, &lt;a href="https://behave.readthedocs.io/en/latest/" rel="noopener noreferrer"&gt;Behave&lt;/a&gt;, &lt;a href="https://github.com/JerrettDavis/TinyBDD" rel="noopener noreferrer"&gt;TinyBDD&lt;/a&gt; 😉, etc.)&lt;/li&gt;
&lt;li&gt;Run in your CI/CD pipeline&lt;/li&gt;
&lt;li&gt;Integrate with the type of app you have (UI automation, API tests, service-level tests)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Don't over-engineer at this stage. The goal is simply to make a scenario run end-to-end.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: Build Thin Step Definitions
&lt;/h3&gt;

&lt;p&gt;When you wire up steps to code, keep the step definitions thin and reusable:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Treat them like glue code — they orchestrate, not implement.&lt;/li&gt;
&lt;li&gt;Push logic into abstractions (page objects, API clients, domain helpers).&lt;/li&gt;
&lt;li&gt;Keep language aligned with the business terms you captured earlier.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This separation makes your steps easy to read and your automation maintainable even as your app changes.&lt;/p&gt;
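&lt;p&gt;As a sketch of that separation, here is a hand-rolled, toy step registry in Python. It stands in for the wiring that real frameworks (Cucumber, Behave, Reqnroll) provide; the decorator, the &lt;code&gt;LoginPage&lt;/code&gt; helper, and the email address are all invented for this illustration.&lt;/p&gt;

```python
# Illustrative only: a toy step registry showing "thin glue" step definitions
# that delegate to a domain helper. Real BDD frameworks provide this wiring.

STEPS = {}

def step(pattern):
    """Register a function as the implementation of a Gherkin step."""
    def register(fn):
        STEPS[pattern] = fn
        return fn
    return register

# Domain helper: the real logic lives here, not in the step definitions.
class LoginPage:
    def __init__(self):
        self.reset_requested_for = None

    def request_password_reset(self, email):
        self.reset_requested_for = email

page = LoginPage()

# Thin step definitions: they orchestrate, they do not implement.
@step("I submit my email address")
def submit_email():
    page.request_password_reset("user@example.com")

@step("I receive a password reset link by email")
def assert_reset_sent():
    assert page.reset_requested_for == "user@example.com"

# Executing a scenario is just running its registered steps in order.
for text in ["I submit my email address",
             "I receive a password reset link by email"]:
    STEPS[text]()
```

&lt;p&gt;Because the steps only call into the &lt;code&gt;LoginPage&lt;/code&gt; helper, a UI or API change touches the helper once instead of every scenario that mentions password resets.&lt;/p&gt;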

&lt;h3&gt;
  
  
  Step 5: Automate the Most Valuable Flows First
&lt;/h3&gt;

&lt;p&gt;Don't try to automate everything on day one. Pick a few high-value, low-volatility flows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Happy-path checkouts&lt;/li&gt;
&lt;li&gt;Core login and authentication&lt;/li&gt;
&lt;li&gt;Critical reporting or data pipelines&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Start small, get them running reliably in CI, and expand gradually. The point is not to hit 100% coverage overnight, but to start to gain comfort and momentum in writing BDD tests end-to-end.&lt;/p&gt;

&lt;p&gt;By working backward from the app's behavior, you create a bridge between what the code does and what the business expects, &lt;br&gt;
even when no documentation exists. Over time, your automated scenarios become the new source of truth, letting developers and QA refactor or ship new features with confidence.&lt;/p&gt;

&lt;h2&gt;
  
  
  Bringing It All Together
&lt;/h2&gt;

&lt;p&gt;Whether you're starting from crisp acceptance criteria or working backwards from a codebase that only lives in developers' heads, &lt;br&gt;
the goal of BDD is the same: &lt;strong&gt;close the gap between what the business needs and what the code does&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;You don't have to overhaul your entire testing strategy in a single sprint. You just have to start:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Write one scenario.&lt;/li&gt;
&lt;li&gt;Get it running end-to-end.&lt;/li&gt;
&lt;li&gt;Share it with your team.&lt;/li&gt;
&lt;li&gt;Keep going.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Over time, you'll build up a living specification that grows with the system, catching regressions early and making onboarding new developers dramatically easier.  &lt;/p&gt;

&lt;h3&gt;
  
  
  What a Mature BDD Practice Looks Like
&lt;/h3&gt;

&lt;p&gt;A well-adopted BDD process creates a feedback loop that keeps everyone aligned:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Business &amp;amp; product teams&lt;/strong&gt; write or review scenarios as part of refinement.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Developers&lt;/strong&gt; implement features by making those scenarios pass.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;QA&lt;/strong&gt; contributes new scenarios for edge cases and validates existing ones stay green.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CI/CD pipelines&lt;/strong&gt; run the whole suite automatically, so everyone knows the current state of the system at a glance.
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When this loop is healthy, you get a shared understanding of what "done" really means, and confidence that your software still works tomorrow, next quarter, and next year.&lt;/p&gt;

&lt;h3&gt;
  
  
  Final Thoughts
&lt;/h3&gt;

&lt;p&gt;BDD isn't a silver bullet, but it &lt;em&gt;is&lt;/em&gt; a forcing function for clearer requirements, more reliable software, and tighter collaboration across teams.  &lt;/p&gt;

&lt;p&gt;If you've ever wished the business could "just write the tests," BDD is the closest thing we have to that dream. Start small, &lt;br&gt;
stay consistent, and watch as those scenarios turn into a living, breathing specification that guides your development for years to come.&lt;/p&gt;

</description>
      <category>programming</category>
      <category>testing</category>
      <category>coding</category>
      <category>devops</category>
    </item>
    <item>
      <title>What is "Code"?</title>
      <dc:creator>Jerrett Davis</dc:creator>
      <pubDate>Mon, 01 Sep 2025 22:35:00 +0000</pubDate>
      <link>https://dev.to/jerrettdavis/what-is-code-1o4o</link>
      <guid>https://dev.to/jerrettdavis/what-is-code-1o4o</guid>
      <description>&lt;p&gt;Technology is a wonderful thing. Humans, and perhaps even our direct ancestors, have been employing it for millennia. From stone tools to quantum computers, humans can't seem to resist tinkering with whatever the universe gives them.&lt;/p&gt;

&lt;p&gt;And with even our most remote, uncontacted relatives still in possession of weapons and shelter, it's safe to say every person on this planet experiences technology from birth to their final days. Right now, as you read this, you're surrounded by, and literally in contact with, an almost uncountable web of human inventions.&lt;/p&gt;

&lt;p&gt;Technology shapes our lives, and in turn, we shape its future. It can be used for feats of great creation or acts of horrific destruction. Among all its branches, one of the most powerful and most accessible is &lt;strong&gt;code&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;You don't need a factory, a lab, or a workshop to work with code. All it takes is a computer and a spark of curiosity. If you've ever stacked LEGO bricks into something from your imagination, built a spreadsheet formula to save hours of work, set a coffee maker to brew before you wake, or fixed something in a pinch with whatever was at hand, you've tapped into the same problem-solving instinct that drives programming.&lt;/p&gt;

&lt;p&gt;At its heart, coding is the act of taking what you have and arranging it into something that does what you want. It's imagination with rules, creativity with feedback. And once you've seen something you've built come alive, whether it's a blinking LED, a tool that saves someone's day, or an app that makes a stranger smile, it's hard not to want to build more.&lt;/p&gt;




&lt;h2&gt;
  
  
  What is "Code"?
&lt;/h2&gt;

&lt;p&gt;Code comes in many flavors. There are programming languages like C#, Assembly, C++, Rust, Python, and countless others, each with its own syntax, style, and purpose. There's &lt;strong&gt;G-code&lt;/strong&gt;, the set of instructions understood by CNC mills, 3D printers, lasers, and other computer-controlled machines. Even music notation is a kind of code, a written sequence of symbols that, when interpreted, produces something meaningful.&lt;/p&gt;

&lt;p&gt;The “code” most familiar to all of us is &lt;strong&gt;human language&lt;/strong&gt;. Spoken or written, it can form relationships, transfer knowledge, and inspire change. But language is inherently ambiguous, which makes it ill-suited for telling computers exactly what to do. That's why programming languages exist: precise dialects designed for humans to read and for machines to execute without hesitation or confusion.&lt;/p&gt;

&lt;p&gt;Put simply:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Code is a system of signals, symbols, letters, words, or other constructs used to convey a message.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;At the very bottom of every modern computer lies &lt;strong&gt;binary&lt;/strong&gt; — two values, usually 1 and 0, representing “on” and “off” or “true” and “false.” Working directly in binary is tedious, so we created layers of abstraction to make our lives easier.&lt;/p&gt;

&lt;p&gt;A step above binary is &lt;strong&gt;assembly language&lt;/strong&gt;. Assembly comes in many dialects, each tied to the instruction set of a specific processor. It's still “close to the metal” but far more readable than raw ones and zeros.&lt;/p&gt;
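&lt;p&gt;A small Python illustration of that layering: the same number viewed as raw binary and as the higher-level notation we actually work in.&lt;/p&gt;

```python
# The same value seen through successive layers of abstraction.
n = 0b101010                   # a binary literal: what the machine ultimately stores
assert n == 42                 # the decimal notation humans prefer
assert bin(42) == "0b101010"   # translating back down to binary
assert int("101010", 2) == 42  # parsing binary text back up into a number
```

&lt;p&gt;Every layer above this, from assembly to Python itself, exists so that we rarely have to think in those raw ones and zeros.&lt;/p&gt;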




&lt;h3&gt;
  
  
  A (Somewhat Accurate and Still Overly Dramatic) History of Programming Languages
&lt;/h3&gt;

&lt;p&gt;The history of programming languages is a tapestry of invention, frustration, and the occasional all-nighter. Imagine it told as a fantasy epic.&lt;/p&gt;

&lt;p&gt;First came &lt;strong&gt;Binary&lt;/strong&gt;, and from Binary were forged all other tongues of the machine. But the work was slow, and the scribes of silicon longed for an easier way to command their creations.&lt;/p&gt;

&lt;p&gt;In the early 1950s, pioneers began to shape new languages: &lt;strong&gt;Regional Assembly Language&lt;/strong&gt; (1951), &lt;strong&gt;Autocode&lt;/strong&gt; (1952), and &lt;strong&gt;IPL&lt;/strong&gt; (1954), the forerunner to LISP. Grace Hopper's &lt;strong&gt;FLOW-MATIC&lt;/strong&gt; (1955) would pave the way to &lt;strong&gt;COBOL&lt;/strong&gt; (1959), while &lt;strong&gt;FORTRAN&lt;/strong&gt; (1957) brought the first widely adopted optimizing compiler to life. The late 1950s and 1960s saw the birth of &lt;strong&gt;LISP&lt;/strong&gt; (1958), &lt;strong&gt;ALGOL&lt;/strong&gt; (1958, 1960), &lt;strong&gt;BASIC&lt;/strong&gt; (1964), &lt;strong&gt;PL/I&lt;/strong&gt; (1964), &lt;strong&gt;BCPL&lt;/strong&gt; (1967), and &lt;strong&gt;B&lt;/strong&gt; (1969), leading to the mighty &lt;strong&gt;C&lt;/strong&gt; (1972).&lt;/p&gt;

&lt;p&gt;From the 1970s onward, the floodgates opened: &lt;strong&gt;Pascal&lt;/strong&gt; (1970), &lt;strong&gt;Smalltalk&lt;/strong&gt; (1972), &lt;strong&gt;Prolog&lt;/strong&gt; (1972), &lt;strong&gt;SQL&lt;/strong&gt; (1978), &lt;strong&gt;C++&lt;/strong&gt; (1983), &lt;strong&gt;Ada&lt;/strong&gt; (1983), &lt;strong&gt;Perl&lt;/strong&gt; (1987), &lt;strong&gt;Python&lt;/strong&gt; (1991), &lt;strong&gt;Java&lt;/strong&gt; (1995), &lt;strong&gt;JavaScript&lt;/strong&gt; (1995), &lt;strong&gt;Ruby&lt;/strong&gt; (1995), &lt;strong&gt;PHP&lt;/strong&gt; (1995), and many more.&lt;/p&gt;

&lt;p&gt;Each era brought its own heroes, philosophies, and quirks. Some languages were swift and elegant, others sprawling and stubborn. But all of them, in their way, expanded what was possible.&lt;/p&gt;

&lt;p&gt;Today, we live in an age of abundance. We have languages, frameworks, and tools for nearly every conceivable task. And if the exact tool you need doesn't exist, the beauty of code is that you can create it yourself.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Code Matters
&lt;/h2&gt;

&lt;p&gt;Code is power, not in the dystopian sense, but in the sense that it lets one person shape behavior, automate work, and solve problems in ways that scale far beyond their own hands.&lt;/p&gt;

&lt;p&gt;With code, you can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Automate boring tasks&lt;/li&gt;
&lt;li&gt;Create tools that help people&lt;/li&gt;
&lt;li&gt;Control machines in the physical world&lt;/li&gt;
&lt;li&gt;Build art, games, and interactive experiences&lt;/li&gt;
&lt;li&gt;Invent entirely new kinds of technology&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You don't need to be a professional programmer to benefit from learning it. Even small bits of code can make life easier, save time, or open the door to entirely new possibilities.&lt;/p&gt;




&lt;h2&gt;
  
  
  What's Next?
&lt;/h2&gt;

&lt;p&gt;This post answered “What is code?” in broad strokes: its forms, history, and why it matters. In the next part of this series, we'll start laying the &lt;strong&gt;foundations of programming&lt;/strong&gt;:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Variables &amp;amp; Data Types&lt;/strong&gt; — the labeled containers of programming.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Operators&lt;/strong&gt; — the tools for working with your data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Logic &amp;amp; Flow&lt;/strong&gt; — the recipes that decide what happens when.&lt;/li&gt;
&lt;/ol&gt;
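&lt;p&gt;As a tiny taste of all three working together (the coffee example is invented for this preview):&lt;/p&gt;

```python
# A tiny preview: a variable, an operator, and flow control working together.
cups_of_coffee = 3                   # a variable holding a number (data)
refill_needed = cups_of_coffee < 1   # an operator producing a true/false answer

if refill_needed:                    # logic & flow: choosing what happens next
    print("Brew another pot!")
else:
    print("You're good for now.")    # prints this branch, since we have 3 cups
```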

&lt;p&gt;By the time we're done, you won't just &lt;em&gt;know&lt;/em&gt; what code is; you'll be able to write it, read it, and use it to build something that matters to you.&lt;/p&gt;

</description>
      <category>programming</category>
      <category>coding</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Automating Nightly Local Database Refreshes from Azure Blob Storage with Docker</title>
      <dc:creator>Jerrett Davis</dc:creator>
      <pubDate>Thu, 29 Feb 2024 16:54:28 +0000</pubDate>
      <link>https://dev.to/jerrettdavis/automating-nightly-local-database-refreshes-from-azure-blob-storage-with-docker-3api</link>
      <guid>https://dev.to/jerrettdavis/automating-nightly-local-database-refreshes-from-azure-blob-storage-with-docker-3api</guid>
      <description>&lt;ul&gt;
&lt;li&gt;Background&lt;/li&gt;
&lt;li&gt;Prerequisites&lt;/li&gt;
&lt;li&gt;
Setting up the Docker Container

&lt;ul&gt;
&lt;li&gt;Base Dockerfile&lt;/li&gt;
&lt;li&gt;Install Azure CLI&lt;/li&gt;
&lt;li&gt;Install SqlPackage&lt;/li&gt;
&lt;li&gt;Entrypoint&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

The Scripts

&lt;ul&gt;
&lt;li&gt;entrypoint.sh&lt;/li&gt;
&lt;li&gt;initialize-database-and-jobs.sh&lt;/li&gt;
&lt;li&gt;download-latest.sh&lt;/li&gt;
&lt;li&gt;enable-authentication.sql&lt;/li&gt;
&lt;li&gt;reimport-database.sh&lt;/li&gt;
&lt;li&gt;kill-all-connections.sql&lt;/li&gt;
&lt;li&gt;Final Directory Structure and Dockerfile Changes&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Building the Docker Container&lt;/li&gt;

&lt;li&gt;Testing the Container&lt;/li&gt;

&lt;li&gt;Conclusion&lt;/li&gt;

&lt;/ul&gt;

&lt;h1&gt;
  
  
  Background
&lt;/h1&gt;

&lt;p&gt;In cloud-hosted applications, it is common to restrict access to production databases. This is a good practice, but it can&lt;br&gt;
make it difficult for various teams to access the data they need for development, testing, reporting, and data analysis.&lt;br&gt;
One way to solve this problem is to create a daily process that copies the production database to a location that is accessible&lt;br&gt;
to the teams that need it. In this article, we will create a Docker container that downloads the latest backup of a&lt;br&gt;
production database from Azure Blob Storage and restores it to a local SQL Server instance.&lt;/p&gt;
&lt;h1&gt;
  
  
  Prerequisites
&lt;/h1&gt;

&lt;p&gt;This article assumes that you have already set up an automation to back up your production database to Azure Blob Storage.&lt;br&gt;
If you have not done this, you can follow the instructions in this article: &lt;a href="https://techcommunity.microsoft.com/t5/azure-database-support-blog/automate-exporting-of-azure-sql-database-as-bacpac-to-blog/ba-p/2409213" rel="noopener noreferrer"&gt;Automating Nightly Database Backups to Azure Blob Storage&lt;/a&gt;.&lt;br&gt;
The script in the article is reaching end of life, and an updated version can be found &lt;a href="https://github.com/josegomera/AzureAutomation/tree/master/scripts/sqlDatabase" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Docker must be installed and configured to use Linux containers on your local machine. If you do not&lt;br&gt;
have Docker installed, you can download it from &lt;a href="https://www.docker.com/products/docker-desktop" rel="noopener noreferrer"&gt;Docker's website&lt;/a&gt;.&lt;/p&gt;
&lt;h1&gt;
  
  
  Setting up the Docker Container
&lt;/h1&gt;

&lt;p&gt;The first step is to create a Dockerfile that will be used to build the container. The Dockerfile will contain the instructions&lt;br&gt;
on how to configure the container and what commands to run when the container is started. For our environment, we will&lt;br&gt;
require Microsoft SQL Server to be installed in the container. We will use the official Microsoft SQL Server Docker image as a base.&lt;/p&gt;
&lt;h2&gt;
  
  
  Base Dockerfile
&lt;/h2&gt;

&lt;p&gt;Create a new directory on your local machine and create a file called &lt;code&gt;Dockerfile&lt;/code&gt; in the directory. Add the following content to the file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="k"&gt;FROM&lt;/span&gt;&lt;span class="s"&gt; mcr.microsoft.com/mssql/server:2022-CU11-ubuntu-22.04&lt;/span&gt;

&lt;span class="c"&gt;# Set environment variables for the container&lt;/span&gt;
&lt;span class="k"&gt;ARG&lt;/span&gt;&lt;span class="s"&gt; ACCOUNT_NAME&lt;/span&gt;
&lt;span class="k"&gt;ENV&lt;/span&gt;&lt;span class="s"&gt; ACCOUNT_NAME=$ACCOUNT_NAME&lt;/span&gt;
&lt;span class="k"&gt;ARG&lt;/span&gt;&lt;span class="s"&gt; ACCOUNT_KEY&lt;/span&gt;
&lt;span class="k"&gt;ENV&lt;/span&gt;&lt;span class="s"&gt; ACCOUNT_KEY=$ACCOUNT_KEY&lt;/span&gt;
&lt;span class="k"&gt;ARG&lt;/span&gt;&lt;span class="s"&gt; CONTAINER_NAME&lt;/span&gt;
&lt;span class="k"&gt;ENV&lt;/span&gt;&lt;span class="s"&gt; CONTAINER_NAME=$CONTAINER_NAME&lt;/span&gt;

&lt;span class="k"&gt;ARG&lt;/span&gt;&lt;span class="s"&gt; CRON_SCHEDULE="0 4 * * *"&lt;/span&gt;
&lt;span class="k"&gt;ENV&lt;/span&gt;&lt;span class="s"&gt; CRON_SCHEDULE=$CRON_SCHEDULE&lt;/span&gt;

&lt;span class="k"&gt;ARG&lt;/span&gt;&lt;span class="s"&gt; DATABASE_NAME=MyDatabase&lt;/span&gt;
&lt;span class="k"&gt;ENV&lt;/span&gt;&lt;span class="s"&gt; DATABASE_NAME=$DATABASE_NAME&lt;/span&gt;
&lt;span class="k"&gt;ENV&lt;/span&gt;&lt;span class="s"&gt; MSSQL_SA_PASSWORD=yourStrong(!)Password&lt;/span&gt;
&lt;span class="k"&gt;ENV&lt;/span&gt;&lt;span class="s"&gt; ACCEPT_EULA=Y&lt;/span&gt;
&lt;span class="k"&gt;ENV&lt;/span&gt;&lt;span class="s"&gt; MSSQL_PID=Developer&lt;/span&gt;

&lt;span class="c"&gt;# Create a working directory for our tools and scripts and copy all the files from the host machine to the container&lt;/span&gt;
&lt;span class="k"&gt;COPY&lt;/span&gt;&lt;span class="s"&gt; . /sql&lt;/span&gt;
&lt;span class="k"&gt;WORKDIR&lt;/span&gt;&lt;span class="s"&gt; /sql&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
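&lt;p&gt;As a preview of how these build arguments get supplied later, the image can be built with &lt;code&gt;--build-arg&lt;/code&gt; flags. The account name, key variable, container name, and image tag below are placeholders, not values from a real environment; arguments with defaults in the Dockerfile (&lt;code&gt;CRON_SCHEDULE&lt;/code&gt;, &lt;code&gt;DATABASE_NAME&lt;/code&gt;) may be omitted.&lt;/p&gt;

```shell
# Build the image, passing the Azure Storage details declared as ARGs above.
# All values shown are placeholders for your own account, key, and container.
docker build \
  --build-arg ACCOUNT_NAME=mystorageaccount \
  --build-arg ACCOUNT_KEY="$STORAGE_ACCOUNT_KEY" \
  --build-arg CONTAINER_NAME=database-backups \
  -t local-db-refresh .
```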



&lt;h2&gt;
  
  
  Install Azure CLI
&lt;/h2&gt;

&lt;p&gt;To find and download the latest backup of the production database, we will need to install the Azure CLI in the container.&lt;br&gt;
We'll also need wget, cron, unzip, and a few other utilities to help us automate the process. Microsoft does offer a script&lt;br&gt;
to install the Azure CLI, but in my testing it did not work as expected in the Docker container. Instead, we will use&lt;br&gt;
the &lt;a href="https://learn.microsoft.com/en-us/cli/azure/install-azure-cli-linux?pivots=apt#option-2-step-by-step-installation-instructions" rel="noopener noreferrer"&gt;manual installation instructions&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Update the &lt;code&gt;Dockerfile&lt;/code&gt; to include the following lines after the &lt;code&gt;WORKDIR&lt;/code&gt; line:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="c"&gt;# Must be root to install packages&lt;/span&gt;
&lt;span class="k"&gt;USER&lt;/span&gt;&lt;span class="s"&gt; root&lt;/span&gt;

&lt;span class="c"&gt;# Install Dependencies&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;apt-get update &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    apt-get &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="nt"&gt;--no-install-recommends&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    unzip cron wget apt-transport-https &lt;span class="se"&gt;\
&lt;/span&gt;    software-properties-common ca-certificates curl &lt;span class="se"&gt;\
&lt;/span&gt;    apt-transport-https lsb-release gnupg &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="nt"&gt;-rf&lt;/span&gt; /var/lib/apt/lists/&lt;span class="k"&gt;*&lt;/span&gt;

&lt;span class="c"&gt;# Install az cli&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;&lt;span class="nb"&gt;mkdir&lt;/span&gt; &lt;span class="nt"&gt;-p&lt;/span&gt; /etc/apt/keyrings &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    curl &lt;span class="nt"&gt;-sLS&lt;/span&gt; https://packages.microsoft.com/keys/microsoft.asc | &lt;span class="se"&gt;\
&lt;/span&gt;        gpg &lt;span class="nt"&gt;--dearmor&lt;/span&gt; | &lt;span class="se"&gt;\
&lt;/span&gt;        &lt;span class="nb"&gt;tee&lt;/span&gt; /etc/apt/keyrings/microsoft.gpg &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; /dev/null &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;chmod &lt;/span&gt;go+r /etc/apt/keyrings/microsoft.gpg &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nv"&gt;AZ_DIST&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;lsb_release &lt;span class="nt"&gt;-cs&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"deb [arch=amd64 signed-by=/etc/apt/keyrings/microsoft.gpg] https://packages.microsoft.com/repos/azure-cli/ &lt;/span&gt;&lt;span class="nv"&gt;$AZ_DIST&lt;/span&gt;&lt;span class="s2"&gt; main"&lt;/span&gt; | &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;tee&lt;/span&gt; /etc/apt/sources.list.d/azure-cli.list &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    apt-get update &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    apt-get &lt;span class="nb"&gt;install &lt;/span&gt;azure-cli &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="nt"&gt;-rf&lt;/span&gt; /var/lib/apt/lists/&lt;span class="k"&gt;*&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Install SqlPackage
&lt;/h2&gt;

&lt;p&gt;To import the backup bacpac file that we're going to download from Azure Blob Storage, we will need to install the &lt;code&gt;sqlpackage&lt;/code&gt; utility.&lt;br&gt;
This utility imports and exports bacpac files to and from SQL Server. Microsoft provides &lt;a href="https://learn.microsoft.com/en-us/sql/tools/sqlpackage/sqlpackage-download?view=sql-server-ver16#supported-operating-systems" rel="noopener noreferrer"&gt;evergreen links&lt;/a&gt;,&lt;br&gt;
so we can always download the latest version of the utility.&lt;/p&gt;

&lt;p&gt;Add the following lines to the end of the &lt;code&gt;Dockerfile&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="c"&gt;# Install SQLPackage for Linux and make it executable&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;wget &lt;span class="nt"&gt;-q&lt;/span&gt; &lt;span class="nt"&gt;-O&lt;/span&gt; sqlpackage.zip https://aka.ms/sqlpackage-linux &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; unzip &lt;span class="nt"&gt;-qq&lt;/span&gt; sqlpackage.zip &lt;span class="nt"&gt;-d&lt;/span&gt; /sql/sqlpackage &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nb"&gt;chmod&lt;/span&gt; +x /sql/sqlpackage/sqlpackage &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nb"&gt;rm &lt;/span&gt;sqlpackage.zip
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above lines download the latest version of the &lt;code&gt;sqlpackage&lt;/code&gt; utility from Microsoft's website, unzip it, make it executable, and then remove the zip file.&lt;/p&gt;

&lt;h2&gt;
  
  
  Entrypoint
&lt;/h2&gt;

&lt;p&gt;Finally, we need to add our entrypoint script to the container. This script will be run when the container starts, and it&lt;br&gt;
will perform all the necessary steps to download the latest backup from Azure Blob Storage and restore it to the local SQL Server instance.&lt;/p&gt;

&lt;p&gt;Add the following lines to the end of the &lt;code&gt;Dockerfile&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="c"&gt;# Switch back to mssql user&lt;/span&gt;
&lt;span class="k"&gt;USER&lt;/span&gt;&lt;span class="s"&gt; mssql&lt;/span&gt;

&lt;span class="k"&gt;EXPOSE&lt;/span&gt;&lt;span class="s"&gt; 1433&lt;/span&gt;

&lt;span class="k"&gt;CMD&lt;/span&gt;&lt;span class="s"&gt; /bin/bash ./entrypoint.sh&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We need to switch back to the &lt;code&gt;mssql&lt;/code&gt; user to run the SQL Server process, which is what the &lt;code&gt;USER&lt;/code&gt; instruction does.&lt;br&gt;
The &lt;code&gt;EXPOSE&lt;/code&gt; instruction tells Docker that the container listens on the specified network port at runtime.&lt;br&gt;
The &lt;code&gt;CMD&lt;/code&gt; instruction specifies the command that runs when the container starts; in this case, the &lt;code&gt;entrypoint.sh&lt;/code&gt; script.&lt;/p&gt;

&lt;p&gt;We will make some final changes to our &lt;code&gt;Dockerfile&lt;/code&gt; later, but for now, save the file and close it.&lt;/p&gt;
&lt;h1&gt;
  
  
  The Scripts
&lt;/h1&gt;

&lt;p&gt;As we alluded to at the end of our &lt;code&gt;Dockerfile&lt;/code&gt;, we need to create an &lt;code&gt;entrypoint.sh&lt;/code&gt; script that will be run when the container starts.&lt;br&gt;
Since this container is based on the official Microsoft SQL Server Docker image, we need to ensure the original entrypoint is also run&lt;br&gt;
alongside our custom entrypoint script. To do this, we need to create an additional script that we will call in our &lt;code&gt;entrypoint.sh&lt;/code&gt; script.&lt;/p&gt;
&lt;h2&gt;
  
  
  entrypoint.sh
&lt;/h2&gt;

&lt;p&gt;Create two new files in the same directory as your &lt;code&gt;Dockerfile&lt;/code&gt; called &lt;code&gt;entrypoint.sh&lt;/code&gt; and &lt;code&gt;initialize-database-and-jobs.sh&lt;/code&gt;.&lt;br&gt;
Add the following content to &lt;code&gt;entrypoint.sh&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;
/sql/initialize-database-and-jobs.sh &amp;amp; /opt/mssql/bin/sqlservr
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You'll note that we run the &lt;code&gt;initialize-database-and-jobs.sh&lt;/code&gt; script in the background and then start the SQL Server process in the foreground.&lt;br&gt;
The &lt;code&gt;&amp;amp;&lt;/code&gt; is what keeps the container alive: SQL Server remains the foreground process, so the Docker container does not exit as soon as&lt;br&gt;
our initialization script completes.&lt;/p&gt;
&lt;h2&gt;
  
  
  initialize-database-and-jobs.sh
&lt;/h2&gt;

&lt;p&gt;Add the following content to &lt;code&gt;initialize-database-and-jobs.sh&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;

&lt;span class="c"&gt;# wait 30 seconds for SQL Server to start up&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Waiting for SQL Server to start"&lt;/span&gt;
&lt;span class="nb"&gt;sleep &lt;/span&gt;30s

&lt;span class="c"&gt;# Download the bacpac file from the Azure Blob Storage&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Downloading bacpac file from Azure Blob Storage"&lt;/span&gt;
bash /sql/download-latest.sh &lt;span class="nv"&gt;$ACCOUNT_NAME&lt;/span&gt; &lt;span class="nv"&gt;$ACCOUNT_KEY&lt;/span&gt; &lt;span class="nv"&gt;$CONTAINER_NAME&lt;/span&gt; /sql/backup.bacpac
&lt;span class="nv"&gt;backupJob&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$?&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$backupJob&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="nt"&gt;-eq&lt;/span&gt; 0 &lt;span class="o"&gt;]&lt;/span&gt;
&lt;span class="k"&gt;then
    &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Successfully downloaded bacpac file from Azure Blob Storage!"&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Enabling SQL Server authentication..."&lt;/span&gt;
    /opt/mssql-tools/bin/sqlcmd &lt;span class="nt"&gt;-S&lt;/span&gt; localhost &lt;span class="nt"&gt;-U&lt;/span&gt; sa &lt;span class="nt"&gt;-P&lt;/span&gt; &lt;span class="nv"&gt;$MSSQL_SA_PASSWORD&lt;/span&gt; &lt;span class="nt"&gt;-d&lt;/span&gt; master &lt;span class="nt"&gt;-i&lt;/span&gt; /sql/enable-authentication.sql
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"SQL Server authentication enabled. Waiting for 10 seconds before importing the bacpac file..."&lt;/span&gt;
    &lt;span class="nb"&gt;sleep &lt;/span&gt;10s

    &lt;span class="c"&gt;# Import the bacpac file into the SQL Server&lt;/span&gt;
    /sql/sqlpackage/sqlpackage /a:import /sf:/sql/backup.bacpac /tsn:localhost,1433 /tdn:&lt;span class="nv"&gt;$DATABASE_NAME&lt;/span&gt; /tu:sa /tp:&lt;span class="nv"&gt;$MSSQL_SA_PASSWORD&lt;/span&gt; /ttsc:True

    &lt;span class="c"&gt;# Set up 4am CRON job to re-import the database&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$CRON_SCHEDULE&lt;/span&gt;&lt;span class="s2"&gt; /bin/bash /sql/reimport-database.sh"&lt;/span&gt; | crontab -
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"CRON job set up successfully"&lt;/span&gt;
    &lt;span class="nb"&gt;exit &lt;/span&gt;0
&lt;span class="k"&gt;else
    &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Failed to download bacpac file from Azure Blob Storage"&lt;/span&gt;
    &lt;span class="nb"&gt;exit &lt;/span&gt;1
&lt;span class="k"&gt;fi&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This short script does quite a bit. First, since it's best to let SQL Server finish starting before we attempt to connect, we wait 30 seconds.&lt;br&gt;
Next, we download the latest backup from Azure Blob Storage using the &lt;code&gt;download-latest.sh&lt;/code&gt; script.&lt;br&gt;
If the download succeeds, we use the built-in &lt;code&gt;sqlcmd&lt;/code&gt; utility to enable SQL Server authentication, then wait another 10 seconds for the&lt;br&gt;
server to stabilize before using the &lt;code&gt;sqlpackage&lt;/code&gt; utility to import the bacpac file. Finally, we register a CRON job that runs the&lt;br&gt;
&lt;code&gt;reimport-database.sh&lt;/code&gt; script on the schedule specified by the &lt;code&gt;CRON_SCHEDULE&lt;/code&gt; environment variable, and exit with a success code.&lt;/p&gt;
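
&lt;p&gt;The fixed 30-second sleep is simple, but it can be flaky on slower machines. A more robust (hypothetical) alternative is to poll until the server answers. The sketch below stubs out the readiness check; in the container, the check would be a real &lt;code&gt;sqlcmd -Q "SELECT 1"&lt;/code&gt; call:&lt;/p&gt;

```shell
#!/bin/bash
# Sketch: poll for readiness instead of sleeping a fixed 30 seconds.
# check_ready is a stand-in; inside the container it would be something like:
#   /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P "$MSSQL_SA_PASSWORD" -Q "SELECT 1" -b
attempts=0
check_ready() { [ "$attempts" -ge 3 ]; }  # stub: reports ready on the third check
until check_ready; do
    attempts=$((attempts+1))
    echo "Waiting for SQL Server (attempt $attempts)"
    sleep 0.1
done
echo "SQL Server is ready after $attempts attempts"
```

&lt;p&gt;If you adopt this approach, you can also verify the cron registration afterwards with &lt;code&gt;crontab -l&lt;/code&gt; inside the container.&lt;/p&gt;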

&lt;p&gt;We need to create the &lt;code&gt;download-latest.sh&lt;/code&gt; and &lt;code&gt;reimport-database.sh&lt;/code&gt; scripts that are called in the &lt;code&gt;initialize-database-and-jobs.sh&lt;/code&gt; script.&lt;/p&gt;
&lt;h2&gt;
  
  
  download-latest.sh
&lt;/h2&gt;

&lt;p&gt;Create a new file called &lt;code&gt;download-latest.sh&lt;/code&gt; and add the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;

&lt;span class="c"&gt;# Description: This script downloads the latest backup from an Azure Storage Account&lt;/span&gt;
&lt;span class="c"&gt;# Usage: bash DownloadLatest.sh &amp;lt;storageAccountName&amp;gt; &amp;lt;storageAccountKey&amp;gt; &amp;lt;containerName&amp;gt; &amp;lt;localPath&amp;gt;&lt;/span&gt;

&lt;span class="nv"&gt;accountName&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$1&lt;/span&gt;
&lt;span class="nv"&gt;accountKey&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$2&lt;/span&gt;
&lt;span class="nv"&gt;containerName&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$3&lt;/span&gt;
&lt;span class="nv"&gt;localPath&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;4&lt;/span&gt;&lt;span class="k"&gt;:-&lt;/span&gt;&lt;span class="s2"&gt;"./backup.bacpac"&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;

&lt;span class="c"&gt;# Get the name of the latest blob&lt;/span&gt;
&lt;span class="nv"&gt;firstBlob&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;az storage blob list &lt;span class="nt"&gt;--account-key&lt;/span&gt; &lt;span class="nv"&gt;$accountKey&lt;/span&gt; &lt;span class="nt"&gt;--account-name&lt;/span&gt; &lt;span class="nv"&gt;$accountName&lt;/span&gt; &lt;span class="nt"&gt;-c&lt;/span&gt; &lt;span class="nv"&gt;$containerName&lt;/span&gt; &lt;span class="nt"&gt;--query&lt;/span&gt; &lt;span class="s2"&gt;"[?properties.lastModified!=null]|[?ends_with(name, '.bacpac')]|[0].name"&lt;/span&gt; &lt;span class="nt"&gt;-o&lt;/span&gt; tsv&lt;span class="si"&gt;)&lt;/span&gt;

&lt;span class="c"&gt;# Check if $firstBlob is not null (i.e., there are blobs found)&lt;/span&gt;
&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt; &lt;span class="nt"&gt;-n&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$firstBlob&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="o"&gt;]&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
    &lt;/span&gt;az storage blob download &lt;span class="nt"&gt;--account-key&lt;/span&gt; &lt;span class="nv"&gt;$accountKey&lt;/span&gt; &lt;span class="nt"&gt;--account-name&lt;/span&gt; &lt;span class="nv"&gt;$accountName&lt;/span&gt; &lt;span class="nt"&gt;-c&lt;/span&gt; &lt;span class="nv"&gt;$containerName&lt;/span&gt; &lt;span class="nt"&gt;--name&lt;/span&gt; &lt;span class="nv"&gt;$firstBlob&lt;/span&gt; &lt;span class="nt"&gt;--file&lt;/span&gt; &lt;span class="nv"&gt;$localPath&lt;/span&gt; &lt;span class="nt"&gt;--output&lt;/span&gt; none
    &lt;span class="nb"&gt;exit &lt;/span&gt;0
&lt;span class="k"&gt;else
    &lt;/span&gt;&lt;span class="nb"&gt;exit &lt;/span&gt;1
&lt;span class="k"&gt;fi&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Azure CLI lets us write queries to filter the results of the &lt;code&gt;az storage blob list&lt;/code&gt; command. The queries are written in&lt;br&gt;
&lt;a href="https://jmespath.org/" rel="noopener noreferrer"&gt;JMESPath&lt;/a&gt;, a query language for JSON. Here, we keep only blobs that have a &lt;code&gt;lastModified&lt;/code&gt; property and a name ending in the&lt;br&gt;
&lt;code&gt;.bacpac&lt;/code&gt; extension, then take the first result. If no matching blob is found, the script exits with a failure code.&lt;br&gt;
If we find a blob, we download it to the local path specified by the &lt;code&gt;localPath&lt;/code&gt; variable.&lt;/p&gt;
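
&lt;p&gt;One small detail worth calling out: the &lt;code&gt;${4:-"./backup.bacpac"}&lt;/code&gt; expansion makes the fourth argument optional. A quick, self-contained illustration of that pattern (the argument values here are placeholders):&lt;/p&gt;

```shell
#!/bin/bash
# Demo of the ${4:-default} expansion used for localPath in download-latest.sh:
# when the fourth argument is omitted, the default path is substituted.
pick_path() { echo "${4:-./backup.bacpac}"; }
with_default=$(pick_path myaccount mykey mycontainer)
explicit=$(pick_path myaccount mykey mycontainer /tmp/restore.bacpac)
echo "$with_default"
echo "$explicit"
```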
&lt;h2&gt;
  
  
  enable-authentication.sql
&lt;/h2&gt;

&lt;p&gt;Create a new file called &lt;code&gt;enable-authentication.sql&lt;/code&gt; and add the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;sp_configure&lt;/span&gt; &lt;span class="s1"&gt;'contained database authentication'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;GO&lt;/span&gt;
&lt;span class="n"&gt;RECONFIGURE&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;GO&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This script enables contained database authentication in the SQL Server instance. This is necessary to allow the &lt;code&gt;sa&lt;/code&gt; user to authenticate to the database&lt;br&gt;
when importing the bacpac file.&lt;/p&gt;
&lt;h2&gt;
  
  
  reimport-database.sh
&lt;/h2&gt;

&lt;p&gt;Create a new file called &lt;code&gt;reimport-database.sh&lt;/code&gt; and add the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Downloading bacpac file from Azure Blob Storage"&lt;/span&gt;
bash /sql/download-latest.sh &lt;span class="nv"&gt;$ACCOUNT_NAME&lt;/span&gt; &lt;span class="nv"&gt;$ACCOUNT_KEY&lt;/span&gt; &lt;span class="nv"&gt;$CONTAINER_NAME&lt;/span&gt; /sql/backup.bacpac
&lt;span class="nv"&gt;backupJob&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$?&lt;/span&gt;
&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$backupJob&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="nt"&gt;-eq&lt;/span&gt; 0 &lt;span class="o"&gt;]&lt;/span&gt;
&lt;span class="k"&gt;then
    &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Successfully downloaded bacpac file from Azure Blob Storage!"&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Kill all connections to the database"&lt;/span&gt;
    /opt/mssql-tools/bin/sqlcmd &lt;span class="nt"&gt;-S&lt;/span&gt; localhost &lt;span class="nt"&gt;-U&lt;/span&gt; sa &lt;span class="nt"&gt;-P&lt;/span&gt; &lt;span class="nv"&gt;$MSSQL_SA_PASSWORD&lt;/span&gt; &lt;span class="nt"&gt;-d&lt;/span&gt; master &lt;span class="nt"&gt;-i&lt;/span&gt; /sql/kill-all-connections.sql
    &lt;span class="nv"&gt;databaseName&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$DATABASE_NAME&lt;/span&gt;
    &lt;span class="nv"&gt;existingDatabaseName&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;databaseName&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;_&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;date&lt;/span&gt; +%s&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Renaming existing database to &lt;/span&gt;&lt;span class="nv"&gt;$existingDatabaseName&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    /opt/mssql-tools/bin/sqlcmd &lt;span class="nt"&gt;-S&lt;/span&gt; localhost &lt;span class="nt"&gt;-U&lt;/span&gt; sa &lt;span class="nt"&gt;-P&lt;/span&gt; &lt;span class="nv"&gt;$MSSQL_SA_PASSWORD&lt;/span&gt; &lt;span class="nt"&gt;-d&lt;/span&gt; master &lt;span class="nt"&gt;-Q&lt;/span&gt; &lt;span class="s2"&gt;"ALTER DATABASE &lt;/span&gt;&lt;span class="nv"&gt;$databaseName&lt;/span&gt;&lt;span class="s2"&gt; MODIFY NAME = &lt;/span&gt;&lt;span class="nv"&gt;$existingDatabaseName&lt;/span&gt;&lt;span class="s2"&gt;;"&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Renamed existing database to &lt;/span&gt;&lt;span class="nv"&gt;$existingDatabaseName&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Importing bacpac file into the SQL Server"&lt;/span&gt;
    /sql/sqlpackage/sqlpackage /a:import /sf:/sql/backup.bacpac /tsn:localhost,1433 /tdn:&lt;span class="nv"&gt;$DATABASE_NAME&lt;/span&gt; /tu:sa /tp:&lt;span class="nv"&gt;$MSSQL_SA_PASSWORD&lt;/span&gt; /ttsc:True
&lt;span class="k"&gt;else
    &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Failed to download bacpac file from Azure Blob Storage"&lt;/span&gt;
    &lt;span class="nb"&gt;exit &lt;/span&gt;1
&lt;span class="k"&gt;fi&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You'll notice that this script is very similar to the &lt;code&gt;initialize-database-and-jobs.sh&lt;/code&gt; script. The main difference is that we are renaming the existing database&lt;br&gt;
before importing the new bacpac file. This is necessary because the &lt;code&gt;sqlpackage&lt;/code&gt; utility does not support overwriting an existing database.&lt;br&gt;
We also need to kill all connections to the database before renaming it. We do this by running a SQL script that we will create called &lt;code&gt;kill-all-connections.sql&lt;/code&gt;.&lt;/p&gt;
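
&lt;p&gt;The rename uses a Unix-timestamp suffix, so each refresh keeps the previous copy under a unique name rather than dropping it. The naming scheme in isolation:&lt;/p&gt;

```shell
#!/bin/bash
# The backup-name scheme from reimport-database.sh: suffix the current
# database name with the current Unix timestamp, e.g. MyDatabase_1700000000.
databaseName=MyDatabase
existingDatabaseName="${databaseName}_$(date +%s)"
echo "Renaming existing database to $existingDatabaseName"
```

&lt;p&gt;Note that these timestamped copies accumulate over time, so you may want to drop old ones periodically.&lt;/p&gt;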
&lt;h2&gt;
  
  
  kill-all-connections.sql
&lt;/h2&gt;

&lt;p&gt;Create a new file called &lt;code&gt;kill-all-connections.sql&lt;/code&gt; and add the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- kill all connections to the database&lt;/span&gt;
&lt;span class="k"&gt;DECLARE&lt;/span&gt; &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="n"&gt;killCommand&lt;/span&gt; &lt;span class="n"&gt;NVARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="n"&gt;killCommand&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="n"&gt;killCommand&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="s1"&gt;'KILL '&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="k"&gt;CAST&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;spid&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="s1"&gt;';'&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;sys&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;sysprocesses&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;dbid&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;EXEC&lt;/span&gt; &lt;span class="n"&gt;sp_executesql&lt;/span&gt; &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="n"&gt;killCommand&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As its name suggests, this script kills all connections to user databases (any database with an ID greater than 4, which skips the system databases). This ensures we can rename the database without any active connections.&lt;/p&gt;

&lt;h2&gt;
  
  
  Final Directory Structure and Dockerfile Changes
&lt;/h2&gt;

&lt;p&gt;If everything went to plan, your directory should look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.
├── Dockerfile
├── entrypoint.sh
├── initialize-database-and-jobs.sh
├── download-latest.sh
├── enable-authentication.sql
├── reimport-database.sh
├── kill-all-connections.sql
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now before we can build the Docker container, we need to make a few final changes to our &lt;code&gt;Dockerfile&lt;/code&gt;.&lt;br&gt;
We need to copy all the files from our local machine to the container and set the correct permissions on the scripts.&lt;br&gt;
Add the following lines to your &lt;code&gt;Dockerfile&lt;/code&gt; directly after the &lt;code&gt;USER root&lt;/code&gt; line:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="k"&gt;RUN &lt;/span&gt;&lt;span class="nb"&gt;mkdir&lt;/span&gt; /home/mssql &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nb"&gt;chown &lt;/span&gt;mssql /home/mssql &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;chmod&lt;/span&gt; +x /sql/initialize-database-and-jobs.sh &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;chmod&lt;/span&gt; +x /sql/entrypoint.sh &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;chmod&lt;/span&gt; +x /sql/download-latest.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;mkdir&lt;/code&gt; command creates a new directory in the container and the &lt;code&gt;chown&lt;/code&gt; command changes the owner of the directory to the &lt;code&gt;mssql&lt;/code&gt; user.&lt;br&gt;
We then set the correct permissions on the scripts using the &lt;code&gt;chmod&lt;/code&gt; command. Save the &lt;code&gt;Dockerfile&lt;/code&gt; and close it.&lt;/p&gt;
&lt;h1&gt;
  
  
  Building the Docker Container
&lt;/h1&gt;

&lt;p&gt;Now that we have all the necessary files, we can build the Docker container. Open a terminal and navigate to the directory where your &lt;code&gt;Dockerfile&lt;/code&gt; is located.&lt;br&gt;
Run the following command to build the container:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker build &lt;span class="nt"&gt;-t&lt;/span&gt; azure-local-database-refresh &lt;span class="nb"&gt;.&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command will build the container using the &lt;code&gt;Dockerfile&lt;/code&gt; in the current directory and tag the container with the name &lt;code&gt;azure-local-database-refresh&lt;/code&gt;.&lt;br&gt;
The build process may take a few minutes to complete. Once it's done, you can run the container using the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;ACCOUNT_NAME&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;storageAccountName&amp;gt; &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;ACCOUNT_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;storageAccountKey&amp;gt; &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;CONTAINER_NAME&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;containerName&amp;gt; &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;DATABASE_NAME&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;MyDatabase &lt;span class="nt"&gt;-p&lt;/span&gt; 1433:1433 &lt;span class="nt"&gt;-it&lt;/span&gt;  azure-local-database-refresh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace &lt;code&gt;&amp;lt;storageAccountName&amp;gt;&lt;/code&gt;, &lt;code&gt;&amp;lt;storageAccountKey&amp;gt;&lt;/code&gt;, and &lt;code&gt;&amp;lt;containerName&amp;gt;&lt;/code&gt; with the appropriate values for your Azure Blob Storage account. The &lt;code&gt;-p&lt;/code&gt; flag maps port 1433 of the container to port 1433 of your local machine.&lt;br&gt;
This allows you to connect to the SQL Server instance running in the container from your local machine. The &lt;code&gt;-it&lt;/code&gt; flag runs the container in interactive mode, which allows you to see the output of the container in your terminal.&lt;/p&gt;

&lt;p&gt;You should see the output of the container in your terminal. If everything is working correctly, you should see messages indicating that the bacpac file has been downloaded and imported into the SQL Server instance. The import&lt;br&gt;
process can take several minutes to complete, depending on the size of the database. Once the import is complete, you should see a message indicating that the CRON job has been set up successfully.&lt;/p&gt;
&lt;h1&gt;
  
  
  Testing the Container
&lt;/h1&gt;

&lt;p&gt;Now that we have our container up and running, we can test it by connecting to the SQL Server instance and verifying that the database has been restored.&lt;br&gt;
Open a new terminal and run the following command to connect to the SQL Server instance running in the container:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;exec&lt;/span&gt; &lt;span class="nt"&gt;-it&lt;/span&gt; &amp;lt;containerId&amp;gt; /opt/mssql-tools/bin/sqlcmd &lt;span class="nt"&gt;-S&lt;/span&gt; localhost &lt;span class="nt"&gt;-U&lt;/span&gt; sa &lt;span class="nt"&gt;-P&lt;/span&gt; yourStrong&lt;span class="o"&gt;(!)&lt;/span&gt;Password
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace &lt;code&gt;&amp;lt;containerId&amp;gt;&lt;/code&gt; with the ID of your container. This command will open an interactive SQL Server prompt. Run the following command to verify that the database has been restored:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;sys&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;databases&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If everything went to plan, you should see the name of your database in the output. You can also run queries against the database to verify that the data has been restored correctly.&lt;/p&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;In this article, we created a Docker container that automates the process of downloading the latest backup of a production database from Azure Blob Storage and restoring it to a local SQL Server instance.&lt;br&gt;
We used a combination of bash scripts and Docker to create a portable and easy-to-use solution that can be run on any machine that has Docker installed. This solution can be used to provide teams with access&lt;br&gt;
to the latest production data for development, testing, reporting, data analysis, or other internal uses. The container can be run on a schedule using a CRON job to ensure that the data is always up-to-date. This solution can be&lt;br&gt;
easily extended to support other databases and cloud storage providers.&lt;/p&gt;

&lt;p&gt;The code for this solution is available on &lt;a href="https://github.com/JerrettDavis/az-bacpac-blob-mssql-importer" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;.&lt;br&gt;
This docker container is also available on &lt;a href="https://hub.docker.com/r/jdhproductions/az-bacpac-blob-mssql-importer" rel="noopener noreferrer"&gt;Docker Hub&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>devops</category>
      <category>docker</category>
      <category>bash</category>
      <category>azure</category>
    </item>
  </channel>
</rss>
