Pedro Correia

How did I automate the execution of database scripts on Azure Pipelines?

Introduction

As the title suggests, this post discusses the time when I needed to create a pipeline that automatically deploys every change to the database schema: ADD/ALTER TABLE, ADD/ALTER COLUMN, stored procedures, and so on.

My Use Case

The use case is a project built with Azure SQL Server, Dapper, and .NET 7, where each client has its own instance of the database. The solution includes a Database project responsible for versioning all the database changes, and that part works fine. The problem is deploying a new schema change: it had to be done manually using Schema Compare (a feature of Database projects in the Visual Studio IDE). This manual process became time-consuming as the number of clients increased, requiring more schema comparisons and more manual control. That is the reason for automating it. So, how did I automate it?

(Image: Database_Project)
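For context, what Schema Compare ultimately does when publishing is deploy the .dacpac that the Database project builds, and that deployment can be scripted. Here is a rough sketch of the equivalent step using the SqlPackage CLI; the paths, server, database, and variable names are placeholders, not taken from my project:

# Illustrative only: all values below are placeholders
- script: >
    SqlPackage /Action:Publish
    /SourceFile:"Database\bin\Release\Database.dacpac"
    /TargetServerName:"myserver.database.windows.net"
    /TargetDatabaseName:"ClientDb"
    /TargetUser:"$(SqlUser)"
    /TargetPassword:"$(SqlPassword)"
  displayName: 'Publish DACPAC with SqlPackage (illustrative)'

Running something like this by hand for every client database is exactly what stops scaling, hence the pipeline below.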

What solution did I apply?

To begin with, I created a new Azure repository specifically to store the Database project, allowing me to subsequently create a dedicated pipeline for it.


I created a YAML pipeline with the following instructions: it builds the database project and publishes the resulting artifact to the drop folder. That artifact is then used for the actual deployment to the Azure SQL Database.

trigger:
- main
- stage
- development

pool:
  vmImage: windows-latest

stages:

- stage: 'ContinuousIntegration'
  displayName: 'Continuous Integration'
  jobs:
    - job: BuildPublishArtifacts
      displayName: 'Building & Publish Solution Artifacts'
      steps:
        - task: MSBuild@1
          displayName: 'Building Solution'
          inputs:
            solution: '**/*.sln'
            clean: true
            configuration: 'Release'
            platform: 'Any CPU'

        - task: CopyFiles@1
          displayName: 'Moving Database Artifacts to Drop folder'
          inputs:
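            # Assumes the Database project lives in a 'Database' folder at the repository root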
            SourceFolder: '$(Agent.BuildDirectory)\s\Database\bin\Release'
            Contents: '**'
            TargetFolder: '$(Build.ArtifactStagingDirectory)\DBArtifact'

        - task: PublishBuildArtifacts@1
          displayName: 'Publishing Solution Artifact'
          inputs:
            PathtoPublish: '$(Build.ArtifactStagingDirectory)'
            ArtifactName: 'drop'
            publishLocation: Container

Finally, the last step was to create a release, hook it up to the artifact built by the pipeline, and add a task that deploys the DACPAC.

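I used a classic release pipeline for this step, but the same deployment could also be expressed in YAML with the SqlAzureDacpacDeployment task. A minimal sketch, assuming the 'drop' artifact from the build stage above; the service connection, server, database, and .dacpac file names are placeholders, not my actual configuration:

- stage: 'ContinuousDeployment'
  displayName: 'Continuous Deployment'
  dependsOn: 'ContinuousIntegration'
  jobs:
    - job: DeployDacpac
      displayName: 'Deploying DACPAC to Azure SQL'
      steps:
        # Download the 'drop' artifact published by the CI stage
        - download: current
          artifact: drop

        - task: SqlAzureDacpacDeployment@1
          displayName: 'Deploying Database Schema'
          inputs:
            azureSubscription: 'MyServiceConnection'    # placeholder service connection
            AuthenticationType: 'server'
            ServerName: 'myserver.database.windows.net' # placeholder server
            DatabaseName: 'ClientDb'                    # placeholder database
            SqlUsername: '$(SqlUser)'                   # secret pipeline variables
            SqlPassword: '$(SqlPassword)'
            deployType: 'DacpacTask'
            DeploymentAction: 'Publish'
            DacpacFile: '$(Pipeline.Workspace)/drop/DBArtifact/Database.dacpac' # placeholder file name

Since each client has its own database, a stage like this could be repeated or templated per client, which is exactly the repetition the release automates away.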

As you can see, this isn't a guide; instead, I'm sharing a use case that I encountered during my work and explaining why I chose to automate this process.

Have you ever automated something like that?
