DEV Community

Chris McKelt

Posted on • Originally published at blog.mckelt.com on


Using Azure Pipelines to restore a production database to another environment

Often we need a fresh copy of the production database in another environment (e.g. DEV/TEST/UAT).

Previously this was a tedious task involving getting a backup file, copying it to another location, and restoring the database. Here is a solution to automate the process using Azure Pipelines.

User Story

Given a production database exists in subscription 1

When we do a release of the Azure Pipeline named ‘Refresh Database – DEV’

Then a copy of production is available in the DEV environment in subscription 2

And permissions are correct for the DEV environment

Pipeline Overview

For each environment that you wish to restore into, create an Azure Pipeline with 3 stages.


Create variable groups that are scoped to specific stages

Each variable group contains the deployment credentials that its stage requires to perform operations within the corresponding Azure subscription.
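As a rough sketch, the three-stage pipeline with stage-scoped variable groups might look like this in YAML (stage, variable group, and path names here are illustrative, not the exact ones from the original solution):

```yaml
# azure-pipelines.yml - sketch of a three-stage refresh pipeline
trigger: none   # run on demand only

stages:
  - stage: Export
    variables:
      - group: production-credentials   # SPN locked down to subscription 1
    jobs:
      - job: export
        steps:
          - task: PowerShell@2
            inputs:
              filePath: scripts/production-export.ps1
              arguments: >
                -tenantid $(tenantid)
                -subscriptionid $(subscriptionid)
                -spn_clientid $(spn_clientid)
                -spn_secret $(spn_secret)

  - stage: Import
    dependsOn: Export
    variables:
      - group: production-credentials   # needed to read the BACPAC blob
      - group: dev-credentials          # SPN for subscription 2
    jobs:
      - job: import
        steps:
          - task: PowerShell@2
            inputs:
              filePath: scripts/production-import.ps1
              arguments: '-environment DEV -skuname S0'

  - stage: Sanitise
    dependsOn: Import
    variables:
      - group: dev-credentials
    jobs:
      - job: sanitise
        steps:
          - task: PowerShell@2
            inputs:
              filePath: scripts/sanitise.ps1
              arguments: '-environment DEV'
```

Scoping each variable group to its stage keeps the production SPN secret out of reach of the DEV-only stages.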


Task 1 – Export

Create a PowerShell task to run a script and pass it the information for the production environment.

This runs against the production environment and creates a blob storage container that holds the exported BACPAC.


production-export.ps1:
<#
Export the production database to a blob container
#>
[CmdletBinding()]
param (
    [string] $tenantid,
    [string] $subscriptionid,
    [string] $spn_clientid,
    [string] $spn_secret
)
Write-Host "************************************************"
Write-Host "variables"
Write-Host "************************************************"
$environment = "PRD"
$backups = 'backups'
$resourcegroup1 = "AUAZE-$environment-demo"
$location1 = "australiaeast"
$sqlserver1 = "AUAZE-$environment-demo-dbsvrp1".ToLower()
$keyvaultname = "au-demo-$environment-1".ToLower()
### PRODUCTION
Write-Host "************************************************"
Write-Host "login via SPN which is locked down to the production subscription"
Write-Host "************************************************"
az login --service-principal --username $spn_clientid --password $spn_secret --tenant $tenantid
# -o tsv strips the surrounding JSON quotes from the secret values
$sqladmin = az keyvault secret show --name 'sqladmin' --vault-name $keyvaultname --query 'value' -o tsv
$sqlpassword = az keyvault secret show --name 'sqlpassword' --vault-name $keyvaultname --query 'value' -o tsv
Write-Host "************************************************"
Write-Host "create blob storage with a container called bacpac"
Write-Host "************************************************"
az storage account create -n $backups -g $resourcegroup1 -l $location1 --sku Standard_LRS
$keyvalue = az storage account keys list -g $resourcegroup1 -n $backups --subscription $subscriptionid --query '[0].value' -o tsv
az storage container create -n bacpac --account-name $backups --account-key $keyvalue --auth-mode key
az storage container policy create --container-name bacpac --name ReadWrite --account-key "$keyvalue" --account-name $backups --auth-mode key --permissions rwdl
Write-Host "************************************************"
Write-Host "export bacpac from production to the blob container"
Write-Host "************************************************"
$filename = "demo_PRD_$(Get-Date -Format "yyyy-MM-dd").bacpac"
$bloburi = "https://$backups.blob.core.windows.net/bacpac/$filename"
az sql db export -s $sqlserver1 -n 'AUAZE-PRD-demo-DB1' -g $resourcegroup1 -p $sqlpassword -u $sqladmin --storage-uri $bloburi --storage-key-type "StorageAccessKey" --storage-key "$keyvalue"

Once this script has run, the production storage account should contain the database BACPAC export.


Task 2 – Import

Under the ‘import’ stage, create a task that will import the BACPAC from the storage container in the production subscription. This uses both the production and target environment credentials.


production-import.ps1:
<#
Import a database from a blob container in another subscription
#>
[CmdletBinding()]
param (
    [string] $environment, # target environment
    [string] $tenantid,
    [string] $production_subscriptionid,
    [string] $production_spn_clientid,
    [string] $production_spn_secret,
    [string] $subscriptionid,
    [string] $spn_clientid,
    [string] $spn_secret,
    [string] $skuname
)
# a [string] parameter is never $null, so test for an empty value instead
if ([string]::IsNullOrEmpty($environment)) {
    $environment = "DEV"
}
Write-Host "************************************************"
Write-Host "variables"
Write-Host "************************************************"
$resourcegroup1 = "AUAZE-$environment-DEMO"
$production_resourcegroup1 = "AUAZE-PRD-DEMO"
$sqlserver1 = "AUAZE-$environment-DEMO-dbsvrp1".ToLower()
$keyvaultname = "au-DEMO-$environment-1".ToLower()
$backups = 'backups'
$filename = "DEMO_PRD_$(Get-Date -Format "yyyy-MM-dd").bacpac" ### NOTE: you can only import today's production export
$bloburi = "https://$backups.blob.core.windows.net/bacpac/$filename"
### PRODUCTION
Write-Host "************************************************"
Write-Host "login via SPN to production and get the container key"
Write-Host "************************************************"
az login --service-principal --username $production_spn_clientid --password $production_spn_secret --tenant $tenantid
$keyvalue = az storage account keys list -g $production_resourcegroup1 -n $backups --subscription $production_subscriptionid --query '[0].value' -o tsv
az logout --username $production_spn_clientid
### DEV or UAT
Write-Host "************************************************"
Write-Host "login to dev or uat"
Write-Host "************************************************"
az login --service-principal --username $spn_clientid --password $spn_secret --tenant $tenantid
# -o tsv strips the surrounding JSON quotes from the secret values
$sqladmin = az keyvault secret show --name 'sqladmin' --vault-name $keyvaultname --query 'value' -o tsv
$sqlpassword = az keyvault secret show --name 'sqlpassword' --vault-name $keyvaultname --query 'value' -o tsv
Write-Host "************************************************"
Write-Host "import bacpac from the production blob container"
Write-Host "************************************************"
az account set --subscription $subscriptionid
az sql db delete -g $resourcegroup1 -s $sqlserver1 -n "DEMO" --yes
az sql db create -g $resourcegroup1 -s $sqlserver1 -n "DEMO" --service-objective $skuname
az sql db import -s $sqlserver1 -n "DEMO" -g $resourcegroup1 -u $sqladmin -p $sqlpassword --auth-type SQL --storage-uri $bloburi --storage-key-type "StorageAccessKey" --storage-key "$keyvalue"

Task 3 – Sanitise

Create a 3rd task in the ‘Sanitise’ stage.

This will scramble any information you do not want in that environment (e.g. emails).

Also remove any production SQL user accounts and replace them with environment-specific ones.
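The sanitise script below only scrambles contact details; swapping the production users out is plain T-SQL that could be appended to the same script. A minimal sketch, assuming contained database users (the user names and role grants here are hypothetical, not from the original solution):

```sql
-- run in the restored DEV database: drop the production app user
-- and recreate an environment-specific one (names are illustrative)
DROP USER IF EXISTS [prd_app_user];

-- contained database user with its own password, scoped to DEV
CREATE USER [dev_app_user] WITH PASSWORD = 'use-a-generated-secret-here';
ALTER ROLE db_datareader ADD MEMBER [dev_app_user];
ALTER ROLE db_datawriter ADD MEMBER [dev_app_user];
```

This could be run via the same `Invoke-Sqlcmd` call the sanitise script already uses, with the DEV password pulled from the environment's key vault rather than hard-coded.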

sanitise.ps1:
<#
Sanitise the database
Remove any production data or permissions
#>
[CmdletBinding()]
param (
    [string] $environment,
    [string] $tenantid,
    [string] $subscriptionid,
    [string] $spn_clientid,
    [string] $spn_secret
)
# ensure the SqlServer PowerShell module is installed and loaded
if (-not (Get-Module -ListAvailable -Name SqlServer)) {
    Write-Warning "Can't find the SqlServer module - installing"
    Install-Module SqlServer -Force -Scope CurrentUser
}
Import-Module SqlServer -ErrorAction Stop
if (-not (Get-Command Invoke-Sqlcmd -ErrorAction SilentlyContinue)) {
    throw "Unable to find the Invoke-Sqlcmd cmdlet"
}
### LOGIN
az login --service-principal --username $spn_clientid --password $spn_secret --tenant $tenantid
$keyvaultname = "au-key-$environment-1".ToLower()
# get SQL Server admin credentials from key vault (-o tsv strips the JSON quotes)
$sqladmin = az keyvault secret show --name 'sqladmin' --vault-name $keyvaultname --query 'value' -o tsv
$sqlpassword = az keyvault secret show --name 'sqlpassword' --vault-name $keyvaultname --query 'value' -o tsv
# build sql to clean emails and phone numbers
$sql = "UPDATE contact SET email = 'sanitised@test.com';"
$sql += "`n"
$sql += "UPDATE contact SET phone = '0444444444';"
# execute SQL against the database
Invoke-Sqlcmd -ServerInstance "tcp:auaze-$environment-demo.database.windows.net" -Database "AUAZE-$environment-DEMO" -Username "$sqladmin" -Password $sqlpassword -Query $sql

TADA!

Running the pipeline now copies the database to the DEV environment. Typically a software build runs after this, automatically applying any schema changes currently in DEV to the database. Happy restoring!


Top comments (5)

Matt in Atlanta

Good post. I realise this is an old post, but do you have any thoughts on selective restore of a subset of production data into lower environments? Restoring the entire production database into lower environments is too expensive. We also have many production databases that would need to stay in sync in lowers with regard to the subset.

edufur

Hi. This looks like it could help me, but when I click on the pic thumbnails I get 404 errors. I think I need a better view of those thumbnails. Could you update the post, pretty please?

osantizogt

@chris_mcklet Very good blog, friend. I have a question: what do you use the git _arm templates repository for? I don't understand that part much and have some doubts there. Thank you.

Chris McKelt

Hi - that folder contained the ARM templates for the original solution. The scripts to action this blog post were stored in a folder under it called 'scripts'. Hope that answers your question?

aamirabbas1981

Hi, Chris McKelt, Nice writing! Thanks for the blog.
Do you have a similar script for MySQL, Cosmos, and Mongo?
