Eana Hufwe

Flexible and dynamic flow control of Azure DevOps YAML Pipelines using variables

Recently I was working on release automation at work, and one of the requirements was to leave a gap of a specific number of hours between stages, snapped to normal business hours. A centralized scheduler was not an option unless I wanted to flood the run logs, and a pipeline run is mostly unconfigurable once started due to the constraints of Azure DevOps (AzDO) Pipelines. Still, there are some tricks to achieve dynamic flow control within the pipeline. In this article, I'll talk about how I set up the flow control.

Here are the two requirements that we want to address in this article:

  • To leave a gap of an arbitrary duration (computed at runtime) between two stages.
  • To cut off a branch of the pipeline run based on a condition determined at runtime, without leaving a failure status, while respecting other run settings such as skipped stages and cancellations.

Both of them can be achieved on the same YAML Pipeline.

Arbitrary gap between stages

While AzDO YAML pipelines do not come with a true delay-between-stages feature similar to the one in Classic Release Pipelines, inserting a delay within a YAML pipeline job is rather simple. AzDO Pipelines come with a Delay task that can wait for up to 60 days. We've tested it on our pipeline, and it can wait for at least 72 hours without issue.

There are two things to take care of. One is that the Delay task is an agentless task. If most of your tasks run on an agent, like PowerShell or Bash scripts, you would need to create a separate agentless job (with pool: server) for the Delay task. The other is that AzDO Pipelines limits a job timeout to 60 minutes by default. If you expect the delay to run longer than that, you should set timeoutInMinutes on the containing job to the maximum duration you expect, or simply put 0 to lift it to the maximum available.
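Before making the duration dynamic, a minimal static gap with both of these points addressed could look like this (the job name and 72-hour duration are illustrative):

```yaml
# An agentless (server) job hosting the Delay task for a fixed 72-hour gap.
- job: StaticGap
  displayName: Static 72-hour gap
  pool: server             # agentless pool required by the Delay task
  timeoutInMinutes: 4320   # raise the default 60-minute job timeout
  steps:
  - task: Delay@1
    inputs:
      delayForMinutes: '4320'
```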

To set the duration dynamically, we can calculate it in a job and expose the result as an output variable. In agent tasks, you can set an output variable by printing a logging command to stdout. You can also invoke the AzDO Pipelines REST API from a custom agentless task to achieve a similar outcome.

Below is an example demonstrating a way to schedule a gap of at least 48 hours, where the gap always ends on a weekday.

```yaml
- job: GapScheduler
  displayName: Gap scheduler
  steps:
  - powershell: |
      $delayDuration = New-TimeSpan -Hours 48
      while (
        (Get-Date).Add($delayDuration).DayOfWeek -eq 'Saturday' -or
        (Get-Date).Add($delayDuration).DayOfWeek -eq 'Sunday'
      ) {
        $delayDuration = $delayDuration.Add((New-TimeSpan -Hours 24))
      }
      Write-Host "##vso[task.setvariable variable=delayMinutes;isoutput=true]$($delayDuration.TotalMinutes)"
    name: Scheduler
- job: InterStageGap
  displayName: Inter-stage gap
  dependsOn: GapScheduler
  pool: server
  timeoutInMinutes: 5760 # 4 days
  variables:
    delayMinutes: $[ dependencies.GapScheduler.outputs['Scheduler.delayMinutes'] ]
  steps:
  - task: Delay@1
    inputs:
      delayForMinutes: $(delayMinutes)
```


Cut off a pipeline branch

By cutting off a pipeline branch, I mean that when a certain condition is met, all stages depending on it, both directly and indirectly, should be skipped without leaving an error state. Other stages not on the dependency chain shall not be affected. This shall also not affect stages whose dependencies were disabled at trigger time, nor change the existing behavior when there are legitimate failures.

To decide whether a stage should run, the condition must be evaluated before the stage starts, that is, in its dependency stage. The decision can be made in any sort of task that can set an output variable. Since the output is a boolean-like value, and the only type you can set for a variable is string, you can use whatever values you like for true/false; here we will use "true" and "false" for simplicity.

To skip a stage based on an output of a previous stage, we need to use the stage condition property. Apart from the existing default condition, we also need to check if the previous stage has told us to skip.

There could be three kinds of “output” from the previous stage decision, "true", "false", and null which could happen if the previous stage is disabled at trigger time. Since we want the stage to run in both the "true" case and the null case, we only need to check if the variable is set to "false".

The default condition for a stage is succeeded(), which covers all the existing cases that properly handle failures and disabled stages. We also tested that succeeded() evaluates to false when the previous stage is skipped by a condition expression that evaluated to false. This means we can safely extend from there with an and() clause.

Below is an example that skips Stage2, and subsequently Stage3 on every Monday.

```yaml
- stage: Stage1
  jobs:
  # Other jobs go here...
  - job: CutOffDecider
    displayName: Cut-off decider
    steps:
    - powershell: |
        $decision = "true"
        if ((Get-Date).DayOfWeek -eq "Monday") {
          $decision = "false"
        }
        Write-Host "##vso[task.setvariable variable=shouldRunNextStage;isoutput=true]$decision"
      name: Decider
- stage: Stage2
  dependsOn: Stage1
  condition: and(succeeded(), ne(dependencies.Stage1.outputs['CutOffDecider.Decider.shouldRunNextStage'], 'false'))
  jobs:
  # Jobs go here...
- stage: Stage3
  dependsOn: Stage2
  # Stage 3 is also skipped as a part of the dependency tree when Stage 2 is skipped
  jobs:
  # Jobs go here...
```


Together, dynamic gaps and cut-off control form two of the building blocks of a flexible and dynamic multi-stage release automation system on Azure DevOps YAML pipelines.
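As a sketch of how the two pieces could fit together in one pipeline (the stage names Release1/Release2 are illustrative; job details follow the examples earlier in this article):

```yaml
stages:
- stage: Release1
  jobs:
  - job: Work1          # normal deployment jobs
  - job: GapScheduler   # sets delayMinutes as an output variable
  - job: CutOffDecider  # sets shouldRunNextStage as an output variable
- stage: Release2
  dependsOn: Release1
  # Skip this stage (and everything downstream of it) when the decider said "false".
  condition: and(succeeded(), ne(dependencies.Release1.outputs['CutOffDecider.Decider.shouldRunNextStage'], 'false'))
  jobs:
  - job: InterStageGap  # agentless job running Delay@1; note that reading
                        # delayMinutes across stages uses stageDependencies,
                        # e.g. $[ stageDependencies.Release1.GapScheduler.outputs['Scheduler.delayMinutes'] ]
    pool: server
  - job: Work2
    dependsOn: InterStageGap
```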

The post Flexible and dynamic flow control of Azure DevOps YAML Pipelines using variables appeared first on 1A23 Blog.
