25 February 2024
Using YAML templates in Azure DevOps for repeatable actions: Part 2 -
Working with output variables in YAML templates in Azure DevOps
Initially, I intended to write an article about using YAML templates in Azure DevOps to encapsulate repeatable steps in pipelines. But during the writing process, I realized that dry references to documentation and simple examples of YAML files would not be very interesting without tying them to some scenario. Thus, a short, purely technical article turned into a large text dedicated to several topics, each of which is interesting in itself. As a result, I decided to turn this text into a mini-series of articles. Here comes the second part, dedicated to using YAML templates in Azure DevOps, focusing specifically on passing parameters to and from templates.

In the previous article, I outlined a scenario that I will use as an example to discuss the use of YAML templates. Let me briefly remind you of it.
We have a testing team that has written automated tests of two types:

  • Code tests that run quickly and do not require application deployment.
  • System tests that require both application deployment and the use of special frameworks.
The task of the DevOps team (that's us) is to create a pipeline that would build the application, run code tests, deploy the application to Azure, and run system tests for the deployed application. And most importantly, we strive to make our templates reusable and combinable when creating pipelines for different applications.

In solving this task, I draw an analogy with programming. Each stage (build, run code tests, deploy, run system tests) essentially represents a function that takes input parameters and produces a result.

In simplified terms, we get the following sequence of calls.
Parameters are needed so that we can reuse the stage for different applications, specifying, for example, a repository link or the address of a virtual machine for deploying the application. Thus, we only need to pass common parameters to each of the stages.

Now, just out of curiosity, let's deploy the application on a specially created virtual machine. And in the end, after running all the tests, we will delete this virtual machine. The scheme becomes a little more complex:
And here we already have the first output parameters at the stage of creating a virtual machine. We need the DNS name or IP address to deploy the application on this machine and run system tests. And we get this address dynamically at the moment of creating the virtual machine. So, this will be the output parameter of the "Create VM" function.

Let's complicate the scheme a little more and generate unique names for the virtual machine and the user's password. On the one hand, this will allow us to run several pipelines in parallel without fear of name conflicts, and on the other hand, it will increase security by using a unique password for each new virtual machine.
Thus, we have several output parameters obtained at one stage that we need to pass as input parameters to subsequent stages.

It seems very simple: create global variables, store the output parameters there, and then use them when calling the next stages. But it's not that simple. The thing is that the similarity between YAML pipelines in Azure DevOps and programming ends precisely at working with output parameters. Moreover, pipelines in general are not exactly programs, as the execution of the sequence of actions depends heavily on the pipeline execution environment.

But let's go through everything step by step. From simple to complex.

First, let's look at the general scheme of how YAML pipelines work in Azure DevOps:

The pipeline can be triggered manually or automatically by some event (a trigger). Each pipeline consists of a set of stages, which can run either sequentially or in parallel. Each stage runs on a specific agent (a server or virtual machine) in the environment deployed on that agent (operating system, installed frameworks, and middleware). Each stage consists of a set of jobs, which can also run either sequentially or in parallel. And each job consists of a sequence of steps, where each step represents a specific atomic task (for example, copying files or executing a PowerShell script).

For a more detailed description of YAML pipelines, you can refer to this link.

The syntax of a YAML pipeline looks like this:
trigger: none

parameters:
- name: Param1
  displayName: Param number one
  type: number
  default: 1
- name: Param2
  displayName: Param number two
  type: string
  default: Value 1
  values:
  - Value 1
  - Value 2
  - Value 3

pool:
  vmImage: windows-latest

variables:
- name: Var1
  value: VarValue1
- name: Var2
  value: VarValue2

stages:
- stage: StageName1
  displayName: Stage one
  jobs:
  - job: job1
    steps:
    - task: <taskType>
      ...
    - task: <taskType>
      ...
  - job: job2
    steps:
    - task: <taskType>
      ...
    - task: <taskType>
      ...
- stage: StageName2
  displayName: Stage two
  jobs:
  - job: job1
    steps:
    - task: <taskType>
      ...
    - task: <taskType>
      ...
  - job: job2
    steps:
    - task: <taskType>
      ...
    - task: <taskType>
      ...

Here I provide a very simplified syntax to understand the hierarchy between stage, job, and task. You can delve into the syntax more deeply using the link I provided earlier. Note that the pipeline has parameters. They can be set in a special window when calling the pipeline. Parameters could, for example, be used to pass build parameters to the application.

Azure DevOps allows you to create templates for stages, jobs, and steps. I will package a whole stage into a template, which will serve as our stage (or function, speaking in developer terms).

The syntax for the template with a stage will be as follows:

# File: templates/npm-with-params.yml

parameters:
- name: Param1
  displayName: Param number one
  type: number
  default: 1
- name: Param2
  displayName: Param number two
  type: string
  default: Value 1
  values:
  - Value 1
  - Value 2
  - Value 3

variables:
- name: Var1
  value: VarValue1
- name: Var2
  value: VarValue2

stages:
- stage: StageName1
  displayName: Stage one
  jobs:
  - job: job1
    steps:
    - task: <taskType>
      ...
    - task: <taskType>
      ...
  - job: job2
    steps:
    - task: <taskType>
      ...
    - task: <taskType>
As you can see, it is almost an exact replica of the pipeline itself (only trigger and pool are missing). Moreover, there can be multiple stages in a template, but I will only use one stage.

Thus, for each stage in our scheme, we will have a separate template with a separate stage. Let's take a look at the first two stages: build and testing.

For simplicity, let's assume that we have a dotnet application and tests written in NUnit or XUnit as part of the overall solution. Then we can combine the build and testing into one stage:
parameters:
- name: Test
  type: boolean
  default: true
  values:
  - true
  - false

- name: PublishCodeTestsResult
  type: boolean
  default: true
  values:
  - true
  - false

- name: Repository
  type: string
  default: self

- name: Configuration
  type: string
  default: Release
  values:
  - Debug
  - Release

- name: Runtime
  type: string
  default: win-x64
  values:
  - win-x64
  - linux-x64

- name: SolutionPath # The path to the .sln file
  type: string

- name: SolutionName # The name of the .sln file
  type: string

- name: StageSuffix
  type: string
  default: ''

stages:
- stage: BuildAndTestStage${{ parameters.StageSuffix }}
  displayName: 'Build ${{ parameters.SolutionName }} and run code tests'

  jobs:
  - job: BuildJob
    displayName: 'Build ${{ parameters.SolutionName }} Job'

    steps:
    ########################################################################################
    # Checkout
    ########################################################################################
    - checkout: ${{ parameters.Repository }}
    ########################################################################################
    # Packages restore
    # To manage centralized NuGet sources, nuget.config MUST be placed in the ${{ parameters.SolutionPath }} folder
    ########################################################################################
    - task: DotNetCoreCLI@2 # install dependencies
      displayName: Restore
      inputs:
        command: 'restore'
        projects: '${{ parameters.SolutionPath }}/${{ parameters.SolutionName }}.sln'
        restoreArguments: '-r ${{ parameters.Runtime }}'
        feedsToUse: 'config'
        nugetConfigPath: '${{ parameters.SolutionPath }}/nuget.config'
        verbosityRestore: 'Normal'

    ########################################################################################
    # Build
    # --no-restore - because NuGet packages were restored in the previous step
    # --no-self-contained - must be present because a runtime is specified
    # -p:ImportByWildcardBeforeSolution=false - allows setting the runtime for the solution build
    ########################################################################################
    - task: DotNetCoreCLI@2 # build
      displayName: Build
      inputs:
        command: build
        arguments: '-c ${{ parameters.Configuration }} -r ${{ parameters.Runtime }} --no-restore --no-self-contained -p:ImportByWildcardBeforeSolution=false'
        projects: '${{ parameters.SolutionPath }}/${{ parameters.SolutionName }}.sln'

    ########################################################################################
    # Run code tests
    ########################################################################################
    - task: DotNetCoreCLI@2
      condition: eq('${{ parameters.Test }}', 'true')
      displayName: Test
      inputs:
        command: 'test'
        projects: '${{ parameters.SolutionPath }}/${{ parameters.SolutionName }}.sln'
        arguments: '-c ${{ parameters.Configuration }} -r ${{ parameters.Runtime }} --no-restore'
        publishTestResults: ${{ parameters.PublishCodeTestsResult }}
        testRunTitle: '${{ parameters.SolutionName }}-${{ parameters.Configuration }}-${{ parameters.Runtime }}'
In general, within the scope of this article, it is not so important how exactly you build your application and run the tests, as we are discussing the pipeline and the templates themselves. This template demonstrates how we can pass parameters to it. To make it universal and able to build .NET applications from different sources, I added several parameters:

  • Repository - link to the repository
  • Configuration - build configuration (Debug/Release, etc.)
  • Runtime - runtime for building (win-x64/linux-x64, etc.)
  • SolutionPath - path to the sln file inside the repository
  • SolutionName - the name of the sln file. It will also be used as the project name when outputting information in the pipeline.

For working with tests, I also added a couple of parameters:

  • Test - whether to run code tests after building
  • PublishCodeTestsResult - whether to publish test results in Azure DevOps.

And a little life hack:

  • StageSuffix - an addition to the stage name that allows using this template several times in one pipeline since the stage name must be unique.

A pipeline with a call to this template might look like this:

resources:
  repositories:
  - repository: templates
    type: git
    name: automation-templates
    ref: 'main'

parameters:
- name: Configuration
  type: string
  default: Release
  values:
  - Debug
  - Release

- name: Runtime
  type: string
  default: win-x64
  values:
  - win-x64

variables:
- name: WebApi_SolutionPath # The path to the .sln file
  value: 'src\'
- name: WebApi_SolutionName # The name of the .sln file
  value: WebAPI
- name: WebApi_Repo
  value: git://OurProject/WebAPI@main
- name: WebApi_StageSuffix
  value: "_webapi"

- name: ClientApp_SolutionPath # The path to the .sln file
  value: 'src\'
- name: ClientApp_SolutionName # The name of the .sln file
  value: ClientApp
- name: ClientApp_Repo
  value: git://OurProject/ClientApp@main
- name: ClientApp_StageSuffix
  value: "_ClientApp"

pool:
  vmImage: windows-latest

stages:
#BuildAndTestStage WebApi Stage
- template: Build\netcore-build-template.yaml@templates # Template from the templates repository
  parameters:
    Configuration: ${{ parameters.Configuration }}
    Runtime: ${{ parameters.Runtime }}
    SolutionPath: ${{ variables.WebApi_SolutionPath }}
    SolutionName: ${{ variables.WebApi_SolutionName }}
    Repository: ${{ variables.WebApi_Repo }}
    StageSuffix: ${{ variables.WebApi_StageSuffix }}

#BuildAndTestStage ClientApp Stage
- template: Build\netcore-build-template.yaml@templates # Template from the templates repository
  parameters:
    Configuration: ${{ parameters.Configuration }}
    Runtime: ${{ parameters.Runtime }}
    SolutionPath: ${{ variables.ClientApp_SolutionPath }}
    SolutionName: ${{ variables.ClientApp_SolutionName }}
    Repository: ${{ variables.ClientApp_Repo }}
    Test: false
    StageSuffix: ${{ variables.ClientApp_StageSuffix }}
Here, we build two different projects in one pipeline using the same template. To do this, we use StageSuffix, which makes the stage names unique; otherwise, we would get an error.

Also, note that I'm not passing all parameters to the templates. Therefore, the tests will be run and published automatically since these parameters have default values set to true in the template.

So, we've learned how to create templates and pass parameters to them. It's all quite straightforward here, with no surprises. Similarly, we could create templates for the rest of our stages:

However, there is one important thing: further on, we generate information that should be used as parameters for subsequent stages.
Here we come to the real reason for writing this article. Generating parameters in one stage and passing them on to the next stage turned out to be not the easiest task. And here's why: because each stage can, in theory, run on a separate agent, there can be no "shared" memory or global variables that could store values generated at runtime in one stage for use in another stage. However, Microsoft could not deprive DevOps engineers of this capability entirely, given how common the task is, and found perhaps the only working solution within this architecture.

But before we move on to reviewing this solution, let's delve into the approach to variables in YAML pipelines in general.

There are only three kinds of variable expressions:

  • macro,
  • template expression,
  • runtime expression.

Each of them works slightly differently.

A template expression is processed at pipeline compile time. Essentially, the value you defined is pasted into the YAML file wherever you use the variable. You cannot change such variables after the pipeline has started. The syntax for such variables is ${{ variables.var }}.

Macro and runtime expressions are a bit more complicated. Both work at runtime, but macro syntax is expanded just before a task executes, while runtime expressions are intended for use in conditions and variable definitions. Their syntax is $(var) for macro and $[variables.var] for runtime expressions.

For the purpose of this article, this information is sufficient, but if you want to delve deeper into working with variables, you can refer to the documentation provided in the link.
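To make the difference tangible, here is a minimal sketch (the variable name is illustrative) showing all three syntaxes in a single job:

```yaml
variables:
  greeting: hello

steps:
# Template expression: replaced at compile time, before the run starts
- script: echo "${{ variables.greeting }}"
# Macro syntax: expanded just before the task executes
- script: echo "$(greeting)"
# Runtime expression: evaluated at runtime, typically in conditions
- script: echo "condition was true"
  condition: eq(variables['greeting'], 'hello')
```

In this sketch all three steps see the same value, because the variable is static; the differences only start to matter once values are produced at runtime, as we will see below.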

Now, let's talk about output variables. You can find all the details here, but we'll discuss transferring variables between stages to implement our idea.

So, to make a variable visible "outside" the stage, there is a special syntax:

PowerShell:
Write-Host "##vso[task.setvariable variable=<Var name>;isOutput=true]<Var value>"
Bash:
echo "##vso[task.setvariable variable=<Var name>;isOutput=true]<Var value>"

Using this syntax, you can define a variable right at runtime during script execution, which will be visible both within the stage and in other stages.

Moreover, to refer to this variable, there is also a special syntax, depending on where you want to refer from.

If you are using an output variable within the same job, you just need to make a simple reference to the name of the task where this variable was declared:
steps:
- script: echo "##vso[task.setvariable variable=MyVar;isOutput=true]my val"  # this step generates the output variable
  name: ProduceVar  # because we're going to depend on it, we need to name the step
- script: echo $(ProduceVar.MyVar) # this step uses the output variable
In this example, script is simply a shorthand for a task that runs command-line scripts (Bash on Linux and macOS agents, cmd on Windows agents). It replaces the longer definition of a task with parameters.
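For comparison, the script shorthand above could be written out as a full task definition; a rough equivalent on a Windows agent would be:

```yaml
steps:
# shorthand form
- script: echo "##vso[task.setvariable variable=MyVar;isOutput=true]my val"
  name: ProduceVar

# roughly equivalent full task form (CmdLine@2; on Linux agents the
# shorthand runs under Bash instead)
- task: CmdLine@2
  name: ProduceVarFull
  inputs:
    script: echo "##vso[task.setvariable variable=MyVar;isOutput=true]my val"
```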

If you are using an output variable in another job, you need to explicitly define the dependency on the job where the variable is declared:
jobs:
- job: A
  steps:
  # assume that MyTask generates an output variable called "MyVar"
  - script: echo "##vso[task.setvariable variable=MyVar;isOutput=true]my val"
    name: ProduceVar  # because we're going to depend on it, we need to name the step
- job: B
  dependsOn: A
  variables:
    # map the output variable from A into this job
    varFromA: $[ dependencies.A.outputs['ProduceVar.MyVar']]
  steps:
  - script: echo $(varFromA) # this step uses the mapped-in variable
A job has a parameter called dependsOn, which establishes a dependency on Job A. Thanks to this dependency, we can now access the output variable of job A using the syntax $[ dependencies.<job name>.outputs['<task name>.<var name>']]. However, if the dependsOn parameter is not set, the reference to the variable will not work. Also, note that it is necessary to make a mapping through variables inside the job: varFromA: $[ dependencies.A.outputs['ProduceVar.MyVar']]. Then, only $(varFromA) should be used. If you try to make a direct reference through $[dependencies] in the script, it won't work.

If you are using an output variable in another stage, you need to explicitly define the dependency on the stage where the variable is declared:
stages:
- stage: One
  jobs:
  - job: A
    steps:
    - script: echo "##vso[task.setvariable variable=MyVar;isOutput=true]my val"  # this step generates the output variable
      name: ProduceVar  # because we're going to depend on it, we need to name the step

- stage: Two
  dependsOn:
  - One
  jobs:
  - job: B
    variables:
      # map the output variable from A into this job
      varFromA: $[ stageDependencies.One.A.outputs['ProduceVar.MyVar'] ]
    steps:
    - script: echo $(varFromA) # this step uses the mapped-in variable

- stage: Three
  dependsOn:
  - One
  - Two
  jobs:
  - job: C
    variables:
      # map the output variable from A into this job
      varFromA: $[ stageDependencies.One.A.outputs['ProduceVar.MyVar'] ]
    steps:
    - script: echo $(varFromA) # this step uses the mapped-in variable

Here everything happens almost the same as for jobs, but now dependsOn needs to be declared at the stage level, and mapping should be done with a slightly more complex syntax where the stage name is added: $[ stageDependencies.<stage name>.<job name>.outputs['<task name>.<var name>']]. Note that instead of the keyword "dependencies", we now have "stageDependencies". Moreover, the variables section can be located at both the job and stage levels.

And the last peculiarity: these variables only work at runtime! You will get empty values if you try to read them at compile time, even though pipeline validation may pass and the pipeline may start.
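A minimal illustration of this pitfall (the variable name is illustrative): a value set by a script is visible through macro syntax at runtime, but a compile-time read was already fixed before the run started:

```yaml
jobs:
- job: A
  variables:
    myVar: ''   # empty at compile time
  steps:
  - script: echo "##vso[task.setvariable variable=myVar]runtime value"
  # compile-time read: ${{ variables.myVar }} was pasted in as '' before
  # the run started, so this prints an empty string
  - script: echo "compile-time read ${{ variables.myVar }}"
  # runtime read: macro syntax expands just before the task runs,
  # so this prints the value set by the first step
  - script: echo "runtime read $(myVar)"
```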

Now we can finally move on to the template for generating a unique name and password for the virtual machine.

parameters:
- name: VMName # VM name
  type: string

stages:
- stage: "GenerateVmValues"
  variables:
  - name: VMNameInternal
    value: ${{ lower(parameters.VMName) }}$(Get-Date -Format ssff)
  jobs:
    - job: ValuesGenJob
      steps:
        - task: PowerShell@2
          name: ExportVariables
          inputs:
            targetType: 'inline'
            script: |
               $pwd = [System.Web.Security.Membership]::GeneratePassword(15,2)
               Write-Host "##vso[task.setvariable variable=VmName;isoutput=true]$(VMNameInternal)"
               Write-Host "##vso[task.setvariable variable=VmUsername;isoutput=true]admin"
               Write-Host "##vso[task.setvariable variable=VmPassword;isoutput=true]$pwd"
        - task: PowerShell@2
          name: ValuesOutput
          displayName: 'Values Output'
          inputs:
            targetType: 'inline'
            script: |
                Write-Host "VmName $(ExportVariables.VmName)"
                Write-Host "VmUsername $(ExportVariables.VmUsername)"
                Write-Host "VmPassword $(ExportVariables.VmPassword)"

In this template, we have two tasks. In the first one, we generate parameters and create output variables, and in the second one, we display their values on the screen using the first method of referencing output variables within the same job.

We create a unique name by appending four digits of the current time (seconds and hundredths of a second) to the base name: ${{ lower(parameters.VMName) }}$(Get-Date -Format ssff). Note the trick here: Get-Date is not a pipeline variable, so $(Get-Date -Format ssff) survives macro expansion as literal text and is then evaluated by PowerShell as a subexpression inside the inline script. And we generate the password using the [System.Web.Security.Membership]::GeneratePassword method.

Calling the template is no different from what we discussed earlier, but it's worth noting how we pass the parameters generated within the template further:

resources:
  repositories:
    - repository: templates
      type: git
      name: automation-templates
      ref: 'main'

variables:
- name: SolutionName
  value: MySolution
- name: VMName
  value: vm-${{ lower(variables.SolutionName) }}

stages:
#GenerateVmValues Stage
  - template: Deploy\generate-values-for-vm.yaml@templates
    parameters:
VMName: ${{ variables.VMName }}

#CreateAzureVm Stage
  - template: Deploy\create-azure-vm.yaml@templates
    parameters:
      VMName: $[stageDependencies.GenerateVmValues.ValuesGenJob.outputs['ExportVariables.VmName']]
      UserName: $[stageDependencies.GenerateVmValues.ValuesGenJob.outputs['ExportVariables.VmUsername']]
      Password: $[stageDependencies.GenerateVmValues.ValuesGenJob.outputs['ExportVariables.VmPassword']]
      DependsOn:    
       - GenerateVmValues
In the call to the create-azure-vm.yaml template, we make a DependsOn reference to the stage name in the parameter generation template and pass 3 parameters using the third method of referencing output variables, specifying the full path of the reference:

$[stageDependencies.GenerateVmValues.ValuesGenJob.outputs['ExportVariables.VmName']]

And the last point: how to use the parameters generated above within another template.

parameters:
- name: VMName # VM name
  type: string
- name: UserName
  type: string
- name: Password
  type: string
- name: DependsOn
  type: object
  default:

stages:
- stage: "MyStage" 
  # only include the DependsOn parameter if provided
  ${{ if parameters.DependsOn }}:
    dependsOn: '${{ parameters.DependsOn }}'
  # we MUST use a local variable for VM name because this name can be generated in the previous stage.
  variables:
    - name: VMName
      value: ${{ parameters.VMName }}
    - name: VMUsername
      value: ${{ parameters.UserName }}
    - name: VMPassword
      value: ${{ parameters.Password }}
Here, several aspects need attention. First, the DependsOn value is passed through template parameters. This technique allows establishing a dependency on a stage defined in another template.

Also, note that all parameters are redefined in the Variables section at the stage level. If you refer directly to the parameters, it won't work. Only redefining them in the variables section allows the "magic" to work.
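To make the mapping concrete, a hypothetical continuation of this stage might consume the mapped variables with macro syntax (the job and task below are illustrative and not part of the original template):

```yaml
  jobs:
  - job: CreateVmJob
    steps:
    - task: PowerShell@2
      displayName: 'Create VM'
      inputs:
        targetType: 'inline'
        script: |
          # $(VMName) and $(VMUsername) are expanded at runtime, after the
          # previous stage has produced its output variables and they have
          # been mapped into the stage-level variables section
          Write-Host "Creating VM $(VMName) for user $(VMUsername)"
```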

Now we can create simple templates without output parameters, as demonstrated by the universal .NET project builder, and create templates with output parameters, using the output parameters of one template in other templates.

We have the foundation, and the first two steps of our plan are in place. In the next article, we will discuss creating a virtual machine and, most importantly, configuring it for remote connection via PowerShell.