Azure DevOps Pipelines YAML schema reference
The YAML schema reference for Azure Pipelines is a detailed reference for YAML
pipelines that lists all supported YAML syntax and their available options.
To create a YAML pipeline, start with the pipeline definition. For more information about
building YAML pipelines, see Customize your pipeline.
The YAML schema reference does not cover tasks. For more information about tasks, see
the Azure Pipelines tasks index.
Definitions
pipeline
A pipeline is one or more stages that describe a CI/CD process.
extends
Extends a pipeline using a template.
jobs
Specifies the jobs that make up the work of a stage.
jobs.deployment
A deployment job is a special type of job. It's a collection of steps to run sequentially
against the environment.
jobs.deployment.environment
Target environment name and optionally a resource name to record the deployment
history; format: environment-name.resource-name.
jobs.deployment.strategy
Execution strategy for this deployment.
jobs.deployment.strategy.canary
Canary Deployment strategy.
jobs.deployment.strategy.rolling
Rolling Deployment strategy.
jobs.deployment.strategy.runOnce
RunOnce Deployment strategy.
jobs.job
A job is a collection of steps run by an agent or on a server.
jobs.job.container
Container resource name.
jobs.job.strategy
Execution strategy for this job.
jobs.job.uses
Any resources required by this job that are not already referenced.
jobs.template
A set of jobs defined in a template.
parameters
Specifies the runtime parameters passed to a pipeline.
parameters.parameter
Pipeline template parameters.
pool
Which pool to use for a job of the pipeline.
pool.demands
Demands (for a private pool).
pr
Pull request trigger.
resources
Resources specifies builds, repositories, pipelines, and other resources used by the
pipeline.
resources.builds
List of build resources referenced by the pipeline.
resources.builds.build
A build resource used to reference artifacts from a run.
resources.containers
List of container images.
resources.containers.container
A container resource used to reference a container image.
resources.containers.container.trigger
Specify none to disable, true to trigger on all image tags, or use the full syntax as
described in the following examples.
resources.packages
List of package resources.
resources.packages.package
A package resource used to reference a NuGet or npm GitHub package.
resources.pipelines
List of pipeline resources.
resources.pipelines.pipeline
A pipeline resource.
resources.pipelines.pipeline.trigger
Specify none to disable, true to include all branches, or use the full syntax as described
in the following examples.
resources.pipelines.pipeline.trigger.branches
Branches to include or exclude for triggering a run.
resources.repositories
List of repository resources.
resources.repositories.repository
A repository resource is used to reference an additional repository in your pipeline.
resources.webhooks
List of webhooks.
resources.webhooks.webhook
A webhook resource enables you to integrate your pipeline with an external service to
automate the workflow.
resources.webhooks.webhook.filters
List of trigger filters.
resources.webhooks.webhook.filters.filter
Webhook resource trigger filter.
schedules
The schedules list specifies the scheduled triggers for the pipeline.
schedules.cron
A scheduled trigger specifies a schedule on which branches are built.
stages
Stages are a collection of related jobs.
stages.stage
A stage is a collection of related jobs.
stages.template
You can define a set of stages in one file and use it multiple times in other files.
steps
Steps are a linear sequence of operations that make up a job.
steps.bash
Runs a script in Bash on Windows, macOS, and Linux.
steps.checkout
Configure how the pipeline checks out source code.
steps.download
Downloads artifacts associated with the current run or from another Azure Pipeline that
is associated as a pipeline resource.
steps.downloadBuild
Downloads build artifacts.
steps.getPackage
Downloads a package from a package management feed in Azure Artifacts or Azure
DevOps Server.
steps.powershell
Runs a script using either Windows PowerShell (on Windows) or pwsh (Linux and
macOS).
steps.publish
Publishes (uploads) a file or folder as a pipeline artifact that other jobs and pipelines can
consume.
steps.pwsh
Runs a script in PowerShell Core on Windows, macOS, and Linux.
steps.reviewApp
Creates a resource dynamically under a deploy phase provider.
steps.script
Runs a script using cmd.exe on Windows and Bash on other platforms.
steps.task
Runs a task.
steps.template
Define a set of steps in one file and use it multiple times in another file.
target
Tasks run in an execution context, which is either the agent host or a container.
target.settableVariables
Restrictions on which variables can be set.
trigger
Continuous integration (push) trigger.
variables
Define variables using name/value pairs.
variables.group
Reference variables from a variable group.
variables.name
Define variables using name and full syntax.
variables.template
Define variables in a template.
Supporting definitions
Note
Supporting definitions are not intended for use directly in a pipeline. Supporting
definitions are used only as part of other definitions, and are included here for
reference.
deployHook
Used to run steps that deploy your application.
includeExcludeFilters
Lists of items to include or exclude.
includeExcludeStringFilters
Items to include or exclude.
mountReadOnly
Volumes to mount read-only; the default is all false.
onFailureHook
Used to run steps for rollback actions or clean-up.
onSuccessHook
Used to run steps for rollback actions or clean-up.
onSuccessOrFailureHook
Used to run steps for rollback actions or clean-up.
postRouteTrafficHook
Used to run the steps after the traffic is routed. Typically, these tasks monitor the health
of the updated version for a defined interval.
preDeployHook
Used to run steps that initialize resources before application deployment starts.
routeTrafficHook
Used to run steps that serve the traffic to the updated version.
workspace
Workspace options on the agent.
Here are the syntax conventions used in the YAML schema reference.
See also
This reference covers the schema of an Azure Pipelines YAML file. To learn the basics of
YAML, see Learn YAML in Y Minutes. Azure Pipelines doesn't support all YAML
features. Unsupported features include anchors, complex keys, and sets. Also, unlike
standard YAML, Azure Pipelines depends on seeing stage , job , task , or a task shortcut
like script as the first key in a mapping.
pipeline definition
Article • 07/10/2023
Remarks
A pipeline is one or more stages that describe a CI/CD process. Stages are the major
divisions in a pipeline. The stages "Build this app," "Run these tests," and "Deploy to
preproduction" are good examples.
A stage is one or more jobs, which are units of work assignable to the same machine.
You can arrange both stages and jobs into dependency graphs. Examples include "Run
this stage before that one" and "This job depends on the output of that job."
A job is a linear series of steps. Steps can be tasks, scripts, or references to external
templates.
Pipeline
Stage A
Job 1
Step 1.1
Step 1.2
...
Job 2
Step 2.1
Step 2.2
...
Stage B
...
Simple pipelines don't require all of these levels. For example, in a single-job build, you
can omit the containers for stages and jobs because there are only steps. And because
many options shown in this article aren't required and have good defaults, your YAML
definitions are unlikely to include all of them.
If you have a single stage, you can omit the stages keyword and directly specify the
jobs keyword:
YAML
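The code block for this form is missing above; here's a minimal sketch of a single-stage pipeline that specifies jobs directly (the job names are illustrative):

```yaml
# Single implicit stage: the stages keyword is omitted.
trigger:
- main
pool:
  vmImage: ubuntu-latest
jobs:
- job: Build
  steps:
  - script: echo Building
- job: Test
  steps:
  - script: echo Testing
```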
If you have a single stage and a single job, you can omit the stages and jobs keywords
and directly specify the steps keyword:
YAML
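The code block for this form is missing above; here's a minimal sketch of a pipeline with a single implicit stage and job, specifying only steps:

```yaml
# Single implicit stage and job: only steps is specified.
trigger:
- main
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Hello, world!
```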
Use the name property to configure the pipeline run number. For more information, see
Configure run or build numbers.
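For illustration, a common run number format built from the date and a daily revision counter (the exact format string is an example, not a requirement):

```yaml
# Produces run numbers like 20230710.1, 20230710.2, ...
name: $(Date:yyyyMMdd)$(Rev:.r)
```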
pipeline: stages
Pipeline with stages.
YAML
stages: [ stage | template ] # Required. Stages are groups of jobs that can
run without human intervention.
pool: string | pool # Pool where jobs in this pipeline will run unless
otherwise specified.
name: string # Pipeline run number.
appendCommitMessageToRunName: boolean # Append the commit message to the
build number. The default is true.
trigger: none | trigger | [ string ] # Continuous integration triggers.
parameters: [ parameter ] # Pipeline template parameters.
pr: none | pr | [ string ] # Pull request triggers.
schedules: [ cron ] # Scheduled triggers.
resources: # Containers and repositories used in the build.
builds: [ build ] # List of build resources referenced by the pipeline.
containers: [ container ] # List of container images.
pipelines: [ pipeline ] # List of pipeline resources.
repositories: [ repository ] # List of repository resources.
webhooks: [ webhook ] # List of webhooks.
packages: [ package ] # List of package resources.
variables: variables | [ variable ] # Variables for this pipeline.
lockBehavior: string # Behavior lock requests from this stage should exhibit
in relation to other exclusive lock requests.
Properties
stages stages. Required.
Stages are groups of jobs that can run without human intervention.
pool pool.
Pool where jobs in this pipeline will run unless otherwise specified.
name string.
appendCommitMessageToRunName boolean.
Append the commit message to the build number. The default is true.
trigger trigger.
parameters parameters.
pr pr.
Pull request triggers.
schedules schedules.
Scheduled triggers.
resources resources.
variables variables.
lockBehavior string.
Behavior lock requests from this stage should exhibit in relation to other exclusive lock
requests. sequential | runLatest.
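As a sketch, lockBehavior can be set at the pipeline root (sequential queues runs behind other exclusive-lock runs; runLatest runs only the latest queued run):

```yaml
# Pipeline-level exclusive lock behavior.
lockBehavior: sequential
```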
Examples
YAML
trigger:
- main
pool:
vmImage: ubuntu-latest
stages:
- stage: CI
jobs:
- job: CIWork
steps:
- script: "Do CI work"
- stage: Test
jobs:
- job: TestWork
steps:
- script: "Do test work"
pipeline: extends
Pipeline that extends a template.
YAML
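The code block for this form is missing above; here's a minimal sketch of a pipeline that extends a template (the template path and parameter are illustrative):

```yaml
trigger:
- main
extends:
  template: pipeline-template.yml  # illustrative template path
  parameters:
    environment: dev               # illustrative parameter
```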
Properties
extends extends. Required.
Extends a template.
pool pool.
Pool where jobs in this pipeline will run unless otherwise specified.
name string.
appendCommitMessageToRunName boolean.
Append the commit message to the build number. The default is true.
trigger trigger.
parameters parameters.
pr pr.
schedules schedules.
Scheduled triggers.
resources resources.
variables variables.
lockBehavior string.
Behavior lock requests from this stage should exhibit in relation to other exclusive lock
requests. sequential | runLatest.
pipeline: jobs
Pipeline with jobs and one implicit stage.
YAML
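The schema block is missing above; reconstructed from the property list that follows, it has roughly this shape:

```yaml
jobs: [ job | deployment | template ] # Required. Jobs which can be assigned
                                      # to a single agent or server.
pool: string | pool # Pool where jobs in this pipeline will run unless
                    # otherwise specified.
name: string # Pipeline run number.
appendCommitMessageToRunName: boolean # Append the commit message to the
                                      # build number. The default is true.
trigger: none | trigger | [ string ] # Continuous integration triggers.
parameters: [ parameter ] # Pipeline template parameters.
pr: none | pr | [ string ] # Pull request triggers.
schedules: [ cron ] # Scheduled triggers.
resources: # Containers and repositories used in the build.
variables: variables | [ variable ] # Variables for this pipeline.
lockBehavior: string # Exclusive lock behavior for this pipeline.
```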
Properties
jobs jobs. Required.
Jobs represent units of work which can be assigned to a single agent or server.
pool pool.
Pool where jobs in this pipeline will run unless otherwise specified.
name string.
appendCommitMessageToRunName boolean.
Append the commit message to the build number. The default is true.
trigger trigger.
parameters parameters.
pr pr.
schedules schedules.
Scheduled triggers.
resources resources.
variables variables.
lockBehavior string.
Behavior lock requests from this stage should exhibit in relation to other exclusive lock
requests. sequential | runLatest.
Examples
YAML
trigger:
- main
pool:
vmImage: ubuntu-latest
jobs:
- job: PreWork
steps:
- script: "Do pre-work"
- job: PostWork
pool: windows-latest
steps:
- script: "Do post-work using a different hosted image"
pipeline: steps
Pipeline with steps and one implicit job.
YAML
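The schema block is missing above; reconstructed from the property list that follows, it has roughly this shape:

```yaml
steps: [ task | script | powershell | pwsh | bash | checkout | download |
  downloadBuild | getPackage | publish | template | reviewApp ] # Required.
strategy: strategy # Execution strategy for this job.
continueOnError: string # Continue running even on failure?
pool: string | pool # Pool where jobs in this pipeline will run unless
                    # otherwise specified.
container: string | container # Container resource name.
workspace: # Workspace options on the agent.
appendCommitMessageToRunName: boolean
trigger: none | trigger | [ string ] # Continuous integration triggers.
parameters: [ parameter ] # Pipeline template parameters.
pr: none | pr | [ string ] # Pull request triggers.
schedules: [ cron ] # Scheduled triggers.
resources: # Containers and repositories used in the build.
variables: variables | [ variable ] # Variables for this pipeline.
lockBehavior: string # Exclusive lock behavior for this pipeline.
```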
Properties
steps steps. Required.
strategy jobs.job.strategy.
continueOnError string.
pool pool.
Pool where jobs in this pipeline will run unless otherwise specified.
container jobs.job.container.
workspace workspace.
appendCommitMessageToRunName boolean.
Append the commit message to the build number. The default is true.
trigger trigger.
parameters parameters.
pr pr.
schedules schedules.
Scheduled triggers.
resources resources.
variables variables.
lockBehavior string.
Behavior lock requests from this stage should exhibit in relation to other exclusive lock
requests. sequential | runLatest.
Examples
YAML
trigger:
- main
pool:
vmImage: ubuntu-latest
steps:
- script: "Hello world!"
See also
Pipelines with multiple jobs
Triggers
Variables
Build number formats
extends definition
YAML
extends:
template: string # The template referenced by the pipeline to extend.
parameters: # Parameters used in the extend.
Properties
template string.
Examples
Templates and their parameters are turned into constants before the pipeline runs.
Template parameters provide type safety to input parameters. In this example, templates
restrict which pools can be used in a pipeline by offering an enumeration of possible
options rather than a freeform string.
YAML
# template.yml
parameters:
- name: userpool
type: string
default: Azure Pipelines
values:
- Azure Pipelines
- private-pool-1
- private-pool-2
# azure-pipelines.yml
extends:
template: template.yml
parameters:
userpool: private-pool-1
See also
Template types & usage
Security through templates
jobs definition
YAML
jobs: [ job | deployment | template ] # Specifies the jobs that make up the
work of a stage.
List types
Type Description
jobs.deployment A deployment job is a special type of job. It's a collection of steps to run
sequentially against the environment.
Remarks
A job is a collection of steps run by an agent or on a server. Jobs can run conditionally
and might depend on earlier jobs.
Note
If you have only one stage and one job, you can use single-job syntax as a shorter
way to describe the steps to run.
Examples
YAML
jobs:
- job: MyJob
displayName: My First Job
continueOnError: true
workspace:
clean: outputs
steps:
- script: echo My first job
See also
For more information about uses , see Limit job authorization scope to referenced
Azure DevOps repositories. For more information about workspaces, including
clean options, see the workspace topic in Jobs.
Learn more about variables, steps, pools, and server jobs.
jobs.deployment definition
A deployment job is a special type of job. It's a collection of steps to run sequentially
against the environment.
YAML
jobs:
- deployment: string # Required as first property. Name of the deployment
job, A-Z, a-z, 0-9, and underscore. The word deploy is a keyword and is
unsupported as the deployment name.
displayName: string # Human-readable name for the deployment.
dependsOn: string | [ string ] # Any jobs which must complete before this
one.
condition: string # Evaluate this condition expression to determine
whether to run this deployment.
continueOnError: string # Continue running even on failure?
timeoutInMinutes: string # Time to wait for this job to complete before
the server kills it.
cancelTimeoutInMinutes: string # Time to wait for the job to cancel before
forcibly terminating it.
variables: variables | [ variable ] # Deployment-specific variables.
pool: string | pool # Pool where this job will run.
environment: string | environment # Target environment name and optionally
a resource name to record the deployment history; format: environment-
name.resource-name.
strategy: strategy # Execution strategy for this deployment.
workspace: # Workspace options on the agent.
clean: string # Which parts of the workspace should be scorched before
fetching.
uses: # Any resources required by this job that are not already
referenced.
repositories: [ string ] # Repository references.
pools: [ string ] # Pool references.
container: string | container # Container resource name.
services: # Container resources to run as a service container.
string: string # Name/value pairs
templateContext: # Deployment related information passed from a pipeline
when extending a template.
Properties
deployment string. Required as first property.
Name of the deployment job, A-Z, a-z, 0-9, and underscore. The word deploy is a
keyword and is unsupported as the deployment name.
displayName string.
condition string.
continueOnError string.
timeoutInMinutes string.
Time to wait for this job to complete before the server kills it.
cancelTimeoutInMinutes string.
Time to wait for the job to cancel before forcibly terminating it.
variables variables.
Deployment-specific variables.
pool pool.
environment jobs.deployment.environment.
Target environment name and optionally a resource name to record the deployment
history; format: environment-name.resource-name.
strategy jobs.deployment.strategy.
Execution strategy for this deployment.
workspace workspace.
uses jobs.job.uses.
Any resources required by this job that are not already referenced.
container jobs.job.container.
templateContext templateContext.
Deployment related information passed from a pipeline when extending a template. For
more information about templateContext , see Extended YAML Pipelines templates can
now be passed context information for stages, jobs, and deployments and Templates -
Use templateContext to pass properties to templates.
Remarks
In YAML pipelines, the pipelines team recommends that you put your deployment steps
in a deployment job.
Examples
YAML
jobs:
# track deployments on the environment
- deployment: DeployWeb
displayName: deploy Web App
pool:
vmImage: ubuntu-latest
# creates an environment if it doesn't exist
environment: 'smarthotel-dev'
strategy:
# default deployment strategy, more coming...
runOnce:
deploy:
steps:
- script: echo my first deployment
jobs.deployment.environment definition
The environment keyword specifies the environment or its resource that is targeted by a
deployment job of the pipeline.
Remarks
An environment also holds information about the deployment strategy for running the
steps defined inside the job.
You can reduce the deployment target's scope to a particular resource within the
environment as shown here:
YAML
environment: 'smarthotel-dev.bookings'
strategy:
runOnce:
deploy:
steps:
- task: KubernetesManifest@0
displayName: Deploy to Kubernetes cluster
inputs:
action: deploy
namespace: $(k8sNamespace)
manifests: $(System.ArtifactsDirectory)/manifests/*
imagePullSecrets: $(imagePullSecret)
containers: $(containerRegistry)/$(imageRepository):$(tag)
# value for kubernetesServiceConnection input automatically passed
down to task by environment.resource input
environment: string
To specify an environment by name without using any additional properties, use the
following syntax.
YAML
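The code block for this form is missing above; assuming only a name is needed, it's a single scalar:

```yaml
environment: environmentName
```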
environment string.
Examples
YAML
environment: environmentName.resourceName
strategy: # deployment strategy
runOnce: # default strategy
deploy:
steps:
- script: echo Hello world
YAML
environment:
name: string # Name of environment.
resourceName: string # Name of resource.
resourceId: string # Id of resource.
resourceType: string # Type of environment resource.
tags: string # List of tag filters.
Properties
name string.
Name of environment.
resourceName string.
Name of resource.
resourceId string.
Id of resource.
resourceType string.
tags string.
Examples
The full syntax is:
YAML
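The full-syntax block is shown earlier in this article; as a concrete sketch (the environment and resource names are illustrative):

```yaml
environment:
  name: smarthotel-dev      # illustrative environment name
  resourceName: bookings    # illustrative resource name
  resourceType: Kubernetes
```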
If you specify an environment or one of its resources but don't need to specify other
properties, you can shorten the syntax to:
YAML
environment: environmentName.resourceName
strategy: # deployment strategy
runOnce: # default strategy
deploy:
steps:
- script: echo Hello world
You can reduce the deployment target's scope to a particular resource within the
environment as shown here:
YAML
environment: 'smarthotel-dev.bookings'
strategy:
runOnce:
deploy:
steps:
- task: KubernetesManifest@0
displayName: Deploy to Kubernetes cluster
inputs:
action: deploy
namespace: $(k8sNamespace)
manifests: $(System.ArtifactsDirectory)/manifests/*
imagePullSecrets: $(imagePullSecret)
containers: $(containerRegistry)/$(imageRepository):$(tag)
# value for kubernetesServiceConnection input automatically passed
down to task by environment.resource input
See also
Create and target an environment
jobs.deployment.strategy definition
Remarks
When you're deploying application updates, it's important that the technique you use to
deliver the update will:
Enable initialization.
Deploy the update.
Route traffic to the updated version.
Test the updated version after routing traffic.
In case of failure, run steps to restore to the last known good version.
We achieve this by using lifecycle hooks that can run steps during deployment. Each of
the lifecycle hooks resolves into an agent job or a server job (or a container or validation
job in the future), depending on the pool attribute. By default, the lifecycle hooks will
inherit the pool specified by the deployment job.
If you are using self-hosted agents, you can use the workspace clean options to clean
your deployment workspace.
YAML
jobs:
- deployment: deploy
pool:
vmImage: ubuntu-latest
workspace:
clean: all
environment: staging
strategy: runOnce
The runOnce deployment strategy rolls out changes by executing each of its steps one
time.
YAML
strategy:
runOnce: # RunOnce Deployment strategy.
preDeploy: # Pre deploy hook for runOnce deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where pre deploy steps will run.
deploy: # Deploy hook for runOnce deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where deploy steps will run.
routeTraffic: # Route traffic hook for runOnce deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where route traffic steps will run.
postRouteTraffic: # Post route traffic hook for runOnce deployment
strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where post route traffic steps will run.
on: # On success or failure hook for runOnce deployment strategy.
failure: # Runs on failure of any step.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where post on failure steps will run.
success: # Runs on success of all of the steps.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where on success steps will run.
Properties
runOnce jobs.deployment.strategy.runOnce.
Remarks
runOnce is the simplest deployment strategy, wherein all the lifecycle hooks, namely
preDeploy , deploy , routeTraffic , and postRouteTraffic , are executed once. Then,
either on: success or on: failure is executed.
strategy: rolling
A rolling deployment replaces instances of the previous version of an application with
instances of the new version of the application on a fixed set of virtual machines (rolling
set) in each iteration.
YAML
strategy:
rolling: # Rolling Deployment strategy.
maxParallel: string # Maximum number of jobs running in parallel.
preDeploy: # Pre deploy hook for rolling deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where pre deploy steps will run.
deploy: # Deploy hook for rolling deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where deploy steps will run.
routeTraffic: # Route traffic hook for rolling deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where route traffic steps will run.
postRouteTraffic: # Post route traffic hook for rolling deployment
strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where post route traffic steps will run.
on: # On success or failure hook for rolling deployment strategy.
failure: # Runs on failure of any step.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where post on failure steps will run.
success: # Runs on success of all of the steps.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where on success steps will run.
Properties
rolling jobs.deployment.strategy.rolling.
strategy: canary
Canary deployment strategy rolls out changes to a small subset of servers.
YAML
strategy:
canary: # Canary Deployment strategy.
increments: [ string ] # Maximum batch size for deployment.
preDeploy: # Pre deploy hook for canary deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where pre deploy steps will run.
deploy: # Deploy hook for canary deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where deploy steps will run.
routeTraffic: # Route traffic hook for canary deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where route traffic steps will run.
postRouteTraffic: # Post route traffic hook for canary deployment
strategy.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where post route traffic steps will run.
on: # On success or failure hook for canary deployment strategy.
failure: # Runs on failure of any step.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where post on failure steps will run.
success: # Runs on success of all of the steps.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where on success steps will run.
Properties
canary jobs.deployment.strategy.canary.
Canary Deployment strategy.
Remarks
Canary deployment strategy is an advanced deployment strategy that helps mitigate the
risk involved in rolling out new versions of applications. By using this strategy, you can
roll out the changes to a small subset of servers first. As you gain more confidence in
the new version, you can release it to more servers in your infrastructure and route more
traffic to it.
See also
Deployment jobs
jobs.deployment.strategy.canary definition
YAML
canary:
increments: [ string ] # Maximum batch size for deployment.
preDeploy: # Pre deploy hook for canary deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout | download
| downloadBuild | getPackage | publish | template | reviewApp ] # A list of
steps to run.
pool: string | pool # Pool where pre deploy steps will run.
deploy: # Deploy hook for canary deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout | download
| downloadBuild | getPackage | publish | template | reviewApp ] # A list of
steps to run.
pool: string | pool # Pool where deploy steps will run.
routeTraffic: # Route traffic hook for canary deployment strategy.
steps: [ task | script | powershell | pwsh | bash | checkout | download
| downloadBuild | getPackage | publish | template | reviewApp ] # A list of
steps to run.
pool: string | pool # Pool where route traffic steps will run.
postRouteTraffic: # Post route traffic hook for canary deployment
strategy.
steps: [ task | script | powershell | pwsh | bash | checkout | download
| downloadBuild | getPackage | publish | template | reviewApp ] # A list of
steps to run.
pool: string | pool # Pool where post route traffic steps will run.
on: # On success or failure hook for canary deployment strategy.
failure: # Runs on failure of any step.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where post on failure steps will run.
success: # Runs on success of all of the steps.
steps: [ task | script | powershell | pwsh | bash | checkout |
download | downloadBuild | getPackage | publish | template | reviewApp ] # A
list of steps to run.
pool: string | pool # Pool where on success steps will run.
Properties
increments string list.
preDeploy preDeployHook.
deploy deployHook.
routeTraffic routeTrafficHook.
postRouteTraffic postRouteTrafficHook.
on onSuccessOrFailureHook.
Remarks
Canary deployment strategy is an advanced deployment strategy that helps mitigate the
risk involved in rolling out new versions of applications. By using this strategy, you can
roll out the changes to a small subset of servers first. As you gain more confidence in
the new version, you can release it to more servers in your infrastructure and route more
traffic to it.
Canary deployment strategy supports the preDeploy lifecycle hook (executed once) and
iterates with the deploy , routeTraffic , and postRouteTraffic lifecycle hooks. It then
exits with either the success or failure hook.
preDeploy : Used to run steps that initialize resources before application deployment
starts.
deploy : Used to run steps that deploy your application. Download artifact task will be
auto injected only in the deploy hook for deployment jobs. To stop downloading
artifacts, use - download: none or choose specific artifacts to download by specifying
Download Pipeline Artifact task.
routeTraffic : Used to run steps that serve the traffic to the updated version.
postRouteTraffic : Used to run the steps after the traffic is routed. Typically, these tasks
monitor the health of the updated version for a defined interval.
on: failure or on: success : Used to run steps for rollback actions or clean-up.
Examples
In the following example, the canary strategy for AKS will first deploy the changes with
10 percent pods, followed by 20 percent, while monitoring the health during
postRouteTraffic . If all goes well, it will promote to 100 percent.
YAML
jobs:
- deployment:
  environment: smarthotel-dev.bookings
  pool:
    name: smarthotel-devPool
  strategy:
    canary:
      increments: [10,20]
      preDeploy:
        steps:
        - script: initialize, cleanup....
      deploy:
        steps:
        - script: echo deploy updates...
        - task: KubernetesManifest@0
          inputs:
            action: $(strategy.action)
            namespace: 'default'
            strategy: $(strategy.name)
            percentage: $(strategy.increment)
            manifests: 'manifest.yml'
      postRouteTraffic:
        pool: server
        steps:
        - script: echo monitor application health...
      on:
        failure:
          steps:
          - script: echo clean-up, rollback...
        success:
          steps:
          - script: echo checks passed, notify...
See also
Deployment jobs
jobs.deployment.strategy.rolling
definition
Article • 07/10/2023
YAML
rolling:
  maxParallel: string # Maximum number of jobs running in parallel.
  preDeploy: # Pre deploy hook for rolling deployment strategy.
    steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
    pool: string | pool # Pool where pre deploy steps will run.
  deploy: # Deploy hook for rolling deployment strategy.
    steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
    pool: string | pool # Pool where deploy steps will run.
  routeTraffic: # Route traffic hook for rolling deployment strategy.
    steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
    pool: string | pool # Pool where route traffic steps will run.
  postRouteTraffic: # Post route traffic hook for rolling deployment strategy.
    steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
    pool: string | pool # Pool where post route traffic steps will run.
  on: # On success or failure hook for rolling deployment strategy.
    failure: # Runs on failure of any step.
      steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
      pool: string | pool # Pool where on failure steps will run.
    success: # Runs on success of all of the steps.
      steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
      pool: string | pool # Pool where on success steps will run.
preDeploy preDeployHook. Pre deploy hook for rolling deployment strategy.
deploy deployHook. Deploy hook for rolling deployment strategy.
routeTraffic routeTrafficHook. Route traffic hook for rolling deployment strategy.
postRouteTraffic postRouteTrafficHook. Post route traffic hook for rolling deployment strategy.
on onSuccessOrFailureHook. On success or failure hook for rolling deployment strategy.
Remarks
Azure Pipelines currently supports the rolling strategy only for VM resources.
For example, a rolling deployment typically waits for deployments on each set of virtual
machines to complete before proceeding to the next set of deployments. You could do
a health check after each iteration and if a significant issue occurs, the rolling
deployment can be stopped.
Rolling deployments can be configured by specifying the keyword rolling: under the
strategy: node. The strategy.name variable is available in this strategy block, which
takes the name of the strategy, in this case, rolling.
All the lifecycle hooks are supported and lifecycle hook jobs are created to run on each
VM.
preDeploy , deploy , routeTraffic , and postRouteTraffic are executed once per batch
size defined by maxParallel . Then, either on: success or on: failure is executed.
With maxParallel: <# or % of VMs> , you can control the number/percentage of virtual
machine targets to deploy to in parallel. This ensures that the app is running on these
machines and is capable of handling requests while the deployment is taking place on
the rest of the machines, which reduces overall downtime.
7 Note
There are a few known gaps in this feature. For example, when you retry a stage, it
re-runs the deployment on all VMs, not just the failed targets.
preDeploy : Used to run the steps that initialize resources before application deployment starts.
deploy : Used to run steps that deploy your application. The Download artifact task is auto-injected only in the deploy hook for deployment jobs. To stop downloading artifacts, use - download: none or choose specific artifacts to download by specifying the Download Pipeline Artifact task.
routeTraffic : Used to run steps that serve the traffic to the updated version.
postRouteTraffic : Used to run the steps after the traffic is routed. Typically, these tasks monitor the health of the updated version.
on: failure or on: success : Used to run steps for rollback actions or clean-up.
Examples
The following rolling strategy example for VMs updates up to five targets in each
iteration. maxParallel will determine the number of targets that can be deployed to, in
parallel. The selection accounts for absolute number or percentage of targets that must
remain available at any time excluding the targets that are being deployed to. It is also
used to determine the success and failure conditions during deployment.
YAML
jobs:
- deployment: VMDeploy
  displayName: web
  environment:
    name: smarthotel-dev
    resourceType: VirtualMachine
  strategy:
    rolling:
      maxParallel: 5 # for percentages, mention as x%
      preDeploy:
        steps:
        - download: current
          artifact: drop
        - script: echo initialize, cleanup, backup, install certs
      deploy:
        steps:
        - task: IISWebAppDeploymentOnMachineGroup@0
          displayName: 'Deploy application to Website'
          inputs:
            WebSiteName: 'Default Web Site'
            Package: '$(Pipeline.Workspace)/drop/**/*.zip'
      routeTraffic:
        steps:
        - script: echo routing traffic
      postRouteTraffic:
        steps:
        - script: echo health check post-route traffic
      on:
        failure:
          steps:
          - script: echo Restore from backup! This is on failure
        success:
          steps:
          - script: echo Notify! This is on success
See also
Deployment jobs
jobs.deployment.strategy.runOnce
definition
Article • 07/10/2023
The runOnce deployment strategy rolls out changes by executing each of its steps one
time.
YAML
runOnce:
  preDeploy: # Pre deploy hook for runOnce deployment strategy.
    steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
    pool: string | pool # Pool where pre deploy steps will run.
  deploy: # Deploy hook for runOnce deployment strategy.
    steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
    pool: string | pool # Pool where deploy steps will run.
  routeTraffic: # Route traffic hook for runOnce deployment strategy.
    steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
    pool: string | pool # Pool where route traffic steps will run.
  postRouteTraffic: # Post route traffic hook for runOnce deployment strategy.
    steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
    pool: string | pool # Pool where post route traffic steps will run.
  on: # On success or failure hook for runOnce deployment strategy.
    failure: # Runs on failure of any step.
      steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
      pool: string | pool # Pool where on failure steps will run.
    success: # Runs on success of all of the steps.
      steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
      pool: string | pool # Pool where on success steps will run.
Properties
preDeploy preDeployHook. Pre deploy hook for runOnce deployment strategy.
deploy deployHook. Deploy hook for runOnce deployment strategy.
routeTraffic routeTrafficHook. Route traffic hook for runOnce deployment strategy.
postRouteTraffic postRouteTrafficHook. Post route traffic hook for runOnce deployment strategy.
on onSuccessOrFailureHook. On success or failure hook for runOnce deployment strategy.
Remarks
runOnce is the simplest deployment strategy, wherein all the lifecycle hooks, namely
preDeploy , deploy , routeTraffic , and postRouteTraffic , are executed once. Then, either
on: success or on: failure is executed.
preDeploy : Used to run the steps that initialize resources before application deployment starts.
deploy : Used to run steps that deploy your application. The Download artifact task is auto-injected only in the deploy hook for deployment jobs. To stop downloading artifacts, use - download: none or choose specific artifacts to download by specifying the Download Pipeline Artifact task.
routeTraffic : Used to run steps that serve the traffic to the updated version.
postRouteTraffic : Used to run the steps after the traffic is routed. Typically, these tasks monitor the health of the updated version.
on: failure or on: success : Used to run steps for rollback actions or clean-up.
Examples
The following example YAML snippet showcases a simple use of a deployment job by
using the runOnce deployment strategy. The example includes a checkout step.
YAML
jobs:
# Track deployments on the environment.
- deployment: DeployWeb
  displayName: deploy Web App
  pool:
    vmImage: ubuntu-latest
  # Creates an environment if it doesn't exist.
  environment: 'smarthotel-dev'
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
        - script: echo my first deployment
With each run of this job, deployment history is recorded against the smarthotel-dev
environment.
7 Note
It's also possible to create an environment with empty resources and use that
as an abstract shell to record deployment history, as shown in the previous
example.
The next example demonstrates how a pipeline can refer both an environment and a
resource to be used as the target for a deployment job.
YAML
jobs:
- deployment: DeployWeb
  displayName: deploy Web App
  pool:
    vmImage: ubuntu-latest
  # Records deployment against bookings resource - Kubernetes namespace.
  environment: 'smarthotel-dev.bookings'
  strategy:
    runOnce:
      deploy:
        steps:
        # No need to explicitly pass the connection details.
        - task: KubernetesManifest@0
          displayName: Deploy to Kubernetes cluster
          inputs:
            action: deploy
            namespace: $(k8sNamespace)
            manifests: |
              $(System.ArtifactsDirectory)/manifests/*
            imagePullSecrets: |
              $(imagePullSecret)
            containers: |
              $(containerRegistry)/$(imageRepository):$(tag)
See also
Deployment jobs
jobs.job definition
Article • 07/10/2023
YAML
jobs:
- job: string # Required as first property. ID of the job.
  displayName: string # Human-readable name for the job.
  dependsOn: string | [ string ] # Any jobs which must complete before this one.
  condition: string # Evaluate this condition expression to determine whether to run this job.
  continueOnError: string # Continue running even on failure?
  timeoutInMinutes: string # Time to wait for this job to complete before the server kills it.
  cancelTimeoutInMinutes: string # Time to wait for the job to cancel before forcibly terminating it.
  variables: variables | [ variable ] # Job-specific variables.
  strategy: strategy # Execution strategy for this job.
  pool: string | pool # Pool where this job will run.
  container: string | container # Container resource name.
  services: # Container resources to run as a service container.
    string: string # Name/value pairs
  workspace: # Workspace options on the agent.
    clean: string # Which parts of the workspace should be scorched before fetching.
  uses: # Any resources required by this job that are not already referenced.
    repositories: [ string ] # Repository references.
    pools: [ string ] # Pool references.
  steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
  templateContext: # Job related information passed from a pipeline when extending a template.
Properties
job string. Required as first property.
ID of the job. Acceptable values: Valid names may only contain alphanumeric characters
and '_' and may not start with a number.
displayName string.
Human-readable name for the job.
dependsOn string | string list.
Any jobs which must complete before this one.
condition string.
Evaluate this condition expression to determine whether to run this job.
continueOnError string.
Continue running even on failure?
timeoutInMinutes string.
Time to wait for this job to complete before the server kills it.
cancelTimeoutInMinutes string.
Time to wait for the job to cancel before forcibly terminating it.
variables variables.
Job-specific variables.
strategy jobs.job.strategy.
Execution strategy for this job.
pool pool.
Pool where this job will run.
container jobs.job.container.
Container resource name.
services string dictionary.
Container resources to run as a service container.
workspace workspace.
Workspace options on the agent. For more information about workspaces, including
clean options, see the workspace topic in Jobs.
uses jobs.job.uses.
Any resources required by this job that are not already referenced. For more information
about uses , see Limit job authorization scope to referenced Azure DevOps repositories.
steps steps.
A list of steps to run.
templateContext templateContext.
Job related information passed from a pipeline when extending a template. See remarks
for more information. For more information about templateContext , see Extended YAML
Pipelines templates can now be passed context information for stages, jobs, and
deployments and Templates - Use templateContext to pass properties to templates.
Remarks
The default timeoutInMinutes is set to 60 minutes. For more information, see Timeouts.
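As a sketch of these timeout properties, the following job (the job name and script are illustrative) is killed if it runs longer than 10 minutes and gets 2 minutes of cleanup time when canceled:
YAML
jobs:
- job: TimeBoxedBuild
  timeoutInMinutes: 10 # the server kills the job after 10 minutes
  cancelTimeoutInMinutes: 2 # grace period for cleanup when the job is canceled
  steps:
  - script: echo building...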
7 Note
If you have only one stage and one job, you can use single-job syntax as a shorter
way to describe the steps to run.
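For instance, a minimal pipeline using single-job syntax (an illustrative sketch) omits the stages and jobs keywords entirely and lists its steps at the root:
YAML
pool:
  vmImage: ubuntu-latest
steps:
- script: echo runs in the single implicit job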
For more information about templateContext , see Extended YAML Pipelines templates
can now be passed context information for stages, jobs, and deployments and
Templates - Use templateContext to pass properties to templates.
Examples
YAML
jobs:
- job: MyJob
  displayName: My First Job
  continueOnError: true
  workspace:
    clean: outputs
  steps:
  - script: echo My first job
See also
For more information about uses , see Limit job authorization scope to referenced
Azure DevOps repositories.
For more information about workspaces, including clean options, see the
workspace topic in Jobs.
Learn more about variables, steps, pools, and server jobs.
jobs.job.container definition
Article • 07/10/2023
Container jobs allow you to run jobs on a container instead of the agent host.
Implementations
Implementation Description
container: image Specify job container using image tag and options.
container: string
Specify job container by alias.
YAML
container: string # Alias of the container.
Properties
container string.
Remarks
The alias can be the name of an image, or it can be a reference to a container resource.
Examples
The following example fetches the ubuntu image tagged 18.04 from Docker Hub and
then starts the container. When the printenv command runs, it happens inside the
ubuntu:18.04 container.
YAML
pool:
  vmImage: 'ubuntu-18.04'
container: ubuntu:18.04
steps:
- script: printenv
container: image
Specify job container using image tag and options.
YAML
container:
  image: string # Required. Container image tag.
  endpoint: string # ID of the service endpoint connecting to a private container registry.
  env: # Variables to map into the container's environment.
    string: string # Name/value pairs
  mapDockerSocket: boolean # Set this flag to false to force the agent not to setup the /var/run/docker.sock volume on container jobs.
  options: string # Options to pass into container host.
  ports: [ string ] # Ports to expose on the container.
  volumes: [ string ] # Volumes to mount on the container.
  mountReadOnly: # Volumes to mount read-only, the default is all false.
    work: boolean # Mount the work directory as readonly.
    externals: boolean # Mount the externals directory as readonly.
    tools: boolean # Mount the tools directory as readonly.
    tasks: boolean # Mount the tasks directory as readonly.
Properties
image string. Required.
Container image tag.
endpoint string.
ID of the service endpoint connecting to a private container registry.
env string dictionary.
Variables to map into the container's environment.
mapDockerSocket boolean.
Set this flag to false to force the agent not to setup the /var/run/docker.sock volume on
container jobs.
options string.
Options to pass into container host.
ports string list.
Ports to expose on the container.
volumes string list.
Volumes to mount on the container.
mountReadOnly mountReadOnly.
Volumes to mount read-only, the default is all false.
Examples
Use options to configure container startup.
YAML
container:
  image: ubuntu:18.04
  options: --hostname container-test --ip 192.168.0.1
steps:
- script: echo hello
In the following example, the containers are defined in the resources section. Each
container is then referenced later, by referring to its assigned alias.
YAML
resources:
  containers:
  - container: u14
    image: ubuntu:14.04
  - container: u16
    image: ubuntu:16.04
  - container: u18
    image: ubuntu:18.04
jobs:
- job: RunInContainer
  pool:
    vmImage: 'ubuntu-18.04'
  strategy:
    matrix:
      ubuntu14:
        containerResource: u14
      ubuntu16:
        containerResource: u16
      ubuntu18:
        containerResource: u18
  container: $[ variables['containerResource'] ]
  steps:
  - script: printenv
See also
Define container jobs
Define resources
jobs.job.strategy definition
Article • 07/10/2023
Implementations
Implementation Description
strategy: matrix Matrix job strategy.
strategy: parallel Parallel job strategy.
strategy: matrix
The matrix strategy generates copies of a job, each with different input.
YAML
strategy:
  matrix: { string1: { string2: string3 } } # Matrix defining the job strategy; see the following examples.
  maxParallel: string # Maximum number of jobs running in parallel.
Properties
matrix { string1: { string2: string3 } }.
maxParallel string.
Remarks
YAML
strategy:
  matrix: { string1: { string2: string3 } }
  maxParallel: number
For each occurrence of string1 in the matrix, a copy of the job is generated. The name
string1 is the copy's name and is appended to the name of the job. For each occurrence
of string2, a variable called string2 with the value string3 is available to the job.
7 Note
Matrix configuration names must contain only basic Latin alphabet letters (A-Z and
a-z), digits (0-9), and underscores ( _ ). They must start with a letter. Also, their
length must be 100 characters or fewer.
7 Note
The matrix syntax doesn't support automatic job scaling but you can implement
similar functionality using the each keyword. For an example, see expressions.
Examples
YAML
jobs:
- job: Build
  strategy:
    matrix:
      Python35:
        PYTHON_VERSION: '3.5'
      Python36:
        PYTHON_VERSION: '3.6'
      Python37:
        PYTHON_VERSION: '3.7'
    maxParallel: 2
This matrix creates three jobs: "Build Python35," "Build Python36," and "Build Python37."
Within each job, a variable named PYTHON_VERSION is available. In "Build Python35,"
the variable is set to "3.5". It's likewise set to "3.6" in "Build Python36." Only two jobs run
simultaneously.
strategy: parallel
The parallel job strategy specifies how many duplicates of a job should run.
YAML
strategy:
  parallel: string # Run the job this many times.
Properties
parallel string.
Remarks
The parallel job strategy is useful for slicing up a large test matrix. The Visual Studio Test
task understands how to divide the test load across the number of scheduled jobs.
Examples
YAML
jobs:
- job: SliceItFourWays
  strategy:
    parallel: 4
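For scripts that are not slice-aware, the predefined variables System.JobPositionInPhase and System.TotalJobsInPhase can be used to divide the work manually; the echo step below is an illustrative sketch, not part of the official example:
YAML
jobs:
- job: SliceItFourWays
  strategy:
    parallel: 4
  steps:
  - script: echo Processing slice $(System.JobPositionInPhase) of $(System.TotalJobsInPhase)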
jobs.job.uses definition
Article • 07/10/2023
Any resources required by this job that are not already referenced.
YAML
uses:
  repositories: [ string ] # Repository references.
  pools: [ string ] # Pool references.
Properties
repositories string list.
Repository references.
pools string list.
Pool references.
jobs.template definition
Article • 07/10/2023
YAML
jobs:
- template: string # Required as first property. Reference to a template for this deployment.
  parameters: # Parameters used in a deployment template.
Properties
template string. Required as first property.
Reference to a template for this deployment.
parameters template parameters.
Parameters used in a deployment template.
Remarks
You can define a set of jobs in one file and use it multiple times in other files. See
templates for more about working with job templates.
Examples
In the main pipeline:
YAML
- template: string # name of template to include
  parameters: { string: any } # provided parameters
In the included template:
YAML
parameters: { string: any } # expected parameters
jobs: [ job ]
In this example, a single job is repeated on three platforms. The job itself is specified
only once.
YAML
# File: jobs/build.yml
parameters:
  name: ''
  pool: ''
  sign: false
jobs:
- job: ${{ parameters.name }}
  pool: ${{ parameters.pool }}
  steps:
  - script: npm install
  - script: npm test
  - ${{ if eq(parameters.sign, 'true') }}:
    - script: sign
YAML
# File: azure-pipelines.yml
jobs:
- template: jobs/build.yml # Template reference
  parameters:
    name: macOS
    pool:
      vmImage: macOS-latest
See also
parameters.parameter
See templates for more about working with templates.
pipeline.parameters.parameter
definition
Article • 07/10/2023
YAML
parameters:
- name: string # Required as first property.
  displayName: string # Human-readable name for the parameter.
  type: string
  default: string | parameters | [ parameters ]
  values: [ string ]
Properties
name string. Required as first property.
displayName string.
Human-readable name for the parameter.
type string.
default string | parameters | parameters list.
values string list.
Remarks
The type and name fields are required when defining parameters. See all parameter data
types.
YAML
parameters:
- name: string # name of the parameter; required
  type: enum # see the enum data types in the following section
  default: any # default value; if no default, then the parameter MUST be given by the user at runtime
  values: [ string ] # allowed list of values (for some data types)
Types
The type value must be one of the enum members from the following table.
string string
number may be restricted to values:, otherwise any number-like string is accepted
boolean true or false
object any YAML structure
step a single step
stepList sequence of steps
job a single job
jobList sequence of jobs
deployment a single deployment job
deploymentList sequence of deployment jobs
stage a single stage
stageList sequence of stages
The step, stepList, job, jobList, deployment, deploymentList, stage, and stageList data
types all use standard YAML schema format. This example includes string, number,
boolean, object, step, and stepList.
YAML
parameters:
- name: myString
  type: string
  default: a string
- name: myMultiString
  type: string
  default: default
  values:
  - default
  - ubuntu
- name: myNumber
  type: number
  default: 2
  values:
  - 1
  - 2
  - 4
  - 8
  - 16
- name: myBoolean
  type: boolean
  default: true
- name: myObject
  type: object
  default:
    foo: FOO
    bar: BAR
    things:
    - one
    - two
    - three
    nested:
      one: apple
      two: pear
      count: 3
- name: myStep
  type: step
  default:
    script: echo my step
- name: mySteplist
  type: stepList
  default:
  - script: echo step one
  - script: echo step two

trigger: none

jobs:
- job: stepList
  steps: ${{ parameters.mySteplist }}
- job: myStep
  steps:
  - ${{ parameters.myStep }}
Examples
YAML
# File: azure-pipelines.yml
parameters:
- name: image
  displayName: Pool Image
  type: string
  default: ubuntu-latest
  values:
  - windows-latest
  - ubuntu-latest
  - macOS-latest

trigger: none

jobs:
- job: build
  displayName: build
  pool:
    vmImage: ${{ parameters.image }}
  steps:
  - script: echo The image parameter is ${{ parameters.image }}
You can use parameters to extend a template. In this example, the pipeline using the
template supplies the values to fill into the template.
YAML
# File: simple-param.yml
parameters:
- name: yesNo # name of the parameter; required
  type: boolean # data type of the parameter; required
  default: false

steps:
- script: echo ${{ parameters.yesNo }}
YAML
# File: azure-pipelines.yml
trigger:
- main

extends:
  template: simple-param.yml
  parameters:
    yesNo: false # set to a non-boolean value to have the build fail
See also
See templates for more about working with templates.
pool definition
Article • 07/28/2023
The pool keyword specifies which pool to use for a job of the pipeline. A pool
specification also holds information about the job's strategy for running.
Implementations
Implementation Description
pool: string Specify a private pool by name.
pool: name, demands, vmImage Full syntax for using demands and Microsoft-hosted pools.
Remarks
You can specify a pool at the pipeline, stage, or job level.
The pool specified at the lowest level of the hierarchy is used to run the job.
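For example, a pipeline-level pool can be overridden by a job-level pool for a single job (the pool names here are illustrative):
YAML
pool: MyOrgPool # default pool for every job in the pipeline
jobs:
- job: BuildOnDefaultPool
  steps:
  - script: echo runs on MyOrgPool
- job: BuildOnHostedPool
  pool:
    vmImage: ubuntu-latest # the job-level pool wins for this job
  steps:
  - script: echo runs on a Microsoft-hosted agent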
pool: string
Specify a private pool by name to use for a job of the pipeline.
YAML
pool: string # Specify a private pool by name.
pool string.
Remarks
Use this syntax to specify a private pool by name.
7 Note
If your pool name has a space in it, enclose the pool name in single quotes, like
pool: 'My pool' .
Examples
To use a private pool with no demands:
YAML
pool: MyPool
YAML
pool:
  name: string # Name of a pool.
  demands: string | [ string ] # Demands (for a private pool).
  vmImage: string # Name of the VM image you want to use; valid only in the Microsoft-hosted pool.
Properties
name string.
Name of a pool.
demands pool.demands.
Demands (for a private pool).
vmImage string.
Name of the VM image you want to use; valid only in the Microsoft-hosted pool.
Remarks
Specify a Microsoft-hosted pool using the vmImage property.
If your self-hosted agent pool name has a space in it, enclose the pool name in single
quotes, like name: 'My pool' .
Examples
To use a Microsoft-hosted pool, omit the name and specify one of the available hosted
images:
YAML
pool:
  vmImage: ubuntu-latest
You can specify demands for a private pool using the full syntax.
To add a single demand to your YAML build pipeline, add the demands: line to the pool
section.
YAML
pool:
  name: Default
  demands: SpecialSoftware # exists check for SpecialSoftware
YAML
pool:
  name: MyPool
  demands:
  - myCustomCapability # exists check for myCustomCapability
  - Agent.Version -equals 2.144.0 # equals check for Agent.Version 2.144.0
Checking for the existence of a capability (exists) and checking for a specific string in a
capability (equals) are the only two supported operations for demands.
Exists operation
The exists operation checks for the presence of a capability with the specific name. The
comparison is not case sensitive.
YAML
pool:
  name: MyPool
  demands: myCustomCapability # exists check for myCustomCapability
Equals operation
The equals operation checks for the existence of a capability, and if present, checks its
value with the specified value. If the capability is not present or the values don't match,
the operation evaluates to false. The comparisons are not case sensitive.
YAML
pool:
  name: MyPool
  demands: Agent.Version -equals 2.144.0 # equals check for Agent.Version 2.144.0
Every agent automatically advertises the following system capabilities, which can be referenced in demands:
Agent.Name
Agent.Version
Agent.ComputerName
Agent.HomeDirectory
Agent.OS
Agent.OSArchitecture
Agent.OSVersion (Windows agents only)
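For example, an automatic capability such as Agent.OS can be combined with the equals operation to target only Linux agents in a pool (the pool name is illustrative):
YAML
pool:
  name: MyPool
  demands: Agent.OS -equals Linux # equals check for the automatic Agent.OS capability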
See also
Specify demands
Learn more about conditions and timeouts.
pool.demands definition
Article • 07/28/2023
Implementations
Implementation Description
demands: string Specify a single demand for a private pool.
demands: string list Specify a list of demands for a private pool.
Remarks
Checking for the existence of a capability (exists) and checking for a specific string in a
capability (equals) are the only two supported operations for demands.
Exists operation
The exists operation checks for the presence of a capability with the specific name. The
comparison is not case sensitive.
YAML
pool:
  name: MyPool
  demands: myCustomCapability # exists check for myCustomCapability
Equals operation
The equals operation checks for the existence of a capability, and if present, checks its
value with the specified value. If the capability is not present or the values don't match,
the operation evaluates to false. The comparisons are not case sensitive.
YAML
pool:
  name: MyPool
  demands: Agent.Version -equals 2.144.0 # equals check for Agent.Version 2.144.0
Every agent automatically advertises the following system capabilities, which can be referenced in demands:
Agent.Name
Agent.Version
Agent.ComputerName
Agent.HomeDirectory
Agent.OS
Agent.OSArchitecture
Agent.OSVersion (Windows agents only)
demands: string
Specify a demand for a private pool.
YAML
demands: string # Specify a demand for a private pool.
demands string.
Examples
To add a single demand to your YAML build pipeline, add the demands: line to the pool
section.
YAML
pool:
  name: Default
  demands: SpecialSoftware # exists check for SpecialSoftware
demands: string list
Specify a list of demands for a private pool.
YAML
demands: [ string ]
Examples
To specify multiple demands, add one per line.
YAML
pool:
  name: MyPool
  demands:
  - myCustomCapability # exists check for myCustomCapability
  - Agent.Version -equals 2.144.0 # equals check for Agent.Version 2.144.0
pr definition
Article • 07/10/2023
A pull request trigger specifies which branches cause a pull request build to run.
Implementations
Implementation Description
pr: none Disable pull request triggers.
pr: [ string ] List of branches that trigger a run.
pr: autoCancel, branches, paths, drafts Full syntax for complete control.
Remarks
If you specify no pull request trigger, pull requests to any branch trigger a build.
There are three distinct syntax options for the pr keyword: a list of branches to include,
a way to disable PR triggers, and the full syntax for complete control.
) Important
YAML PR triggers are supported only in GitHub and Bitbucket Cloud. If you use
Azure Repos Git, you can configure a branch policy for build validation to trigger
your build pipeline for validation.
If you specify an exclude clause without an include clause for branches or paths , it is
equivalent to specifying * in the include clause.
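To illustrate that rule, the following two triggers behave identically; each runs for pull requests that target any branch except those matching releases/old/* (the branch filter is illustrative):
YAML
# Exclude without include:
pr:
  branches:
    exclude:
    - releases/old/*
YAML
# Equivalent explicit form:
pr:
  branches:
    include:
    - '*'
    exclude:
    - releases/old/*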
pr: none
Disable pull request triggers.
YAML
pr: none # Disable pull request triggers.
Examples
Disablement syntax:
YAML
pr: none # will disable PR builds (but not CI builds)
pr: [ string ]
Specify a list of branches that trigger a run.
YAML
pr: [ string ] # List of branches.
Remarks
The list syntax specifies a list of branches which trigger a run when a pull request is
raised or a push is made to the source branch of a raised pull request.
Examples
List syntax:
YAML
pr:
- main
- develop
pr: autoCancel, branches, paths, drafts
Use the full syntax when you need full control of the pull request trigger.
YAML
pr:
  autoCancel: boolean # Whether to cancel running PR builds when a new commit lands in the branch. Default: true.
  branches: # Branch names to include or exclude for triggering a run.
    include: [ string ] # List of items to include.
    exclude: [ string ] # List of items to exclude.
  paths: # File paths to include or exclude for triggering a run.
    include: [ string ] # List of items to include.
    exclude: [ string ] # List of items to exclude.
  drafts: boolean # Whether to start a run when a draft PR is created. Default: true.
Properties
autoCancel boolean.
Whether to cancel running PR builds when a new commit lands in the branch. Default:
true.
branches includeExcludeFilters.
Branch names to include or exclude for triggering a run.
paths includeExcludeFilters.
File paths to include or exclude for triggering a run.
drafts boolean.
Whether to start a run when a draft PR is created. Default: true.
Examples
Full syntax:
YAML
pr:
  branches:
    include:
    - features/*
    exclude:
    - features/experimental/*
  paths:
    exclude:
    - README.md
See also
Learn more about pull request triggers and how to specify them.
resources definition
Article • 07/10/2023
Resources specifies builds, repositories, pipelines, and other resources used by the
pipeline.
YAML
resources:
  builds: [ build ] # List of build resources referenced by the pipeline.
  containers: [ container ] # List of container images.
  pipelines: [ pipeline ] # List of pipeline resources.
  repositories: [ repository ] # List of repository resources.
  webhooks: [ webhook ] # List of webhooks.
  packages: [ package ] # List of package resources.
Properties
builds resources.builds.
List of build resources referenced by the pipeline.
containers resources.containers.
List of container images.
pipelines resources.pipelines.
List of pipeline resources.
repositories resources.repositories.
List of repository resources.
webhooks resources.webhooks.
List of webhooks.
packages resources.packages.
List of package resources.
See also
Add resources to a pipeline
resources.builds definition
Article • 07/10/2023
YAML
builds: [ build ] # List of build resources referenced by the pipeline.
List types
Type Description
resources.builds.build A build resource used by the pipeline.
See also
Add resources to a pipeline
resources.builds.build definition
Article • 07/10/2023
YAML
builds:
- build: string # Required as first property. Alias or name of build artifact.
  type: string # Required. Name of the artifact type.
  connection: string # Required. Name of the connection. This connection will be used for all the communication related to this artifact.
  source: string # Required. Name of the source definition/build/job.
  version: string
  branch: string
  trigger: none | true # When the artifact mentioned in this build resource completes a build, it is allowed to trigger this pipeline.
Properties
build string. Required as first property.
Alias or name of the build artifact.
type string. Required.
Name of the artifact type.
connection string. Required.
Name of the connection. This connection will be used for all the communication related
to this artifact.
source string. Required.
Name of the source definition/build/job.
version string.
branch string.
trigger string.
When the artifact mentioned in this build resource completes a build, it is allowed to
trigger this pipeline. none | true.
Remarks
If you have an external CI build system that produces artifacts, you can consume
artifacts with a build resource. A build resource can be any external CI systems like
Jenkins, TeamCity, CircleCI, and so on.
) Important
Triggers are only supported for hosted Jenkins where Azure DevOps has line of
sight with Jenkins server.
Examples
YAML
resources:
  builds:
  - build: Spaceworkz
    type: Jenkins
    connection: MyJenkinsServer
    source: SpaceworkzProj # name of the jenkins source project
    trigger: true
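To consume an artifact produced by this build resource in a job, steps can use the downloadBuild keyword with the resource alias; the artifact name and path below are assumptions for illustration:
YAML
steps:
- downloadBuild: Spaceworkz # alias of the build resource defined above
  artifact: drop # name of the artifact to download (illustrative)
  path: $(Build.SourcesDirectory)/spaceworkz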
See also
Define resources in YAML
resources.containers definition
Article • 07/10/2023
YAML
containers: [ container ] # List of container images.
List types
Type Description
resources.containers.container A container resource used by the pipeline.
See also
Add resources to a pipeline
resources.containers.container
definition
Article • 07/10/2023
YAML
containers:
- container: string # Required as first property. Alias of the container.
  image: string # Required. Container image tag.
  type: string # Type of the registry like ACR or GCR.
  trigger: trigger | none | true # Specify none to disable, true to trigger on all image tags, or use the full syntax as described in the following examples.
  endpoint: string # ID of the service endpoint connecting to a private container registry.
  env: # Variables to map into the container's environment.
    string: string # Name/value pairs
  mapDockerSocket: boolean # Set this flag to false to force the agent not to setup the /var/run/docker.sock volume on container jobs.
  options: string # Options to pass into container host.
  ports: [ string ] # Ports to expose on the container.
  volumes: [ string ] # Volumes to mount on the container.
  mountReadOnly: # Volumes to mount read-only, the default is all false.
    work: boolean # Mount the work directory as readonly.
    externals: boolean # Mount the externals directory as readonly.
    tools: boolean # Mount the tools directory as readonly.
    tasks: boolean # Mount the tasks directory as readonly.
  azureSubscription: string # Azure subscription (ARM service connection) for container registry.
  resourceGroup: string # Resource group for your ACR.
  registry: string # Registry for container images.
  repository: string # Name of the container image repository in ACR.
  localImage: boolean # When true, uses a locally tagged image instead of using docker pull to get the image. The default is false.
Properties
container string. Required as first property.
Alias of the container.
image string. Required.
Container image tag.
type string.
Type of the registry like ACR or GCR.
trigger resources.containers.container.trigger.
Specify none to disable, true to trigger on all image tags, or use the full syntax as described in the following examples.
endpoint string.
ID of the service endpoint connecting to a private container registry.
env string dictionary.
Variables to map into the container's environment.
mapDockerSocket boolean.
Set this flag to false to force the agent not to setup the /var/run/docker.sock volume on container jobs.
options string.
Options to pass into container host.
ports string list.
Ports to expose on the container.
volumes string list.
Volumes to mount on the container.
mountReadOnly mountReadOnly.
Volumes to mount read-only, the default is all false.
azureSubscription string.
Azure subscription (ARM service connection) for container registry.
resourceGroup string.
Resource group for your ACR.
registry string.
Registry for container images.
repository string.
Name of the container image repository in ACR.
localImage boolean.
When true, uses a locally tagged image instead of using docker pull to get the image. The default is false.
This property is useful only for self-hosted agents where the image is already present on the agent machine.
Remarks
Container jobs let you isolate your tools and dependencies inside a container.
The agent launches an instance of your specified container then runs steps inside it. The
container keyword lets you specify your container images.
Service containers run alongside a job to provide various dependencies like databases.
Template expressions are supported for endpoint , volumes , ports , and options
properties of a container resource in a YAML pipeline.
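For instance, a hedged sketch of a template expression supplying the endpoint property (the parameter and service connection names are hypothetical):

```yaml
parameters:
- name: registryConnection
  type: string
  default: my_acr_connection   # hypothetical service connection name

resources:
  containers:
  - container: builder
    image: myprivate.azurecr.io/build:latest
    endpoint: ${{ parameters.registryConnection }}   # template expression resolved at compile time
```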
Examples
YAML
resources:
  containers:
  - container: linux
    image: ubuntu:16.04
  - container: windows
    image: myprivate.azurecr.io/windowsservercore:1803
    endpoint: my_acr_connection
  - container: my_service
    image: my_service:tag
    ports:
    - 8080:80 # bind container port 80 to 8080 on the host machine
    - 6379 # bind container port 6379 to a random available port on the host machine
    volumes:
    - /src/dir:/dst/dir # mount /src/dir on the host into /dst/dir in the container
See also
Define resources in YAML
resources.containers.container.trigger
definition
Article • 07/10/2023
Implementations
Implementation Description
trigger: none | true Specify none to disable or true to trigger on all image tags.
Remarks
Specify none to disable, true to trigger on all image tags, or use the full syntax as
described in the following examples.
YAML
trigger:
  enabled: boolean # Whether the trigger is enabled; defaults to true.
  tags: includeExcludeStringFilters | [ string ] # Tag names to include or exclude for triggering a run.
Properties
enabled boolean.
Whether the trigger is enabled; defaults to true.
tags includeExcludeStringFilters.
Tag names to include or exclude for triggering a run.
YAML
resources:
  containers:
  - container: petStore
    type: ACR
    azureSubscription: ContosoARMConnection
    resourceGroup: ContosoGroup
    registry: petStoreRegistry
    repository: myPets
    trigger:
      tags:
        include:
        - production*
Remarks
Specify none to disable the trigger, or true to enable. If not specified, the default is
none . To configure triggers based on specific tags, see the following section.
Examples
YAML
resources:
  containers:
  - container: petStore
    type: ACR
    azureSubscription: ContosoARMConnection
    resourceGroup: ContosoGroup
    registry: petStoreRegistry
    repository: myPets
    trigger:
      tags: none # Triggers disabled
YAML
resources:
  containers:
  - container: petStore
    type: ACR
    azureSubscription: ContosoARMConnection
    resourceGroup: ContosoGroup
    registry: petStoreRegistry
    repository: myPets
    trigger:
      tags: true # Triggers enabled for all tags
See also
Define a container resource
resources.packages definition
Article • 07/10/2023
See also
Add resources to a pipeline
resources.packages.package definition
Article • 07/10/2023
You can consume NuGet and npm GitHub packages as a resource in YAML pipelines.
When specifying package resources, set the package type as NuGet or npm .
YAML
packages:
- package: string # Required as first property. Alias of package artifact.
  type: string # Required. Type of the package. Ex - NuGet, NPM etc.
  connection: string # Required. Name of the connection. This connection will be used for all the communication related to this artifact.
  name: string # Required. Name of the package.
  version: string
  tag: string
  trigger: none | true # Trigger a new pipeline run when a new version of this package is available.
Properties
package string. Required as first property.
Alias of package artifact. Acceptable values: [-_A-Za-z0-9]*.
type string. Required.
Type of the package. Ex - NuGet, NPM etc.
connection string. Required.
Name of the connection. This connection will be used for all the communication related to this artifact.
name string. Required.
Name of the package.
version string.
tag string.
trigger string.
Trigger a new pipeline run when a new version of this package is available. none | true.
Examples
In this example, there is a GitHub service connection named pat-contoso that connects to a GitHub npm package named contoso . Learn more about GitHub packages.
YAML
resources:
  packages:
  - package: contoso
    type: npm
    connection: pat-contoso
    name: yourname/contoso
    version: 7.130.88
    trigger: true

pool:
  vmImage: ubuntu-latest

steps:
- getPackage: contoso
See also
Add resources to a pipeline
resources.pipelines definition
Article • 07/10/2023
See also
Add resources to a pipeline
resources.pipelines.pipeline definition
Article • 07/10/2023
If you have an Azure Pipeline that produces artifacts, your pipeline can consume the
artifacts by defining a pipeline resource. In Azure DevOps Server 2020 and higher, you
can also enable pipeline completion triggers using a pipeline resource.
YAML
pipelines:
- pipeline: string # Required as first property. ID of the pipeline resource.
  project: string # Project for the source; defaults to current project.
  source: string # Name of the pipeline that produces the artifact.
  version: string # The pipeline run number to pick the artifact, defaults to latest pipeline successful across all stages; used only for manual or scheduled triggers.
  branch: string # Branch to pick the artifact. Optional; defaults to all branches, used only for manual or scheduled triggers.
  tags: [ string ] # List of tags required on the pipeline to pickup default artifacts. Optional; used only for manual or scheduled triggers.
  trigger: # Specify none to disable, true to include all branches, or use the full syntax as described in the following examples.
    enabled: boolean # Whether the trigger is enabled; defaults to true.
    branches: branches # Branches to include or exclude for triggering a run.
    stages: [ string ] # List of stages that when matched will trigger the pipeline.
    tags: [ string ] # List of tags that when matched will trigger the pipeline.
Properties
pipeline string. Required as first property.
ID of the pipeline resource.
project string.
Project for the source; defaults to current project.
source string.
Name of the pipeline that produces the artifact.
version string.
The pipeline run number to pick the artifact, defaults to latest pipeline successful across all stages; used only for manual or scheduled triggers.
branch string.
Branch to pick the artifact. Optional; defaults to all branches, used only for manual or scheduled triggers.
tags string list.
List of tags required on the pipeline to pickup default artifacts. Optional; used only for manual or scheduled triggers.
trigger resources.pipelines.pipeline.trigger.
Specify none to disable, true to include all branches, or use the full syntax as described in the following examples.
Remarks
Note
pipeline: specifies the name of the pipeline resource. Use the label defined here
when referring to the pipeline resource from other parts of the pipeline, such as
when using pipeline resource variables or downloading artifacts.
For more information about stages and tags in the pipeline resource trigger, see
pipeline-completion triggers.
Note
Pipeline completion triggers use the Default branch for manual and scheduled
builds setting to determine which branch's version of a YAML pipeline's branch
filters to evaluate when determining whether to run a pipeline as the result of
another pipeline completing. By default this setting points to the default branch of
the repository. For more information, see Pipeline completion triggers - branch
considerations.
There are several ways to define triggers in a pipeline resource. To trigger a run when
any run of the referenced pipeline completes, use trigger: true .
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
    trigger: true
To disable the pipeline resource trigger, specify a value of none .
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
    trigger: none
To configure branch filters, use the full syntax. Branch filters can be specified as a list of
branches to include, or as a list of branches to include combined with a list of branches
to exclude.
To specify a list of branches to include and exclude, use the following trigger syntax.
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
    trigger:
      branches:
        include:
        - main
        - develop
        - features/*
        exclude:
        - features/experimental/*
To specify a list of branches to include, with no excludes, omit the exclude value, or use
the following syntax to specify the list of branches to include directly following
branches .
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
    trigger:
      branches:
      - main
      - develop
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
    trigger:
      branches: # Branches to include
      tags: # List of tags that when matched will trigger the pipeline.
      - release25
      stages: # List of stages that when complete will trigger the pipeline.
      - build
Important
When you define a resource trigger, if its pipeline resource is from the same repo
as the current pipeline, triggering follows the same branch and commit on which
the event is raised. But if the pipeline resource is from a different repo, the current
pipeline is triggered on the branch specified by the Default branch for manual and
scheduled builds setting. For more information, see Branch considerations for
pipeline completion triggers.
In each run, the metadata for a pipeline resource is available to all jobs in the form of the following predefined variables.
YAML
resources.pipeline.<Alias>.projectName
resources.pipeline.<Alias>.projectID
resources.pipeline.<Alias>.pipelineName
resources.pipeline.<Alias>.pipelineID
resources.pipeline.<Alias>.runName
resources.pipeline.<Alias>.runID
resources.pipeline.<Alias>.runURI
resources.pipeline.<Alias>.sourceBranch
resources.pipeline.<Alias>.sourceCommit
resources.pipeline.<Alias>.sourceProvider
resources.pipeline.<Alias>.requestedFor
resources.pipeline.<Alias>.requestedForID
Important
projectName is not present in the variables if the pipeline resource does not have a
project value specified. The project property is optional for pipeline resources
that reference a pipeline in the same project, but may be specified if desired.
Replace <Alias> with the ID of the pipeline resource. For the following pipeline
resource, the variable to access runID is resources.pipeline.source-pipeline.runID .
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
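For example, a step could read that variable at runtime with macro syntax:

```yaml
steps:
- script: echo $(resources.pipeline.source-pipeline.runID)   # prints the triggering run's ID
```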
When a pipeline is triggered by one of its pipeline resources, the following variables are
set in addition to the variables in the previous list.
Variable Value
Build.Reason ResourceTrigger
Resource.TriggeringCategory pipeline
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: PipelineTriggerSource
    project: FabrikamFiber
    trigger: true
  - pipeline: other-project-pipeline
    source: PipelineTriggerFromOtherProject
    project: FabrikamRepo
    trigger: true

pool:
  vmImage: ubuntu-latest

steps:
- bash: echo $(resources.pipeline.source-pipeline.projectName) # outputs the project name of the pipeline resource
- bash: env # list all environment variables available to the task
When this pipeline is run, the first bash task outputs the projectName of the pipeline
resource named source-pipeline , which is FabrikamFiber .
The second bash task outputs all of the environment variables available to the task,
including the pipeline resource variables described in this section. Listing environment
variables isn't typically done in a production pipeline, but it can be useful for
troubleshooting. In this example there are two pipeline resources, and the output
contains the following two lines.
RESOURCES_PIPELINE_OTHER-PROJECT-PIPELINE_PROJECTNAME=FabrikamRepo
RESOURCES_PIPELINE_SOURCE-PIPELINE_PROJECTNAME=FabrikamFiber
Note
System and user-defined variables get injected as environment variables for your
platform. When variables convert into environment variables, variable names
become uppercase, and periods turn into underscores. For example, the variable
name any.variable becomes ANY_VARIABLE .
For more information about using variables and variable syntax, see Understand
variable syntax, Specify conditions, and Expressions.
You can consume artifacts from a pipeline resource by using a download task. See the
steps.download keyword.
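As a hedged sketch, a download step referencing a pipeline resource alias might look like the following (the artifact name drop is hypothetical):

```yaml
steps:
- download: source-pipeline   # alias of the declared pipeline resource
  artifact: drop              # hypothetical artifact name published by that pipeline
```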
Examples
YAML
resources:
  pipelines:
  - pipeline: MyAppA
    source: MyCIPipelineA
  - pipeline: MyAppB
    source: MyCIPipelineB
    trigger: true
  - pipeline: MyAppC
    project: DevOpsProject
    source: MyCIPipelineC
    branch: releases/M159
    version: 20190718.2
    trigger:
      branches:
        include:
        - main
        - releases/*
        exclude:
        - users/*
See also
Add resources to a pipeline
resources.pipelines.pipeline.trigger
definition
Article • 07/10/2023
Specify none to disable, true to include all branches, or use the full syntax as described
in the following examples.
Implementations
Implementation Description
trigger: enabled, branches, stages, tags Configure pipeline resource triggers using the full syntax.
trigger: none | true Specify none to disable or true to include all branches.
Remarks
There are several ways to define triggers in a pipeline resource. To trigger a run when
any run of the referenced pipeline completes, use trigger: true .
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
    trigger: true
To disable the pipeline resource trigger, specify a value of none .
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
    trigger: none
To configure branch filters, use the full syntax. Branch filters can be specified as a list of
branches to include, or as a list of branches to include combined with a list of branches
to exclude.
To specify a list of branches to include and exclude, use the following trigger syntax.
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
    trigger:
      branches:
        include:
        - main
        - develop
        - features/*
        exclude:
        - features/experimental/*
To specify a list of branches to include, with no excludes, omit the exclude value, or use
the following syntax to specify the list of branches to include directly following
branches .
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
    trigger:
      branches:
      - main
      - develop
YAML
resources:
  pipelines:
  - pipeline: source-pipeline
    source: TriggeringPipeline
    trigger:
      branches: # Branches to include
      tags: # List of tags that when matched will trigger the pipeline.
      - release25
      stages: # List of stages that when complete will trigger the pipeline.
      - build
Important
When you define a resource trigger, if its pipeline resource is from the same repo
as the current pipeline, triggering follows the same branch and commit on which
the event is raised. But if the pipeline resource is from a different repo, the current
pipeline is triggered on the branch specified by the Default branch for manual and
scheduled builds setting. For more information, see Branch considerations for
pipeline completion triggers.
YAML
trigger:
  enabled: boolean # Whether the trigger is enabled; defaults to true.
  branches: branches # Branches to include or exclude for triggering a run.
  stages: [ string ] # List of stages that when matched will trigger the pipeline.
  tags: [ string ] # List of tags that when matched will trigger the pipeline.
Properties
enabled boolean.
Whether the trigger is enabled; defaults to true.
branches resources.pipelines.pipeline.trigger.branches.
Branches to include or exclude for triggering a run.
stages string list.
List of stages that when matched will trigger the pipeline.
tags string list.
List of tags that when matched will trigger the pipeline.
YAML
branches:
  include: [ string ]
  exclude: [ string ]
Properties
include string list.
exclude string list.
See also
Add resources to a pipeline
resources.repositories.repository
definition
Article • 07/28/2023
The repository keyword lets you specify an external repository. Use a repository
resource to reference an additional repository in your pipeline.
YAML
repositories:
- repository: string # Required as first property. Alias for the repository.
  endpoint: string # ID of the service endpoint connecting to this repository.
  trigger: none | trigger | [ string ] # CI trigger for this repository, no CI trigger if skipped (only works for Azure Repos).
  name: string # repository name (format depends on 'type'; does not accept variables).
  ref: string # ref name to checkout; defaults to 'refs/heads/main'. The branch checked out by default whenever the resource trigger fires.
  type: string # Type of repository: git, github, githubenterprise, and bitbucket.
Properties
repository string. Required as first property.
Alias for the repository.
endpoint string.
ID of the service endpoint connecting to this repository.
trigger trigger.
CI trigger for this repository, no CI trigger if skipped (only works for Azure Repos).
name string.
repository name (format depends on 'type'; does not accept variables).
ref string.
ref name to checkout; defaults to 'refs/heads/main'. The branch checked out by default
whenever the resource trigger fires. Template expressions are supported.
type string.
Type of repository: git, github, githubenterprise, and bitbucket.
Remarks
Template expressions are supported for the ref property (but not the name property).
Wildcards are supported in triggers.
Important
Repository resource triggers are supported for Azure Repos Git repositories only.
For more information on trigger syntax, including wildcard support for branches
and tags, see trigger definition and Build Azure Repos Git or TFS Git repositories.
Important
If your pipeline has templates in another repository, or if you want to use multi-repo
checkout with a repository that requires a service connection, you must let the system
know about that repository.
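For example, a hedged sketch of declaring a second repository so the pipeline can reference templates stored in it (the repository, project, and template names are hypothetical):

```yaml
resources:
  repositories:
  - repository: templates                # alias used in the @templates reference below
    type: git
    name: SharedProject/pipeline-templates

steps:
- template: steps/build.yml@templates    # template resolved from the declared repository
```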
Types
Pipelines support the following values for the repository type: git , github ,
githubenterprise , and bitbucket . The git type refers to Azure Repos Git repos.
If you specify type: git , the name value refers to the name of an Azure Repos Git
repository.
If your pipeline is in the same Azure DevOps project as the repository, for
example a repository named tools , you reference it using name: tools .
If your pipeline is in the same Azure DevOps organization as the repository, but
in a different Azure DevOps project, for example a project named ToolsProject ,
you must qualify the repository name with the project name: name:
ToolsProject/tools .
If you specify type: github , the name value is the full name of the GitHub repo and
includes the user or organization. An example is name: Microsoft/vscode . GitHub
repos require a GitHub service connection for authorization.
If you specify type: bitbucket , the name value is the full name of the Bitbucket
Cloud repo and includes the user or organization. An example is name:
MyBitbucket/vscode . Bitbucket Cloud repos require a Bitbucket Cloud service
connection for authorization.
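The naming rules above can be sketched in a single resources block (all repository, project, and service connection names are hypothetical):

```yaml
resources:
  repositories:
  - repository: toolsAlias
    type: git                    # Azure Repos Git repo in the same project
    name: tools
  - repository: otherProjectAlias
    type: git                    # Azure Repos Git repo in another project
    name: ToolsProject/tools
  - repository: githubAlias
    type: github                 # GitHub repo; requires a GitHub service connection
    name: Microsoft/vscode
    endpoint: MyGitHubConnection
```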
For more information about these types, see Check out multiple repositories in your
pipeline - Repository resource definition.
Variables
In each run, the metadata for a repository resource is available to all jobs in the form of
runtime variables. The <Alias> is the identifier that you gave for your repository
resource.
YAML
resources.repositories.<Alias>.name
resources.repositories.<Alias>.ref
resources.repositories.<Alias>.type
resources.repositories.<Alias>.id
resources.repositories.<Alias>.url
The following example has a repository resource with an alias of common , and the
repository resource variables are accessed using resources.repositories.common.* .
YAML
resources:
  repositories:
  - repository: common
    type: git
    ref: main
    name: Repo

variables:
  ref: $[ resources.repositories.common.ref ]
  name: $[ resources.repositories.common.name ]
  id: $[ resources.repositories.common.id ]
  type: $[ resources.repositories.common.type ]
  url: $[ resources.repositories.common.url ]

steps:
- bash: |
    echo "name = $(name)"
    echo "ref = $(ref)"
    echo "id = $(id)"
    echo "type = $(type)"
    echo "url = $(url)"
Examples
YAML
resources:
  repositories:
  - repository: common
    type: github
    name: Contoso/CommonTools
    endpoint: MyContosoServiceConnection
See also
Add resources to a pipeline
resources.webhooks definition
Article • 07/10/2023
See also
Add resources to a pipeline
resources.webhooks.webhook definition
Article • 07/28/2023
A webhook resource enables you to integrate your pipeline with an external service to
automate the workflow.
YAML
webhooks:
- webhook: string # Required as first property. Name of the webhook.
  connection: string # Required. Name of the connection. In case of offline webhook this will be the type of Incoming Webhook otherwise it will be the type of the webhook extension.
  type: string # Name of the webhook extension. Leave this empty if it is an offline webhook.
  filters: [ filter ] # List of trigger filters.
Properties
webhook string. Required as first property.
Name of the webhook. Acceptable values: [-_A-Za-z0-9]*.
connection string. Required.
Name of the connection. In case of offline webhook this will be the type of Incoming
Webhook otherwise it will be the type of the webhook extension.
type string.
Name of the webhook extension. Leave this empty if it is an offline webhook.
filters resources.webhooks.webhook.filters.
List of trigger filters.
Examples
You can define your pipeline as follows.
YAML
resources:
  webhooks:
  - webhook: WebHook
    connection: IncomingWH

steps:
- script: echo ${{ parameters.WebHook.resource.message.title }}
To trigger your pipeline using the webhook, you need to make a POST request to
https://dev.azure.com/<org_name>/_apis/public/distributedtask/webhooks/<webhook_con
JSON
{
  "resource": {
    "message": {
      "title": "Hello, world!",
      "subtitle": "I'm using WebHooks!"
    }
  }
}
When you access data from the webhook's request body, be mindful that it may lead to
incorrect YAML. For example, if in the previous pipeline, your step reads - script: echo
${{ parameters.WebHook.resource.message }} , and you trigger the pipeline via a
webhook, the pipeline doesn't run. This is because replacing ${{
parameters.WebHook.resource.message }} expands it to the contents of message , which is the following
JSON
{
  "title": "Hello, world!",
  "subtitle": "I'm using WebHooks!"
}
Because the generated YAML becomes invalid, no pipeline run is queued in response.
See also
Add resources to a pipeline
resources.webhooks.webhook.filters
definition
Article • 07/10/2023
YAML
filters:
- path: string # Required as first property. json path to select data from event payload.
  value: string # Required. Expected value for the filter to match.
Properties
path string. Required as first property.
json path to select data from event payload.
value string. Required.
Expected value for the filter to match.
Examples
For subscribing to a webhook event, you need to define a webhook resource in your
pipeline and point it to the Incoming webhook service connection. You can also define
additional filters on the webhook resource based on the JSON payload data to further
customize the triggers for each pipeline, and you can consume the payload data in the
form of variables in your jobs.
YAML
resources:
  webhooks:
  - webhook: MyWebhookTrigger          ### Webhook alias
    connection: MyWebhookConnection    ### Incoming webhook service connection
    filters:
    - path: repositoryName             ### JSON path in the payload
      value: maven-releases            ### Expected value in the path provided
    - path: action
      value: CREATED

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    ### JSON payload data is available in the form of ${{ parameters.<WebhookAlias>.<JSONPath>}}
    script: |
      Write-Host ${{ parameters.MyWebhookTrigger.repositoryName}}
      Write-Host ${{ parameters.MyWebhookTrigger.component.group}}
See also
Add resources to a pipeline
schedules definition
Article • 07/10/2023
The schedules list specifies the scheduled triggers for the pipeline.
Remarks
If you specify no scheduled trigger, no scheduled builds occur.
Important
Scheduled triggers defined using the pipeline settings UI take precedence over
YAML scheduled triggers.
If your YAML pipeline has both YAML scheduled triggers and UI defined scheduled
triggers, only the UI defined scheduled triggers are run. To run the YAML defined
scheduled triggers in your YAML pipeline, you must remove the scheduled triggers
defined in the pipeline settings UI. Once all UI scheduled triggers are removed, a
push must be made in order for the YAML scheduled triggers to start being
evaluated.
See also
schedules.cron
Learn more about scheduled triggers.
Learn more about triggers in general and how to specify them.
schedules.cron definition
Article • 07/24/2023
YAML
schedules:
- cron: string # Required as first property. Cron syntax defining a schedule in UTC time.
  displayName: string # Optional friendly name given to a specific schedule.
  branches: # Branch names to include or exclude for triggering a run.
    include: [ string ] # List of items to include.
    exclude: [ string ] # List of items to exclude.
  batch: boolean # Whether to run the pipeline if the previously scheduled run is in-progress; the default is false.
  always: boolean # Whether to always run the pipeline or only if there have been source code changes since the last successful scheduled run; the default is false.
Properties
cron string. Required as first property.
Cron syntax defining a schedule in UTC time.
displayName string.
Optional friendly name given to a specific schedule.
branches includeExcludeFilters.
Branch names to include or exclude for triggering a run.
batch boolean.
Whether to run the pipeline if the previously scheduled run is in-progress; the default is
false . This is regardless of the version of the pipeline repository.
Always Batch Behavior
false false Pipeline runs only if there's a change with respect to the last successful scheduled pipeline run.
false true Pipeline runs only if there's a change with respect to the last successful scheduled pipeline run, and there's no in-progress scheduled pipeline run.
Important
When always is true , the pipeline runs according to the cron schedule, even when
batch is true .
always boolean.
Whether to always run the pipeline or only if there have been source code changes
since the last successful scheduled run; the default is false.
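As a sketch, a schedule that avoids overlapping scheduled runs might look like the following (the cron expression, display name, and branch are illustrative):

```yaml
schedules:
- cron: '0 */6 * * *'      # every six hours, UTC
  displayName: Rolling build
  branches:
    include:
    - main
  batch: true              # skip if the previous scheduled run is still in progress
```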
Remarks
If you specify no scheduled trigger, no scheduled builds occur.
Important
Scheduled triggers defined using the pipeline settings UI take precedence over
YAML scheduled triggers.
If your YAML pipeline has both YAML scheduled triggers and UI defined scheduled
triggers, only the UI defined scheduled triggers are run. To run the YAML defined
scheduled triggers in your YAML pipeline, you must remove the scheduled triggers
defined in the pipeline settings UI. Once all UI scheduled triggers are removed, a
push must be made in order for the YAML scheduled triggers to start being
evaluated.
To delete UI scheduled triggers from a YAML pipeline, see UI settings override
YAML scheduled triggers.
Build.CronSchedule.DisplayName variable
When a pipeline is running due to a cron scheduled trigger, the pre-defined
Build.CronSchedule.DisplayName variable contains the displayName of the cron schedule
that triggered the pipeline run.
Your YAML pipeline may contain multiple cron schedules, and you may want your
pipeline to run different stages or jobs based on which cron schedule runs. For example,
you have a nightly build and a weekly build, and you want to run a certain stage only
during the nightly build. You can use the Build.CronSchedule.DisplayName variable in a
job or stage condition to determine whether to run that job or stage.
YAML
- stage: stage1
  # Run this stage only when the pipeline is triggered by the
  # "Daily midnight build" cron schedule
  condition: eq(variables['Build.CronSchedule.DisplayName'], 'Daily midnight build')
Examples
The following example defines two schedules.
The first schedule, Daily midnight build, runs a pipeline at midnight every day only if
the code has changed since the last successful scheduled run. It runs the pipeline for
main and all releases/* branches, except for those branches under releases/ancient/* .
The second schedule, Weekly Sunday build, runs a pipeline at noon on Sundays for all
releases/* branches. It does so regardless of whether the code has changed since the
last run.
YAML
schedules:
- cron: '0 0 * * *'
  displayName: Daily midnight build
  branches:
    include:
    - main
    - releases/*
    exclude:
    - releases/ancient/*
- cron: '0 12 * * 0'
  displayName: Weekly Sunday build
  branches:
    include:
    - releases/*
  always: true
In the following example, stage1 only runs if the pipeline was triggered by the Daily
midnight build schedule, and job3 only runs if the pipeline was triggered by the Weekly Sunday build schedule.
YAML
stages:
- stage: stage1
  # Run this stage only when the pipeline is triggered by the
  # "Daily midnight build" cron schedule
  condition: eq(variables['Build.CronSchedule.DisplayName'], 'Daily midnight build')
  jobs:
  - job: job1
    steps:
    - script: echo Hello from Stage 1 Job 1
- stage: stage2
  dependsOn: [] # Indicate this stage does not depend on the previous stage
  jobs:
  - job: job2
    steps:
    - script: echo Hello from Stage 2 Job 2
  - job: job3
    # Run this job only when the pipeline is triggered by the
    # "Weekly Sunday build" cron schedule
    condition: eq(variables['Build.CronSchedule.DisplayName'], 'Weekly Sunday build')
    steps:
    - script: echo Hello from Stage 2 Job 3
See also
Learn more about scheduled triggers.
Learn more about triggers in general and how to specify them.
stages definition
Article • 07/10/2023
List types
Type Description
stages.template You can define a set of stages in one file and use it multiple times in other files.
Remarks
By default, stages run sequentially. Each stage starts only after the preceding stage is
complete unless otherwise specified via the dependsOn property.
Use approval checks to manually control when a stage should run. These checks are
commonly used to control deployments to production environments.
Checks are a mechanism available to the resource owner. They control when a stage in a
pipeline consumes a resource. As an owner of a resource like an environment, you can
define checks that are required before a stage that consumes the resource can start.
Examples
This example runs three stages, one after another. The middle stage runs two jobs in
parallel.
YAML
stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - script: echo Building!
- stage: Test
  jobs:
  - job: TestOnWindows
    steps:
    - script: echo Testing on Windows!
  - job: TestOnLinux
    steps:
    - script: echo Testing on Linux!
- stage: Deploy
  jobs:
  - job: Deploy
    steps:
    - script: echo Deploying the code!
This example runs two stages in parallel. For brevity, the jobs and steps are omitted.
YAML
stages:
- stage: BuildWin
  displayName: Build for Windows
- stage: BuildMac
  displayName: Build for Mac
  dependsOn: [] # by specifying an empty array, this stage doesn't depend on the stage before it
See also
Learn more about stages, conditions, and variables.
stages.stage definition
Article • 07/10/2023
Stages are a collection of related jobs. By default, stages run sequentially. Each stage
starts only after the preceding stage is complete unless otherwise specified via the
dependsOn property.
YAML
stages:
- stage: string # Required as first property. ID of the stage.
  displayName: string # Human-readable name for the stage.
  pool: string | pool # Pool where jobs in this stage will run unless otherwise specified.
  dependsOn: string | [ string ] # Any stages which must complete before this one.
  condition: string # Evaluate this condition expression to determine whether to run this stage.
  variables: variables | [ variable ] # Stage-specific variables.
  jobs: [ job | deployment | template ] # Jobs which make up the stage.
  lockBehavior: string # Behavior lock requests from this stage should exhibit in relation to other exclusive lock requests.
  templateContext: # Stage related information passed from a pipeline when extending a template.
Properties
stage string. Required as first property.
ID of the stage.
displayName string.
Human-readable name for the stage.
pool pool.
Pool where jobs in this stage will run unless otherwise specified.
dependsOn string | string list.
Any stages which must complete before this one. By default stages are run sequentially
in the order defined in the pipeline. Specify dependsOn: [] for a stage if it shouldn't
depend on the previous stage in the pipeline.
condition string.
Evaluate this condition expression to determine whether to run this stage.
variables variables.
Stage-specific variables.
jobs jobs.
Jobs which make up the stage.
lockBehavior string.
Behavior lock requests from this stage should exhibit in relation to other exclusive lock
requests. sequential | runLatest.
templateContext templateContext.
Stage related information passed from a pipeline when extending a template. For more
information about templateContext , see Extended YAML Pipelines templates can now be
passed context information for stages, jobs, and deployments and Templates - Use
templateContext to pass properties to templates.
Remarks
Use approval checks to manually control when a stage should run. These checks are
commonly used to control deployments to production environments.
Checks are a mechanism available to the resource owner. They control when a stage in a
pipeline consumes a resource. As an owner of a resource like an environment, you can
define checks that are required before a stage that consumes the resource can start.
Exclusive lock
In YAML pipelines, checks are used to control the execution of stages on protected
resources. One of the common checks that you can use is an exclusive lock check. This
check lets only a single run from the pipeline proceed. When multiple runs attempt to
deploy to an environment at the same time, the check cancels all the old runs and
permits the latest run to be deployed.
You can configure the behavior of the exclusive lock check using the lockBehavior
property, which has two values:
runLatest - Only the latest run acquires the lock to the resource. This is the default value if lockBehavior is not specified.
sequential - All runs acquire the lock sequentially to the protected resource.
Canceling old runs is a good approach when your releases are cumulative and contain
all the code changes from previous runs. However, there are some pipelines in which
code changes are not cumulative. By configuring the lockBehavior property, you can
choose to allow all runs to proceed and deploy sequentially to an environment, or
preserve the previous behavior of canceling old runs and allowing just the latest. A value
of sequential implies that all runs acquire the lock sequentially to the protected
resource. A value of runLatest implies that only the latest run acquires the lock to the
resource.
To use exclusive lock check with sequential deployments or runLatest , follow these
steps:
1. Enable the exclusive lock check on the environment (or another protected
resource).
2. In the YAML file for the pipeline, specify a new property called lockBehavior . This
can be specified for the whole pipeline or for a given stage:
Set on a stage:
YAML
stages:
- stage: A
  lockBehavior: sequential
  jobs:
  - job: Job
    steps:
    - script: Hey!
Set on the pipeline:
YAML
lockBehavior: runLatest
stages:
- stage: A
  jobs:
  - job: Job
    steps:
    - script: Hey!
Examples
This example runs three stages, one after another. The middle stage runs two jobs in
parallel.
YAML
stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - script: echo Building!
- stage: Test
  jobs:
  - job: TestOnWindows
    steps:
    - script: echo Testing on Windows!
  - job: TestOnLinux
    steps:
    - script: echo Testing on Linux!
- stage: Deploy
  jobs:
  - job: Deploy
    steps:
    - script: echo Deploying the code!
This example runs two stages in parallel. For brevity, the jobs and steps are omitted.
YAML
stages:
- stage: BuildWin
  displayName: Build for Windows
- stage: BuildMac
  displayName: Build for Mac
  dependsOn: [] # by specifying an empty array, this stage doesn't depend on the stage before it
See also
Learn more about stages, conditions, and variables.
stages.template definition
Article • 07/10/2023
You can define a set of stages in one file and use it multiple times in other files.
YAML
stages:
- template: string # Required as first property. Reference to a template for this stage.
  parameters: # Parameters used in a stage template.
Properties
template string. Required as first property.
Remarks
Reference the stage template in the main pipeline.
YAML
stages:
- template: string # reference to template
  parameters: { string: any } # provided parameters
Examples
In this example, a stage is repeated twice for two different testing regimes. The stage
itself is specified only once.
YAML
# File: stages/test.yml
parameters:
  name: ''
  testFile: ''

stages:
- stage: Test_${{ parameters.name }}
  jobs:
  - job: ${{ parameters.name }}_Windows
    pool:
      vmImage: windows-latest
    steps:
    - script: npm install
    - script: npm test -- --file=${{ parameters.testFile }}
  - job: ${{ parameters.name }}_Mac
    pool:
      vmImage: macos-latest
    steps:
    - script: npm install
    - script: npm test -- --file=${{ parameters.testFile }}
YAML
# File: azure-pipelines.yml
stages:
- template: stages/test.yml # Template reference
  parameters:
    name: Mini
    testFile: tests/miniSuite.js
See also
Template types & usage
Security through templates
steps definition
Article • 07/10/2023
YAML
steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ]
List types
Type Description
steps.script Runs a script using cmd.exe on Windows and Bash on other platforms.
steps.powershell Runs a script using either Windows PowerShell (on Windows) or pwsh (Linux and macOS).
steps.download Downloads artifacts associated with the current run or from another Azure Pipeline that is associated as a pipeline resource.
steps.publish Publishes (uploads) a file or folder as a pipeline artifact that other jobs and pipelines can consume.
steps.template Define a set of steps in one file and use it multiple times in another file.
All tasks and steps support a set of common properties, such as enabled and env, in addition to their task or step specific properties. For more information on configuring these properties, see Task control options and Task environment variables.
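As an illustrative sketch of those common properties on a single step (the script and variable values here are placeholders, not from this reference):

```yaml
steps:
- script: echo Run integration tests   # placeholder script
  displayName: Integration tests
  condition: succeeded()        # common property: run only if previous steps succeeded
  continueOnError: true         # common property: a failure won't fail the job
  enabled: true                 # common property: set false to skip this step
  env:
    TEST_ENV: staging           # common property: mapped into the process environment
```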
Examples
YAML
steps:
- script: echo This runs in the default shell on any machine
- bash: |
    echo This multiline script always runs in Bash.
    echo Even on Windows machines!
- pwsh: |
    Write-Host "This multiline script always runs in PowerShell Core."
    Write-Host "Even on non-Windows machines!"
See also
Specify jobs in your pipeline
Task types and usage
steps.bash definition
Article • 07/10/2023
The bash step runs a script in Bash on Windows, macOS, and Linux.
YAML
steps:
- bash: string # Required as first property. An inline script.
  failOnStderr: string # Fail the task if output is sent to Stderr?
  workingDirectory: string # Start the script with this working directory.
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
bash string. Required as first property.
An inline script.
failOnStderr string.
workingDirectory string.
condition string.
continueOnError boolean.
displayName string.
enabled boolean.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
The bash keyword is a shortcut for the shell script task. The task runs a script in Bash on
Windows, macOS, and Linux.
Examples
YAML
steps:
- bash: |
    which bash
    echo Hello $name
  displayName: Multiline Bash script
  env:
    name: Microsoft
If you don't specify a command mode, you can shorten the target structure to:
YAML
- bash:
  target: string # container name or the word 'host'
See also
shell script task
Learn more about conditions, timeouts, and step targets
steps.checkout definition
Article • 07/10/2023
Use checkout to configure how the pipeline checks out source code.
YAML
steps:
- checkout: string # Required as first property. Configures checkout for the specified repository.
  clean: string # If true, run git clean -ffdx followed by git reset --hard HEAD before fetching.
  fetchDepth: string # Depth of Git graph to fetch.
  fetchTags: string # Set to 'true' to sync tags when fetching the repo, or 'false' to not sync tags. See remarks for the default behavior.
  lfs: string # Set to 'true' to download Git-LFS files. Default is not to download them.
  persistCredentials: string # Set to 'true' to leave the OAuth token in the Git config after the initial fetch. The default is not to leave it.
  submodules: string # Set to 'true' for a single level of submodules or 'recursive' to get submodules of submodules. Default is not to fetch submodules.
  path: string # Where to put the repository. The root directory is $(Pipeline.Workspace).
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
checkout string. Required as first property.
Configures checkout for the specified repository. Specify self , none , repository name,
or repository resource. For more information, see Check out multiple repositories in your
pipeline.
clean string.
If true, run git clean -ffdx followed by git reset --hard HEAD before fetching. true | false.
fetchDepth string.
fetchTags string.
Set to 'true' to sync tags when fetching the repo, or 'false' to not sync tags. See remarks
for the default behavior.
lfs string.
persistCredentials string.
Set to 'true' to leave the OAuth token in the Git config after the initial fetch. The default
is not to leave it.
submodules string.
path string.
condition string.
Evaluate this condition expression to determine whether to run this task.
continueOnError boolean.
displayName string.
target target.
enabled boolean.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
Shallow fetch
Clean property
Sync tags
Shallow fetch
Important
New pipelines created after the September 2022 Azure DevOps sprint 209 update
have Shallow fetch enabled by default and configured with a depth of 1. Previously
the default was not to shallow fetch. To check your pipeline, view the Shallow fetch
setting in the pipeline settings UI.
To disable shallow fetch, you can perform one of the following two options:
Disable the Shallow fetch option in the pipeline settings UI.
Explicitly set fetchDepth: 0 in your checkout step.
Note
If you explicitly set fetchDepth in your checkout step, that setting takes priority
over the setting configured in the pipeline settings UI. Setting fetchDepth: 0
fetches all history and overrides the Shallow fetch setting.
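For example, a minimal sketch of opting out of shallow fetch directly in YAML:

```yaml
steps:
- checkout: self
  fetchDepth: 0 # fetch all history, overriding the Shallow fetch pipeline setting
```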
Clean property
If the clean property is unset, then its default value is configured by the clean setting in
the UI settings for YAML pipelines, which is set to true by default. In addition to the
cleaning option available using checkout , you can also configure cleaning in a
workspace. For more information about workspaces and clean options, see the
workspace topic in Jobs.
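A minimal sketch of setting the clean property explicitly on the checkout step, so the behavior no longer depends on the UI setting:

```yaml
steps:
- checkout: self
  clean: true # run git clean -ffdx followed by git reset --hard HEAD before fetching
```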
Sync tags
The checkout step uses the --tags option when fetching the contents of a Git
repository. This causes the server to fetch all tags as well as all objects that are pointed
to by those tags. This increases the time to run the task in a pipeline, particularly if you
have a large repository with a number of tags. Furthermore, the checkout step syncs
tags even when you enable the shallow fetch option, thereby possibly defeating its
purpose. To reduce the amount of data fetched or pulled from a Git repository,
Microsoft has added a new option to checkout to control the behavior of syncing tags.
This option is available both in classic and YAML pipelines.
Whether to synchronize tags when checking out a repository can be configured in YAML
by setting the fetchTags property, and in the UI by configuring the Sync tags setting.
YAML
steps:
- checkout: self
  fetchTags: true
To configure the setting in the pipeline UI, edit your YAML pipeline, and choose More
actions, Triggers, YAML, Get sources, and check or uncheck the Sync tags checkbox. For
more information, see Sync tags.
Default behavior
For existing pipelines created before the release of Azure DevOps sprint 209, released in September 2022, the default for syncing tags remains the same as the existing behavior before the Sync tags option was added, which is true.
For new pipelines created after Azure DevOps sprint release 209, the default for
syncing tags is false .
Important
A Sync tags setting of true in the UI takes precedence over a fetchTags: false
statement in the YAML. If Sync tags is set to true in the UI, tags are synced even if
fetchTags is set to false in the YAML.
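A sketch of disabling tag syncing in YAML (keeping in mind that a UI Sync tags setting of true still takes precedence):

```yaml
steps:
- checkout: self
  fetchTags: false # skip syncing tags to reduce the data fetched
```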
Examples
To avoid syncing sources at all:
YAML
steps:
- checkout: none
Note
If you're running the agent in the Local Service account and want to modify the
current repository by using git operations or loading git submodules, give the
proper permissions to the Project Collection Build Service Accounts user.
YAML
- checkout: self
  submodules: true
  persistCredentials: true
To check out multiple repositories in your pipeline, use multiple checkout steps:
YAML
- checkout: self
- checkout: git://MyProject/MyRepo
- checkout: MyGitHubRepo # Repo declared in a repository resource
For more information, see Check out multiple repositories in your pipeline.
See also
Supported source repositories
steps.download definition
Article • 07/10/2023
The download step downloads artifacts associated with the current run or from another
Azure Pipeline that is associated as a pipeline resource.
YAML
steps:
- download: string # Required as first property. Specify current, pipeline resource identifier, or none to disable automatic download.
  artifact: string # Artifact name.
  patterns: string # Pattern to download files from artifact.
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
download string. Required as first property.
artifact string.
Artifact name.
patterns string.
condition string.
continueOnError boolean.
target target.
enabled boolean.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
The download keyword is a shortcut for the Download Pipeline Artifacts task.
Examples
YAML
steps:
- download: current # refers to artifacts published by current pipeline
  artifact: WebApp
  patterns: '**/*.js'
  displayName: Download artifact WebApp
- download: MyAppA # downloads artifacts available as part of the pipeline resource specified as MyAppA
See also
Publish and download pipeline Artifacts
Download Pipeline Artifacts task
steps.downloadBuild definition
Article • 07/10/2023
YAML
steps:
- downloadBuild: string # Required as first property. Alias of the build resource.
  artifact: string # Artifact name.
  path: string # Path to download the artifact into.
  patterns: string # Downloads the files which matches the patterns.
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
downloadBuild string. Required as first property.
artifact string.
Artifact name.
path string.
patterns string.
condition string.
Evaluate this condition expression to determine whether to run this task.
continueOnError boolean.
displayName string.
target target.
enabled boolean.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
The downloadBuild keyword is a shortcut for the Download Build Artifacts task.
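This section includes no example, so here is a hedged sketch, assuming a build resource with the alias MyBuildResource is declared under resources.builds (the alias, artifact name, and pattern are hypothetical):

```yaml
steps:
- downloadBuild: MyBuildResource # alias of a build resource (hypothetical name)
  artifact: drop # name of the build artifact to download (hypothetical)
  patterns: '**/*.zip' # only download files matching this pattern
  path: $(Build.ArtifactStagingDirectory) # where to download the artifact
```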
steps.getPackage definition
The getPackage step downloads a package from a package management feed in Azure
Artifacts or Azure DevOps Server.
YAML
steps:
- getPackage: string # Required as first property. Alias of the package resource.
  path: string # Path to download the package into.
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
getPackage string. Required as first property.
path string.
condition string.
continueOnError boolean.
displayName string.
enabled boolean.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
The getPackage keyword is a shortcut for the Download Package task.
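As a sketch, assuming a package resource with the alias myPackageAlias is declared under resources.packages (both the alias and the path are hypothetical):

```yaml
steps:
- getPackage: myPackageAlias # alias of a package resource (hypothetical name)
  path: $(Build.SourcesDirectory)/packages # where to download the package
```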
See also
Download Package task
steps.powershell definition
Article • 07/10/2023
The powershell step runs a script using either Windows PowerShell (on Windows) or
pwsh (Linux and macOS).
YAML
steps:
- powershell: string # Required as first property. Inline PowerShell or reference to a PowerShell file.
  errorActionPreference: string # Unless otherwise specified, the error action preference defaults to the value stop. See the following section for more information.
  failOnStderr: string # Fail the task if output is sent to Stderr?
  ignoreLASTEXITCODE: string # Check the final exit code of the script to determine whether the step succeeded?
  workingDirectory: string # Start the script with this working directory.
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
powershell string. Required as first property.
errorActionPreference string.
Unless otherwise specified, the error action preference defaults to the value stop. See
the following section for more information.
failOnStderr string.
Fail the task if output is sent to Stderr?
ignoreLASTEXITCODE string.
Check the final exit code of the script to determine whether the step succeeded?
workingDirectory string.
condition string.
continueOnError boolean.
displayName string.
target target.
enabled boolean.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
The powershell keyword is a shortcut for the PowerShell task. The task runs a script
using either Windows PowerShell (on Windows) or pwsh (Linux and macOS).
Each PowerShell session lasts only for the duration of the job in which it runs. Tasks that
depend on what has been bootstrapped must be in the same job as the bootstrap.
When the error action preference is set to stop, errors cause PowerShell to terminate the
task and return a nonzero exit code. The task is also marked as Failed.
YAML
steps:
- powershell: |
    Write-Error 'Uh oh, an error occurred'
    Write-Host 'Trying again...'
  displayName: Error action preference
  errorActionPreference: continue
YAML
ignoreLASTEXITCODE: boolean
YAML
steps:
- powershell: git nosuchcommand
  displayName: Ignore last exit code
  ignoreLASTEXITCODE: true
Examples
YAML
steps:
- powershell: Write-Host Hello $(name)
  displayName: Say hello
  name: firstStep
  workingDirectory: $(build.sourcesDirectory)
  failOnStderr: true
  env:
    name: Microsoft
See also
PowerShell task
Learn more about conditions and timeouts
steps.publish definition
Article • 07/10/2023
The publish keyword publishes (uploads) a file or folder as a pipeline artifact that other
jobs and pipelines can consume.
YAML
steps:
- publish: string # Required as first property. The publish step is a shortcut for the PublishPipelineArtifact@1 task. The task publishes (uploads) a file or folder as a pipeline artifact that other jobs and pipelines can consume.
  artifact: string # Artifact name.
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
publish string. Required as first property.
The publish step is a shortcut for the PublishPipelineArtifact@1 task. The task publishes
(uploads) a file or folder as a pipeline artifact that other jobs and pipelines can consume.
artifact string.
Artifact name.
condition string.
continueOnError boolean.
Continue running even on failure?
displayName string.
target target.
enabled boolean.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
The publish keyword is a shortcut for the Publish Pipeline Artifact task.
Examples
YAML
steps:
- publish: $(Build.SourcesDirectory)/build
  artifact: WebApp
  displayName: Publish artifact WebApp
See also
Publish Pipeline Artifact task
Publishing artifacts
steps.pwsh definition
Article • 07/10/2023
The pwsh step runs a script in PowerShell Core on Windows, macOS, and Linux.
YAML
steps:
- pwsh: string # Required as first property. Inline PowerShell or reference to a PowerShell file.
  errorActionPreference: string # Unless otherwise specified, the error action preference defaults to the value stop. See the following section for more information.
  failOnStderr: string # Fail the task if output is sent to Stderr?
  ignoreLASTEXITCODE: string # Check the final exit code of the script to determine whether the step succeeded?
  workingDirectory: string # Start the script with this working directory.
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
pwsh string. Required as first property.
errorActionPreference string.
Unless otherwise specified, the error action preference defaults to the value stop. See
the following section for more information.
failOnStderr string.
ignoreLASTEXITCODE string.
Check the final exit code of the script to determine whether the step succeeded?
workingDirectory string.
condition string.
continueOnError boolean.
displayName string.
target target.
enabled boolean.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
The pwsh keyword is a shortcut for the PowerShell task when that task's pwsh value is
set to true. The task runs a script in PowerShell Core on Windows, macOS, and Linux.
Learn more about conditions and timeouts.
Each PowerShell session lasts only for the duration of the job in which it runs. Tasks that
depend on what has been bootstrapped must be in the same job as the bootstrap.
Examples
YAML
steps:
- pwsh: Write-Host Hello $(name)
  displayName: Say hello
  name: firstStep
  workingDirectory: $(build.sourcesDirectory)
  failOnStderr: true
  env:
    name: Microsoft
See also
PowerShell task
Learn more about conditions and timeouts
steps.reviewApp definition
Article • 07/10/2023
The reviewApp step creates a resource dynamically under a deploy phase provider.
YAML
steps:
- reviewApp: string # Required as first property. Use this task under deploy phase provider to create a resource dynamically.
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
reviewApp string. Required as first property.
Use this task under deploy phase provider to create a resource dynamically.
condition string.
continueOnError boolean.
displayName string.
target target.
enabled boolean.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
The reviewApp keyword is a shortcut for the Review App task.
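This section includes no example; as a hedged sketch, reviewApp typically appears inside a deployment job's strategy hooks so that a resource can be cloned dynamically for a review app (the environment name, resource name, and pull request variable below are hypothetical):

```yaml
jobs:
- deployment: DeployReviewApp
  environment: smarthotel-dev.$(System.PullRequest.PullRequestId) # hypothetical environment.resource
  strategy:
    runOnce:
      deploy:
        steps:
        - reviewApp: MainNamespace # hypothetical resource to clone dynamically
```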
See also
Review App task
steps.script definition
Article • 07/10/2023
The script step runs a script using cmd.exe on Windows and Bash on other platforms.
YAML
steps:
- script: string # Required as first property. An inline script.
  failOnStderr: string # Fail the task if output is sent to Stderr?
  workingDirectory: string # Start the script with this working directory.
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
script string. Required as first property.
An inline script.
failOnStderr string.
workingDirectory string.
condition string.
continueOnError boolean.
displayName string.
enabled boolean.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
The script keyword is a shortcut for the command-line task. The task runs a script
using cmd.exe on Windows and Bash on other platforms.
Examples
If you don't specify a command mode, you can shorten the target structure to:
YAML
- script:
  target: string # container name or the word 'host'
YAML
steps:
- script: echo Hello world!
  displayName: Say hello
See also
command-line task
steps.task definition
Article • 07/10/2023
YAML
steps:
- task: string # Required as first property. Name of the task to run.
  inputs: # Inputs for the task.
    string: string # Name/value pairs
  condition: string # Evaluate this condition expression to determine whether to run this task.
  continueOnError: boolean # Continue running even on failure?
  displayName: string # Human-readable name for the task.
  target: string | target # Environment in which to run this task.
  enabled: boolean # Run this task when the job runs?
  env: # Variables to map into the process's environment.
    string: string # Name/value pairs
  name: string # ID of the step.
  timeoutInMinutes: string # Time to wait for this task to complete before the server kills it.
  retryCountOnTaskFailure: string # Number of retries if the task fails.
Properties
task string. Required as first property.
condition string.
continueOnError boolean.
displayName string.
target target.
name string.
timeoutInMinutes string.
Time to wait for this task to complete before the server kills it. For example, to configure a 10 minute timeout, use timeoutInMinutes: 10.
Note
Pipelines may be configured with a job level timeout. If the job level timeout
interval elapses before your step completes, the running job (including your step) is
terminated, even if the step is configured with a longer timeoutInMinutes interval.
For more information, see Timeouts.
retryCountOnTaskFailure string.
Remarks
Tasks are the building blocks of a pipeline. There's a catalog of tasks available to choose
from.
If you don't specify a command mode, you can shorten the target structure to:
YAML
- task:
  target: string # container name or the word 'host'
Examples
YAML
steps:
- task: VSBuild@1
  displayName: Build
  timeoutInMinutes: 120
  inputs:
    solution: '**\*.sln'
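Tasks often combine condition and retryCountOnTaskFailure, for example to publish results even when a previous step failed and to retry flaky uploads. A sketch (the task choice and input values are illustrative):
YAML
steps:
- task: PublishTestResults@2
  displayName: Publish test results
  condition: succeededOrFailed()
  retryCountOnTaskFailure: 2
  inputs:
    testResultsFiles: '**/TEST-*.xml'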
See also
Tasks
Catalog of tasks
steps.template definition
Article • 07/10/2023
Define a set of steps in one file and use it multiple times in another file.
YAML
steps:
- template: string # Required as first property. Reference to a template for this step.
  parameters: # Parameters used in a step template.
Properties
template string. Required as first property.
Examples
In the main pipeline:
YAML
steps:
- template: string # reference to template
  parameters: { string: any } # provided parameters
YAML
# File: steps/build.yml
steps:
- script: npm install
- script: npm test
YAML
# File: azure-pipelines.yml
jobs:
- job: macOS
  pool:
    vmImage: macOS-latest
  steps:
  - template: steps/build.yml # Template reference
- job: Linux
  pool:
    vmImage: ubuntu-latest
  steps:
  - template: steps/build.yml # Template reference
- job: Windows
  pool:
    vmImage: windows-latest
  steps:
  - template: steps/build.yml # Template reference
  - script: sign # Extra step on Windows only
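Step templates can also declare parameters, which the caller supplies with the parameters keyword. A sketch under assumed names (the file steps/test.yml and the nodeVersion parameter are hypothetical):
YAML
# File: steps/test.yml
parameters:
- name: nodeVersion
  type: string
  default: '18.x'
steps:
- task: NodeTool@0
  inputs:
    versionSpec: ${{ parameters.nodeVersion }}
- script: npm test
YAML
# File: azure-pipelines.yml
steps:
- template: steps/test.yml # Template reference
  parameters:
    nodeVersion: '20.x'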
See also
See templates for more about working with templates.
target definition
Article • 07/10/2023
Tasks run in an execution context, which is either the agent host or a container.
Implementations
Implementation Description
target: string Specify a step target by name.
target: container, commands, settableVariables Configure step target with environment, and allowed list of commands and variables.
Remarks
An individual step may override its context by specifying a target , and optionally
configure a container, commands, and settable variables.
target: string
Specify a step target by name.
YAML
target: string # Environment in which to run this task.
target string.
Available options are the word host to target the agent host plus any containers
defined in the pipeline.
YAML
target:
  container: string # Container to target (or 'host' for host machine).
  commands: string # Set of allowed logging commands ('any' or 'restricted').
  settableVariables: none | [ string ] # Restrictions on which variables can be set.
Properties
container string.
commands string.
settableVariables target.settableVariables.
Remarks
You don't need to configure all of these properties when configuring a step target. If not
specified, the default value for container is host , the default value of commands is any ,
and the default value for settableVariables allows all variables to be set by a step.
Azure Pipelines supports running jobs either in containers or on the agent host.
Previously, an entire job was set to one of those two targets. Now, individual steps (tasks
or scripts) can run on the target you choose. Steps may also target other containers, so a
pipeline could run each step in a specialized, purpose-built container.
Note
This feature is in public preview. If you have any feedback or questions about this
feature, let us know in the Developer Community .
Containers can act as isolation boundaries, preventing code from making unexpected
changes on the host machine. The way steps communicate with and access services
from the agent is not affected by isolating steps in a container. Therefore, we're also
introducing a command restriction mode which you can use with step targets. Setting
commands to restricted will restrict the services a step can request from the agent. It will no longer be able to attach logs, upload artifacts, or perform certain other operations.
Examples
The following example shows running steps on the host in a job container, and in
another container.
YAML
resources:
  containers:
  - container: python
    image: python:3.8
  - container: node
    image: node:13.2
jobs:
- job: example
  container: python
  steps:
  - script: echo Running in the job container
  - script: echo Running on the host
    target: host
  - script: echo Running in another container
    target: node
See also
Task types & usage - step target
target.settableVariables definition
Article • 07/10/2023
Implementations
Implementation Description
settableVariables: none Disable a step from setting any variables.
settableVariables: [ string ] Restrict variable setting to a list of allowed variables.
Remarks
You can disable setting all variables for a step, or restrict the settable variables to a list. If
the settableVariables property is not set, the default allows all variables to be set by a
step.
settableVariables: none
Disable a step from setting any variables.
YAML
settableVariables: none
Examples
YAML
steps:
- script: echo This is a step
  target:
    settableVariables: none
settableVariables: [ string ]
Restrict variable setting to a list of allowed variables.
YAML
settableVariables: [ string ]
Examples
In the following example, the bash step can only set the value of the sauce variable.
When the pipeline runs, the secretSauce variable is not set, and a warning is displayed
on the pipeline run page.
YAML
steps:
- bash: |
    echo "##vso[task.setvariable variable=sauce;]crushed tomatoes"
    echo "##vso[task.setvariable variable=secretSauce;]crushed tomatoes with garlic"
  target:
    settableVariables:
    - sauce
  name: SetVars
- bash: |
    echo "Sauce is $(sauce)"
    echo "secretSauce is $(secretSauce)"
  name: OutputVars
See also
Configure settable variables for steps
trigger definition
Article • 07/10/2023
A push trigger specifies which branches cause a continuous integration build to run.
Implementations
Implementation Description
trigger: none Disable CI triggers.
trigger: [ string ] List of branches that trigger a run.
trigger: batch, branches, paths, tags Full syntax for complete control.
Remarks
For more information about using triggers with a specific repository type, see Supported
source repositories.
There are three distinct syntax options for the trigger keyword: a list of branches to
include, a way to disable CI triggers, and the full syntax for complete control.
If you specify an exclude clause without an include clause for branches , tags , or
paths , it is equivalent to specifying * in the include clause.
trigger: none
Disable CI triggers.
YAML
trigger: none # Disable CI triggers.
trigger: [ string ]
List of branches that trigger a run.
YAML
trigger: [ string ] # List of branches that trigger a run.
Examples
YAML
trigger:
- main
- develop
trigger: batch, branches, paths, tags
Full syntax for complete control.
YAML
trigger:
  batch: boolean # Whether to batch changes per branch.
  branches: # Branch names to include or exclude for triggering a run.
    include: [ string ] # List of items to include.
    exclude: [ string ] # List of items to exclude.
  paths: # File paths to include or exclude for triggering a run.
    include: [ string ] # List of items to include.
    exclude: [ string ] # List of items to exclude.
  tags: # Tag names to include or exclude for triggering a run.
    include: [ string ] # List of items to include.
    exclude: [ string ] # List of items to exclude.
Properties
batch boolean.
branches includeExcludeFilters.
paths includeExcludeFilters.
tags includeExcludeFilters.
Remarks
If you have many team members uploading changes often, you may want to reduce the
number of runs you start. If you set batch to true , when a pipeline is running, the
system waits until the run is completed, then starts another run with all changes that
have not yet been built. By default, batch is false .
Note
For more information, see Triggers - CI triggers and choose your repository type.
Examples
YAML
trigger:
  batch: true
  branches:
    include:
    - features/*
    exclude:
    - features/experimental/*
  paths:
    exclude:
    - README.md
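The tags filter follows the same include/exclude pattern. A sketch (the tag names are illustrative) that triggers CI for version tags while skipping pre-release tags:
YAML
trigger:
  tags:
    include:
    - v2.*
    exclude:
    - v2.0-beta*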
See also
Learn more about triggers and how to specify them.
variables definition
Article • 07/10/2023
Implementations
Implementation Description
variables: [ variable | variableGroup | templateReference ] List of variables, variable groups, and variable templates.
variables: { string: string } Mapping of variable names to values.
Remarks
The variables keyword uses two syntax forms: variable list and mapping (string
dictionary).
In mapping syntax, all keys are variable names and their values are variable values. To
use variable templates, you must use list syntax. List syntax requires you to specify
whether you're mentioning a variable ( name ), a variable group ( group ), or a template
( template ).
You can't use list and mapping variables in the same variables section, but you can
combine name , group , and template when using list syntax.
variables: { string: string }
YAML
variables:
  string: string # Name/value pairs
Properties
None.
Examples
For a simple set of hard-coded variables, use this mapping syntax:
YAML
variables: # pipeline-level
  MY_VAR: 'my value'
  ANOTHER_VAR: 'another value'
stages:
- stage: Build
  variables: # stage-level
    STAGE_VAR: 'that happened'
  jobs:
  - job: FirstJob
    variables: # job-level
      JOB_VAR: 'a job var'
    steps:
    - script: echo $(MY_VAR) $(STAGE_VAR) $(JOB_VAR)
Examples
To include variable groups, switch to this sequence syntax:
YAML
variables:
- name: string # name of a variable
  value: string # value of the variable
- group: string # name of a variable group
YAML
variables:
- name: myReadOnlyVar
  value: myValue
  readonly: true
Sequence syntax:
YAML
variables:
- name: MY_VARIABLE # hard-coded value
  value: some value
- group: my-variable-group-1 # variable group
- group: my-variable-group-2 # another variable group
See also
Add & use variable groups
Define variables
variables.group definition
Article • 07/10/2023
YAML
variables:
- group: string # Required as first property. Variable group name.
Properties
group string. Required as first property.
Examples
YAML
variables:
- group: my-variable-group
YAML
variables:
- group: my-variable-group
- name: my-bare-variable
  value: 'value of my-bare-variable'
See also
Add & use variable groups
Define variables
variables.name definition
Article • 07/10/2023
YAML
variables:
- name: string # Required as first property. Variable name.
  value: string # Variable value.
  readonly: boolean # Whether a YAML variable is read-only; default is false.
Properties
name string. Required as first property.
Variable name.
value string.
Variable value.
readonly boolean.
Remarks
If you want to reference a variable group and define variables in the same variables section, you must define your variables using the full name/value syntax.
Examples
YAML
variables:
- name: one
  value: initialValue
- name: two
  value: value2
See also
Add & use variable groups
Define variables
variables.template definition
Article • 08/10/2023
You can define a set of variables in one file and use it multiple times in other files.
YAML
variables:
- template: string # Required as first property. Template file with variables.
  parameters: # Parameters to map into the template.
Properties
template string. Required as first property.
Examples
In this example, a set of variables is repeated across multiple pipelines. The variables are
specified only once.
YAML
# File: variables/build.yml
variables:
- name: vmImage
  value: vs2017-win2016
- name: arch
  value: x64
- name: config
  value: debug
YAML
# File: component-x-pipeline.yml
variables:
- template: variables/build.yml # Template reference
pool:
  vmImage: ${{ variables.vmImage }}
steps:
- script: build x ${{ variables.arch }} ${{ variables.config }}
YAML
# File: component-y-pipeline.yml
variables:
- template: variables/build.yml # Template reference
pool:
  vmImage: ${{ variables.vmImage }}
steps:
- script: build y ${{ variables.arch }} ${{ variables.config }}
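Variable templates can also declare parameters that are expanded where the template is included. A sketch under assumed names (the file variables/environment.yml and the environment parameter are hypothetical):
YAML
# File: variables/environment.yml
parameters:
- name: environment
  type: string
  default: dev
variables:
- name: configPath
  value: config/${{ parameters.environment }}.json
YAML
# File: azure-pipelines.yml
variables:
- template: variables/environment.yml # Template reference
  parameters:
    environment: prod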
See also
Template usage reference
Template parameters
Define variables
boolean definition
Article • 07/10/2023
YAML
boolean: true | y | yes | on | false | n | no | off
Azure Pipelines uses any of the previous string values to represent a boolean value in a pipeline.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
deployHook definition
YAML
deployHook:
  steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
  pool: string | pool # Pool where deploy steps will run.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
Properties
steps steps.
pool pool.
includeExcludeFilters definition
YAML
includeExcludeFilters:
  include: [ string ] # List of items to include.
  exclude: [ string ] # List of items to exclude.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
Properties
include string list.
exclude string list.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
includeExcludeStringFilters definition
YAML
includeExcludeStringFilters:
  include: [ string ]
  exclude: [ string ]
Properties
include string list.
exclude string list.
mountReadOnly definition
YAML
mountReadOnly:
  work: boolean # Mount the work directory as readonly.
  externals: boolean # Mount the externals directory as readonly.
  tools: boolean # Mount the tools directory as readonly.
  tasks: boolean # Mount the tasks directory as readonly.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
Properties
work boolean.
externals boolean.
tools boolean.
tasks boolean.
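mountReadOnly is specified on a job's container. The following sketch (the image name is illustrative) protects the agent's externals, tasks, and tools directories while leaving the work directory writable:
YAML
jobs:
- job: build
  container:
    image: ubuntu:20.04
    mountReadOnly:
      externals: true
      tasks: true
      tools: true
      work: false
  steps:
  - script: echo Building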
onFailureHook definition
YAML
onFailureHook:
  steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
  pool: string | pool # Pool where post on failure steps will run.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
Properties
steps steps.
pool pool.
onSuccessHook definition
YAML
onSuccessHook:
  steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
  pool: string | pool # Pool where on success steps will run.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
Properties
steps steps.
pool pool.
onSuccessOrFailureHook definition
YAML
onSuccessOrFailureHook:
  failure: # Runs on failure of any step.
    steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
    pool: string | pool # Pool where post on failure steps will run.
  success: # Runs on success of all of the steps.
    steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
    pool: string | pool # Pool where on success steps will run.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
Properties
failure onFailureHook.
success onSuccessHook.
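These hooks are declared under the on: section of a deployment strategy. A sketch (the environment name and commands are illustrative) using the runOnce strategy:
YAML
jobs:
- deployment: DeployWeb
  environment: smarthotel-dev
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo Deploying
      on:
        failure:
          steps:
          - script: echo Cleaning up after a failed deployment
        success:
          steps:
          - script: echo Notifying that deployment succeeded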
postRouteTrafficHook definition
Used to run the steps after the traffic is routed. Typically, these tasks monitor the health of the updated version for a defined interval.
YAML
postRouteTrafficHook:
  steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
  pool: string | pool # Pool where post route traffic steps will run.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
Properties
steps steps.
pool pool.
preDeployHook definition
Used to run steps that initialize resources before application deployment starts.
YAML
preDeployHook:
  steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
  pool: string | pool # Pool where pre deploy steps will run.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
Properties
steps steps.
pool pool.
routeTrafficHook definition
Used to run steps that serve the traffic to the updated version.
YAML
routeTrafficHook:
  steps: [ task | script | powershell | pwsh | bash | checkout | download | downloadBuild | getPackage | publish | template | reviewApp ] # A list of steps to run.
  pool: string | pool # Pool where route traffic steps will run.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
Properties
steps steps.
pool pool.
workspace definition
YAML
workspace:
  clean: string # Which parts of the workspace should be scorched before fetching.
Note
This definition is a supporting definition and is not intended for use directly in a
pipeline. This article provides the YAML syntax for this supporting type, but does
not show usage examples. For more information on using the definitions that this
type supports, see the following definition links.
Properties
clean string.
Which parts of the workspace should be scorched before fetching: outputs | resources | all.
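workspace is configured at the job level. A minimal sketch that scorches the entire workspace before fetching sources:
YAML
jobs:
- job: build
  workspace:
    clean: all
  steps:
  - checkout: self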