When you configure your pipeline, you add specific types of tasks that the pipeline runs for the actions you need. Each task type integrates with another application and enables your pipeline as it builds, tests, and delivers your applications.

Whether your pipeline must pull artifacts from a repository for deployment, run a remote script, or wait for a team member to approve a user operation, Automation Pipelines has a task type for it.

Automation Pipelines supports canceling a pipeline run on various types of tasks. When you click Cancel on a task, stage, or the entire pipeline in an execution, it enters the canceling state and the pipeline run is canceled.

Automation Pipelines allows you to cancel the pipeline run on a task, stage, or the entire pipeline when using these tasks:
  • Jenkins
  • SSH
  • PowerShell
  • User Operation
  • Pipeline
  • Cloud template
  • vRO
  • POLL

Automation Pipelines does not propagate the cancel behavior to third-party systems for these tasks: CI, Custom Integration, or Kubernetes. Automation Pipelines marks the task as canceled and immediately stops fetching the status without waiting for the task to finish. The task might complete or fail on the third-party system but immediately stops running in Automation Pipelines when you click Cancel.

Before you use a task in your pipeline, verify that the corresponding endpoint is available.

Table 1. Obtain an approval or set a decision point
Type of task What it does Examples and details
User Operation

A User Operation task adds a required approval that controls when a pipeline run must stop and wait for a user to approve it.

See How do I run a pipeline and see results and How do I manage user access and approvals in Automation Pipelines.

Condition

Adds a decision point that determines whether the pipeline continues to run or stops, based on condition expressions. When the condition is true, the pipeline runs the subsequent tasks. When it is false, the pipeline stops.

See How do I use variable bindings in a condition task to run or stop a pipeline in Automation Pipelines.
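For example, a condition expression can compare a pipeline input or a previous task's output to a value. The following expression is only a hypothetical illustration: it assumes the variable-binding syntax described in the linked topic, and the input name, stage name, and task name are placeholders.

${input.environment} == "production" && ${Build.UnitTests.output.status} == "SUCCESS"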

Table 2. Automate continuous integration and deployment
Type of task What it does Examples and details
Cloud template

Deploys an automation cloud template from GitHub, provisions an application, and automates the continuous integration and continuous delivery (CICD) of that cloud template for your deployment.

See How do I automate the release of an application that I deploy from a YAML cloud template in Automation Pipelines.
CI

The CI task enables continuous integration of your code into your pipeline by pulling a Docker build image from a registry endpoint, and deploying it to a Kubernetes cluster.

The CI task displays 100 lines of the log as output, and displays 500 lines when you download the logs.

The CI task requires ephemeral ports 32768 to 61000.

See Planning a CICD native build in Automation Pipelines before using the smart pipeline template.

Custom

The Custom task integrates Automation Pipelines with your own build, test, and deploy tools.

See How do I integrate my own build, test, and deploy tools with Automation Pipelines.

Kubernetes

Automate the deployment of your software applications to Kubernetes clusters.

See How do I automate the release of an application in Automation Pipelines to a Kubernetes cluster.

Pipeline

Nests a pipeline in a primary pipeline. When a pipeline is nested, it behaves as a task in the primary pipeline.

On the Task tab of the primary pipeline, you can easily navigate to the nested pipeline by clicking the link to it. The nested pipeline opens in a new browser tab.

To find nested pipelines in Executions, enter nested in the search area.

Table 3. Integrate development, test, and deployment applications
Type of task What it does Examples and details
Bamboo

Interacts with a Bamboo continuous integration (CI) server, which continuously builds, tests, and integrates software in preparation for deployment and triggers code builds when developers commit changes. The task exposes the artifact locations that the Bamboo build produces as output parameters that other tasks can use for build and deployment.

Connect to a Bamboo server endpoint and start a Bamboo build plan from your pipeline.

Jenkins

Triggers Jenkins jobs that build and test your source code, runs test cases, and can use custom scripts.

See How do I integrate Automation Pipelines with Jenkins.

TFS

Allows you to connect your pipeline to Team Foundation Server to manage and invoke build projects, including configured jobs that build and test your code.

For versions of Team Foundation Server that Automation Pipelines supports, see What are Endpoints in Automation Pipelines.

vRO

Extends the capability of Automation Pipelines by running predefined or custom workflows in VMware Aria Automation Orchestrator.

Automation Pipelines supports basic authentication and token-based authentication for VMware Aria Automation Orchestrator, and uses the API token to authenticate and validate the VMware Aria Automation Orchestrator cluster. With token-based authentication, Automation Pipelines also supports VMware Aria Automation Orchestrator endpoints that use a Cloud Extensibility Proxy, so you can trigger workflows through an endpoint that uses the proxy.

See How do I integrate Automation Pipelines with VMware Aria Automation Orchestrator.

Table 4. Integrate other applications through an API
Type of task What it does Examples and details
REST

Integrates Automation Pipelines with other applications that use a REST API so that you can continuously develop and deliver software applications that interact with each other.

See How do I use a REST API to integrate Automation Pipelines with other applications.
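As a conceptual illustration, a REST task can make a call such as the following on your behalf. The URL, token variable, and payload shown here are hypothetical, not part of the product.

# Hypothetical REST call that a REST task might make to another application.
curl -X POST "https://issues.example.com/api/v2/tickets" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"summary": "Build 42 deployed", "status": "done"}'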

Poll

Invokes a REST API and polls it until the pipeline task meets the exit criteria and completes.

An Automation Pipelines administrator can set the poll count to a maximum of 10,000. The poll interval must be greater than or equal to 60 seconds.

When you select the Continue on failure check box, the poll task continues to run even if the count or interval exceeds these values.

POLL Iteration Count: Appears in the pipeline execution and displays the number of times the POLL task requested a response from the URL. For example, if the POLL input count is 65 but the POLL request actually ran 4 times, the iteration count in the pipeline execution output displays 4 (out of 65).

See How do I use a REST API to integrate Automation Pipelines with other applications.
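The following shell sketch illustrates the polling pattern that the Poll task automates for you. It is conceptual only: the endpoint URL, the JSON status field, the status values, and the use of jq are assumptions, not product behavior.

# Conceptual sketch of the polling pattern the Poll task automates.
url="https://build.example.com/api/status"   # hypothetical endpoint
interval=60                                  # poll interval in seconds, must be >= 60
max_count=10                                 # poll count, up to 10,000 if the administrator allows it

for (( i = 1; i <= max_count; i++ )); do
  status=$(curl -s "$url" | jq -r '.status') # assumes the response has a JSON "status" field
  echo "Poll iteration $i of $max_count: status=$status"
  if [[ "$status" == "SUCCESS" ]]; then
    exit 0                                   # exit criteria met, task completes
  elif [[ "$status" == "FAILED" ]]; then
    exit 1                                   # failure criteria met, task fails
  fi
  sleep "$interval"
done
exit 1                                       # exit criteria not met within the poll count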

Table 5. Run remote and user-defined scripts
Type of task What it does Examples and details
PowerShell

With the PowerShell task, Automation Pipelines can run script commands on a remote host. For example, a script can automate test tasks and run administrative commands.

The script can be remote or user-defined. It can connect over HTTP or HTTPS, and can use TLS.

The Windows host must have the winrm service configured, and winrm must have MaxShellsPerUser and MaxMemoryPerShellMB configured.

To run a PowerShell task, you must have an active session to the remote Windows host.
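For example, you might run the following commands on the Windows host to confirm that the winrm service responds and to review the current winrs settings before you run the task. These verification commands are illustrative; adjust them to your environment.

# Confirm that the winrm service is configured and list the current winrs settings.
winrm quickconfig
winrm get winrm/config/winrs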

PowerShell Command Line Length

If you enter a base64 PowerShell command, be aware that you must calculate the overall command length.

The Automation Pipelines pipeline encodes and wraps a base64 PowerShell command in another command, which increases the overall length of the command.

The maximum length allowed for a PowerShell winrm command is 8192 bytes. The command length limit is lower for the PowerShell task when it is encoded and wrapped. As a result, you must calculate the command length before you enter the PowerShell command.

The command length limit for the Automation Pipelines PowerShell task depends on the base64 encoded length of the original command. The command length is calculated as follows.

3 * (length of original command / 4) - (numberOfPaddingCharacters) + 77 (length of the Write-Output command)

The command length for Automation Pipelines must be less than the maximum limit of 8192.
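A minimal shell sketch that applies the formula above is shown here. The sample base64 string and variable names are placeholders, and the sketch assumes that the length in the formula refers to the base64 command that you enter in the task.

# Estimate the overall command length before entering a base64 PowerShell command.
encoded_command="SQBwAGMAbwBuAGYAaQBnAA=="              # placeholder base64 command
padding=$(grep -o '=' <<< "$encoded_command" | wc -l)   # number of padding characters
length=$(( 3 * ${#encoded_command} / 4 - padding + 77 ))
echo "Calculated command length: $length (must be less than 8192)"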

When you configure MaxShellsPerUser and MaxMemoryPerShellMB:

  • The acceptable value for MaxShellsPerUser is 500 for 50 concurrent pipelines, with 5 PowerShell tasks for each pipeline. To set the value, run: winrm set winrm/config/winrs '@{MaxShellsPerUser="500"}'
  • The acceptable memory value for MaxMemoryPerShellMB is 2048. To set the value, run: winrm set winrm/config/winrs '@{MaxMemoryPerShellMB="2048"}'

The script writes the output to a response file that another pipeline can consume.

SSH

The SSH task runs Bash shell script commands on a remote host. For example, a script can automate test tasks and run administrative commands.

The script can be remote or user-defined. It can connect over HTTP or HTTPS, and requires a private key or password.

The SSH service must be configured on the Linux host, and the SSHD configuration of MaxSessions must be set to 50.

If you run many SSH tasks concurrently, increase the MaxSessions and MaxOpenSessions on the SSH host. Do not use your VMware Aria Automation instance as the SSH host if you need to modify the MaxSessions and MaxOpenSessions configuration settings.

The SSH task does not support OpenSSH type private keys. Generate the public/private key pair using one of the following methods:
  • On a Windows machine, use PuTTYgen to generate the key pair.
  • On a Mac or Linux machine, use ssh -V to verify that the SSH version is earlier than 7.8, then use ssh-keygen to generate the key pair in a terminal window, as in the example after the note.
Note: Verify that the generated private key does not begin with BEGIN OPENSSH PRIVATE KEY.
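For example, on a Mac or Linux machine with an SSH version earlier than 7.8, key generation might resemble the following. The key file name is only an example.

# Generate an RSA key pair; OpenSSH versions earlier than 7.8 write the private key in PEM format.
ssh-keygen -t rsa -b 2048 -f ~/.ssh/pipeline_ssh_key
# The resulting private key must begin with BEGIN RSA PRIVATE KEY, not BEGIN OPENSSH PRIVATE KEY.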
If the generated public key is an authorized key on the remote machine, refer to one of the following articles to change the OpenSSH private key format:

When you configure the SSH task, enter the private key in plain text. Saving the key as a variable or input changes the key format, and the pipeline task fails to run.

The script can be remote or user-defined. For example, a script might resemble:

message="Hello World"
echo $message

The script writes the output to a response file that another pipeline can consume.