As a DevOps administrator or developer, you can create custom scripts that extend the capability of Code Stream.

With your script, you can integrate Code Stream with your own Continuous Integration (CI) and Continuous Delivery (CD) tools and APIs that build, test, and deploy your applications. Custom scripts are especially useful if you do not expose your application APIs publicly.

Your custom script can do almost anything you need so that your build, test, and deploy tools integrate with Code Stream. For example, your script can work with your pipeline workspace to support continuous integration tasks that build and test your application, and continuous delivery tasks that deploy your application. It can also send a message to Slack when a pipeline finishes, and much more.

The Code Stream pipeline workspace supports Docker and Kubernetes for continuous integration tasks and custom tasks.

For more information about configuring the workspace, see Configuring the Pipeline Workspace.

You write your custom script in one of the supported languages. In the script, you include your business logic, and define inputs and outputs. Output types can include number, string, text, and password. You can create multiple versions of a custom script with different business logic, input, and output.

The scripts that you create reside in your Code Stream instance. You can import YAML code to create a custom integration or export your script as a YAML file to use in another Code Stream instance.

Your pipeline runs a released version of your script in a custom task. If you have multiple released versions, you can set one of them as the latest so that it appears with latest --> when you select the custom task.

When a pipeline uses a custom integration, if you attempt to delete the custom integration, an error message appears and indicates that you cannot delete it.

Deleting a custom integration removes all versions of your custom script. If you have an existing pipeline with a custom task that uses any version of the script, that pipeline will fail. To ensure that existing pipelines do not fail, you can deprecate and withdraw the version of your script that you no longer want used. If no pipeline is using that version, you can delete it.

Table 1. What you do after you write your custom script
What you do... More information about this action...

Add a custom task to your pipeline.

The custom task:

  • Runs on the same container as other CI tasks in your pipeline.
  • Includes input and output variables that your script populates before the pipeline runs the custom task.
  • Supports multiple data types and various types of metadata that you define as inputs and outputs in your script.

Select your script in the custom task.

You declare the input and output properties in the script.

Save your pipeline, then enable and run it.

When the pipeline runs, the custom task calls the version of the script specified and runs the business logic in it, which integrates your build, test, and deploy tool with Code Stream.

After your pipeline runs, look at the executions.

Verify that the pipeline delivered the results you expected.

When you use a custom task that calls a Custom Integration version, you can include custom environment variables as name-value pairs on the pipeline Workspace tab. When the builder image creates the workspace container that runs the CI task and deploys your image, Code Stream passes the environment variables to that container.

For example, when your Code Stream instance requires a Web proxy, and you use a Docker host to create a container for a custom integration, Code Stream runs the pipeline and passes the Web proxy setting variables to that container.

Table 2. Example environment variable name-value pairs
Name Value
NO_PROXY *.dept.vsphere.local
no_proxy *.dept.vsphere.local
PATH /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

You enter the name-value pairs on the Workspace tab in the user interface, and Code Stream passes the environment variables to the container that the builder image creates.
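For instance, a custom script that runs in the workspace container can read those variables directly from its environment. A minimal sketch in shell, assuming the NO_PROXY value from Table 2 was passed to the container (the host name is illustrative):

```shell
#!/bin/sh
# The workspace container inherits the environment variables that you enter
# on the Workspace tab, so the script reads them like any other variable.
# Fallback value for local testing only; the pipeline normally supplies it.
NO_PROXY="${NO_PROXY:-*.dept.vsphere.local}"

# Decide whether a host should bypass the proxy (simple suffix match).
host="build01.dept.vsphere.local"
suffix="${NO_PROXY#\*}"          # strip the leading * from the pattern
case "$host" in
  *"$suffix") echo "bypass proxy for $host" ;;
  *)          echo "use proxy for $host" ;;
esac
```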

This example creates a custom integration that connects Code Stream to your Slack instance, and posts a message to a Slack channel.


  • To write your custom script, verify that you have one of these languages: Python 2, Python 3, Node.js, or any of the shell languages: Bash, sh, or zsh.
  • Generate a container image by using the installed Node.js or the Python runtime.


  1. Create the custom integration.
    1. Click Custom Integrations > New, and enter a relevant name.
    2. Select the preferred runtime environment.
    3. Click Create.
      Your script opens, and displays the code, which includes the required runtime environment. For example, runtime: "nodejs". The script must include the runtime, which the builder image uses, so that the custom task that you add to your pipeline succeeds when the pipeline runs. Otherwise, the custom task fails.
    The main areas of your custom integration YAML include the runtime, code, input properties, and output properties. This procedure explains various types and syntax.
    Custom integration YAML keys Description

    runtime Task runtime environment where Code Stream runs the code, which can be one of these case-insensitive strings:

    • nodejs
    • python2
    • python3
    • shell

    If nothing is provided, shell is the assumed default.

    code Custom business logic to run as part of the custom task.
    inputProperties Array of input properties to capture as part of the custom task configuration. These properties are normally used in the code.
    outputProperties Array of output properties you can export from the custom task to propagate to the pipeline.
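    Taken together, the keys above give a custom integration its overall shape. A minimal sketch (the property names and values here are illustrative, not part of the product):

    ```yaml
    runtime: "nodejs"
    code: |
        var context = require("./context.js")
        var message = context.getInput("message");
        // Your business logic
        context.setOutput("statusCode", 200);
    inputProperties:
        - name: message
          type: text
          title: Message
    outputProperties:
        - name: statusCode
          type: label
          title: Status Code
    ```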
  2. Declare the input properties in your script by using the available data types and metadata.
    The input properties are passed in as context to your script in the code: section of the YAML.
    Custom task YAML input keys Description Required
    type Types of input to render: text, textarea, number, checkbox, password, or select. Yes
    name Name or string of the input to the custom task, which gets injected into the custom integration YAML code. Must be unique for each input property defined for a custom integration. Yes
    title Text string label of the input property for the custom task on the pipeline model canvas. If left empty, name is used by default. No
    required Determines whether a user must enter the input property when they configure the custom task. Set to true or false. When true, if a user does not provide a value when they configure the custom task on the pipeline canvas, the state of the task remains as unconfigured. No
    placeHolder Default text for the input property entry area when no value is present. Maps to the html placeholder attribute. Only supported for certain input property types. No
    defaultValue Default value that populates the input property entry area when the custom task renders on the pipeline model page. No
    bindable Determines whether the input property accepts dollar sign variables when modeling the custom task on the pipeline canvas. Adds the $ indicator next to the title. Only supported for certain input property types. No
    labelMessage String that acts as a help tooltip for users. Adds a tooltip icon i next to the input title. No
    enum Takes in an array of values that display as the options of a select input property. Only supported for certain input property types. No

    When a user selects an option and saves it for the custom task, the value of the inputProperty corresponds to the selected value and appears in the custom task modeling. For example, an enum might list the years:

    • 2015
    • 2016
    • 2017
    • 2018
    • 2019
    • 2020
    options Takes in an array of objects by using optionKey and optionValue. Only supported for certain input property types. No

    • optionKey. Value propagated to the code section of the task. For example, with optionKey: key1, when a user selects and saves that option for the custom task, the value of this inputProperty corresponds to key1 in the code section.
    • optionValue. String that displays the option in the user interface. For example, optionValue: 'Label for 1' is the display value for key1 in the user interface, and does not appear anywhere else for the custom task.

    Additional pairs follow the same pattern, such as optionKey: key2 with optionValue: 'Label for 2', and optionKey: key3 with optionValue: 'Label for 3'.

    minimum Takes in a number that acts as the minimum value that is valid for this input property. Only supported for number type input property. No
    maximum Takes in a number that acts as the maximum value that is valid for this input property. Only supported for number type input property. No
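    As an illustration, the enum and options keys described above might be combined with select inputs like this (the property names and labels are hypothetical):

    ```yaml
    inputProperties:
        - name: year
          type: select
          title: Year
          # enum renders each value directly as an option
          enum:
            - 2015
            - 2016
            - 2017
        - name: environment
          type: select
          title: Environment
          # options separate the stored value (optionKey)
          # from the display label (optionValue)
          options:
            - optionKey: key1
              optionValue: 'Label for 1'
            - optionKey: key2
              optionValue: 'Label for 2'
    ```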
    Table 3. Supported data types and metadata for custom scripts
    Supported data types Supported metadata for input
    • String
    • Text
    • List: as a list of any type
    • Map: as map[string]any
    • Secure: rendered as password text box, encrypted when you save the custom task
    • Number
    • Boolean: appears as a check box
    • URL: same as string, with additional validation
    • Selection, radio button
    • type: One of String | Text ...
    • default: Default value
    • options: List or a map of options, to be used with selection or radio button
    • min: Minimum value or size
    • max: Maximum value or size
    • title: Detailed name of the text box
    • placeHolder: UI placeholder
    • description: Becomes a tool tip
    For example:
            - name: message
              type: text
              title: Message
              placeHolder: Message for Slack Channel
              defaultValue: Hello Slack
              bindable: true
              labelInfo: true
              labelMessage: This message is posted to the Slack channel link provided in the code
  3. Declare the output properties in your script.
    The script captures output properties from the business logic code: section of your script, where you declare the context for the output.
    When the pipeline runs, you can enter the response code for the task output. For example, 200.
    Keys that Code Stream supports for each outputProperty.
    key Description
    type Currently includes a single value of label.
    name Key that the code block of the custom integration YAML emits.
    title Label in the user interface that displays outputProperty.
    For example:
      - name: statusCode
        type: label
        title: Status Code
  4. To interact with the input and output of your custom script, get an input property or set an output property by using context.
    For an input property: context.getInput("key")
    For an output property: context.setOutput("key", "value")
    For Node.js:
    var context = require("./context.js")
    var message = context.getInput("message");
    //Your Business logic
    context.setOutput("statusCode", 200);
    For Python:
    from context import getInput, setOutput
    message = getInput('message')
    # Your business logic
    setOutput('statusCode', '200')
    For Shell:
    # Input, Output properties are environment variables
    echo ${message} # Prints the input message
    # Your business logic
    export statusCode=200 # Sets output property statusCode
  5. In the code: section, declare all the business logic for your custom integration.
    For example, with the Node.js runtime environment:
    code: |
        var https = require('https');
        var context = require("./context.js")
        //Get the entered message from task config page and assign it to message var
        var message = context.getInput("message");
        var slackPayload = JSON.stringify({
            text: message
        });
        const options = {
            hostname: 'hooks.slack.com', // Slack incoming webhook host
            port: 443,
            path: '/YOUR_SLACK_WEBHOOK_PATH',
            method: 'POST',
            headers: {
                'Content-Type': 'application/json',
                'Content-Length': Buffer.byteLength(slackPayload)
            }
        };
        // Makes an https request and sets the output with statusCode, which
        // is displayed on the task result page after execution
        const req = https.request(options, (res) => {
            context.setOutput("statusCode", res.statusCode);
        });
        req.on('error', (e) => {
            console.error(e);
        });
        req.write(slackPayload);
        req.end();
  6. Before you version and release your custom integration script, download the context file for Python or Node.js and test the business logic that you included in your script.
    1. Place the pointer at the top of the canvas, then click the context file button. For example, if your script is in Python, click CONTEXT.PY.
    2. Modify the file and save it.
    3. On your development system, run and test your custom script with the help of the context file.
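    For local testing, the downloaded context file can be adjusted so that getInput and setOutput work without a pipeline run. A minimal stand-in sketch in Python (the real context file that Code Stream provides may differ; the sample values are hypothetical):

    ```python
    # Hypothetical local stand-in for the context.py file that Code Stream provides.
    # It backs getInput/setOutput with in-memory dictionaries so that the business
    # logic of a custom script can be exercised on a development system.
    _inputs = {"message": "Hello Slack"}   # sample values a custom task would inject
    _outputs = {}

    def getInput(key):
        """Return the configured value of an input property, or None if unset."""
        return _inputs.get(key)

    def setOutput(key, value):
        """Record an output property as the pipeline execution would capture it."""
        _outputs[key] = value

    if __name__ == "__main__":
        # Exercise the script's logic the same way the custom task would.
        message = getInput("message")
        setOutput("statusCode", "200")
        print(message, _outputs["statusCode"])
    ```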
  7. Apply a version to your custom integration script.
    1. Click Version.
    2. Enter the version information.
    3. Click Release Version so that you can select the script in your custom task.
    4. To create the version, click Create.
      You version your custom integration script and select the version in the custom task in your pipeline.
  8. (Optional) You can set any released version of a custom integration script as the latest so that the version appears with the latest --> label on the pipeline canvas.
    1. Place the pointer at the top of the canvas, then click Version History.
    2. To see available actions, click the horizontal ellipsis for the version that you want and select Set As Latest.
      Note: Only released versions appear with the Set As Latest action.
      After you version and release a custom integration script, you can set the version as latest so that a user knows the current version to select in their pipeline.
    3. To confirm the version selection, click Set As Latest.
    4. To exit Version History and return to the script editor canvas, click the back arrow.
  9. To save the script, click Save.
    To export your script as a YAML file to use in another Code Stream instance, click Actions > Export on the custom integration card and select the versions to export.
  10. In your pipeline, configure the workspace.
    This example uses a Docker workspace.
    1. Click the Workspace tab.
    2. Select the Docker host and the builder image URL.
      When you create a custom integration, you include the host, builder image URL, and image registry.
  11. Add a custom task to your pipeline, and configure it.
    1. Click the Model tab.
    2. Add a task, select the type as Custom, and enter a relevant name.
    3. Select your custom integration script and version. If a version of the script has been set as latest, that version appears with latest --> before the version name.
    4. To display a custom message in Slack, enter the message text.
      Any text you enter overrides the defaultValue in your custom integration script.
      When you add a custom task in your pipeline, you select a version of your custom script.
  12. Save and enable your pipeline.
    1. Click Save.
    2. On the pipeline card, click Actions > Enable.
  13. Run your pipeline.
    1. Click Run.
    2. Look at the pipeline execution.
    3. Confirm that the output includes the expected status code, response code, status, and declared output.
      You defined statusCode as an output property. For example, a statusCode of 200 might indicate a successful Slack post, and a responseCode of 0 might indicate that the script succeeded without error.
    4. To confirm the output in the execution logs, click Executions, click the link to your pipeline, click the task, and look at the logged data. For example:
      After the pipeline runs your custom task, you can view the task output for the custom integration in the pipeline executions.
  14. If an error occurs, troubleshoot the problem and run the pipeline again.
    For example, if a file or module in the base image is missing, you must create another base image that includes the missing file. Then, provide the Docker file, and push the image through the pipeline.


Congratulations! You created a custom integration script that connects Code Stream to your Slack instance, and posts a message to a Slack channel.

What to do next

Continue to create custom integrations that support custom tasks in your pipelines, so that you extend the capability of Code Stream as it automates your software release lifecycle.