runs submit

Submit one or more Workflow Runs

Synopsis

omics workbench runs submit [INPUT_OVERRIDE...] 
  [--url=URL]
  [--workflow=WORKFLOW_ID]
  [--version=WORKFLOW_VERSION] 
  [--engine=ENGINE_ID]
  [--tags=JSON_DATA]
  [--default-params=JSON_DATA]
  [--engine-params=JSON_DATA]
  [--workflow-params=JSON_DATA [--workflow-params=JSON_DATA...]]
  [--dry-run]

Description

omics workbench runs submit submits one or more runs of a workflow against the same execution engine.

Examples

Submit a single workflow for execution using the default engine, with inputs passed as input overrides:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  hello.name=Suzy \
  hello.greeting=Hello

You can also provide --workflow and --version as an alternative to a workflow URL:

omics workbench runs submit \
  --workflow 3bce2a53-c9f1-4b3f-8024-a849e64adb97 \
  --version hello-world \
  hello.name=Suzy \
  hello.greeting=Hello

If the workflow inputs are stored in a file called inputs.json, you can use the special @ operator to load the file contents:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  @inputs.json
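
As a rough sketch, inputs.json for this example might look like the following, assuming inputs are keyed by the same fully qualified names used in the overrides above:

{
  "hello.name": "Suzy",
  "hello.greeting": "Hello"
}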

The above example can be modified to pass additional tag metadata. Tags are arbitrary key:value pairs that provide additional context and can later be used to filter runs:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --tags sample=suzy \
  hello.name=Suzy \
  hello.greeting=Hello

If you want to run the workflow using an engine other than the default, you can retrieve the engine ID either from the web application or from the engines list command:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --engine my-engine-12d \
  --tags sample=suzy \
  hello.name=Suzy \
  hello.greeting=Hello

To perform a dry run without actually submitting the workflow, use the --dry-run flag:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --dry-run \
  hello.name=Suzy \
  hello.greeting=Hello

Positional Arguments

INPUT_OVERRIDES

Positional JSON Data that takes precedence over --default-params and --workflow-params. If a key with the same name is already provided in --default-params or --workflow-params, it will be overridden with the value provided in the override.

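As a small illustration of that precedence, reusing the hello-world example from above (the Bonjour value is purely illustrative), the positional override wins over the value supplied through --default-params:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --default-params hello.greeting=Hello \
  hello.name=Suzy \
  hello.greeting=Bonjour

The submitted run receives Bonjour as its greeting, not Hello.
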
Flags

--url=URL

The URL to the workflow file (*.wdl). Only URLs from workflow-service are currently supported. A workflow URL is a string that can correspond to any of the following:

  • A URL from the workflows page in Workbench (i.e. https://workbench.dnastack/workflows/<workflow_id>/overview)

  • A complete URL for a workflow or a workflow version from the workflow service

  • A concatenation of <workflow_internal_id>/<version_name>. These values can be retrieved from the Workbench workflows page, or by using the workflows list or workflows describe commands

This flag is mutually exclusive with --workflow.

--workflow=WORKFLOW_ID

The internal ID of the workflow to run. The value can be retrieved from the Workbench workflows page, or by using the workflows list or workflows describe commands. This flag is mutually exclusive with --url.

--version=WORKFLOW_VERSION

Must be used in conjunction with --workflow to specify the version of the workflow to run.

--engine=ENGINE_ID

Use the given engine ID for the execution of runs. If this value is not defined, the default engine will be used. The engine ID can be retrieved from the engines page in Workbench or by using the engines list command.

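For example, you can list the engines available to you and copy the ID of the engine you want to target:

omics workbench engines list
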
--engine-params=JSON_DATA

Set the global engine parameters for all runs that are to be submitted. Engine parameters can be defined as a list of engine parameter preset IDs (which can be found using the engines parameters list command) and/or a list of JSON Data. The list is expected to be comma-separated, and the actual engine parameters differ for each configured engine. Note that any JSON data from a specified file will be overwritten by literal JSON, key-value pairs, or parameter presets with matching keys. For example:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --engine-params 963b7c83-0794-486c-93fd-88e77c484733,@input.json,key=val,'{"literal":"json"}'

--default-params=JSON_DATA

Specify the global default inputs as a JSON file or as inlined JSON to use when submitting multiple runs. Default inputs have the lowest level of precedence and will be overridden by any run input or override. Default params can be defined as JSON Data.

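As a sketch of file-based defaults, assuming defaults.json is a hypothetical local file of default inputs and that the @ file convention shown in the examples above also applies here:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --default-params @defaults.json \
  --workflow-params hello.name=Suzy \
  --workflow-params hello.name=Frank
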
--workflow-params=JSON_DATA

Specify the workflow params for a given run, as key:value pairs, a JSON file, or inlined JSON. Additionally, you may specify a directory that contains a list of parameter files in JSON format; each file will result in a separate run. This flag may be repeated, with each repetition specifying a separate run request that will be submitted as part of a batch. Workflow params can be defined as JSON Data.

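For instance, to submit one run per parameter file, where run1.json and run2.json are hypothetical files each holding the inputs for a single run (again assuming the @ file convention shown above):

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --workflow-params @run1.json \
  --workflow-params @run2.json
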
--tags=JSON_DATA

Set the global tags for all runs that are to be submitted. Tags can be defined as JSON Data.

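Since tags are supplied as JSON Data, a tag set can presumably also be passed as inline JSON rather than key=value pairs; this sketch assumes the same JSON Data forms accepted by --engine-params apply here, and the tag names are illustrative:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --tags '{"sample":"suzy","project":"hello-world-demo"}' \
  hello.name=Suzy \
  hello.greeting=Hello
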
--dry-run

Perform a dry run without actually submitting the workflow. This can be useful for validating the inputs and parameters before actual submission.

Advanced Topics

Batching

The CLI provides a powerful interface for submitting many workflow runs at once, with the ability to define default values and an arbitrary number of runs. The examples above submit inputs using input overrides. The same simple submission can be rewritten using the --workflow-params flag instead of overrides:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --workflow-params hello.name=Suzy,hello.greeting=Hello

Additionally, if we want to guarantee that every run has a specific value defined for hello.greeting, we can mix and match the --workflow-params flag with input overrides:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --workflow-params hello.name=Suzy \
  hello.greeting=Hello

Now imagine you want to submit the exact same workflow for Suzy, Frank, Peggie, and Brooklyn. One way would be to run the above command four separate times, changing hello.name each time; however, this increases the boilerplate needed. A simpler approach is to define multiple --workflow-params flags in the same command. You can specify --workflow-params as many times as you want; each occurrence is translated into a single submitted run of the given workflow:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --workflow-params hello.name=Suzy \
  --workflow-params hello.name=Frank \
  --workflow-params hello.name=Peggie \
  --workflow-params hello.name=Brooklyn \
  hello.greeting=Hello

Finally, say we would like to change the greeting for one of the runs but keep it the same for all the others. The above command uses an override, which fixes the input value for all runs. Instead, we can use the --default-params flag to specify the default values and then override them with a key:value input in the specific submission where we want a different value:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --default-params hello.greeting=Hello \
  --workflow-params hello.name=Suzy \
  --workflow-params hello.name=Frank \
  --workflow-params hello.name=Peggie \
  --workflow-params hello.name=Brooklyn,hello.greeting=Goodbye

The above command sets Hello as the default greeting which will be applied to all submissions except the final one which sets the greeting to Goodbye.
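
As a final sketch that pulls these pieces together (using the same hypothetical hello-world workflow), defaults, per-run params, and tags can be combined with --dry-run to validate an entire batch before anything is actually submitted:

omics workbench runs submit \
  --url 3bce2a53-c9f1-4b3f-8024-a849e64adb97/hello-world \
  --default-params hello.greeting=Hello \
  --tags batch=greetings-demo \
  --workflow-params hello.name=Suzy \
  --workflow-params hello.name=Frank \
  --dry-run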


