Terra

Terra is a fully managed bioinformatics platform powered by Google Cloud Platform and Cromwell, with native support for WDL workflows. This guide demonstrates how to run the 02_download_collection_files.wdl workflow in Terra; running it typically takes 5-10 minutes.

Prerequisites

Before starting, you'll need:

  • Google Account

  • Google Cloud Platform Account (includes $300 free compute credit for a 90-day trial)

  • Terra Account

  • Optional: git

Refer to Terra's getting started guide for linking your GCP account and billing information.

Initial Setup

Get the Example Workflow

Download the worked examples repository containing 02_download_collection_files.wdl:

Clone with Git

git clone https://github.com/DNAstack/dnastack-client-library-worked-examples.git

Or Download and Unzip
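
If you don't have git installed, you can fetch a zip archive of the repository directly from GitHub and extract it. The commands below are a minimal sketch using curl and unzip, and they assume the repository's default branch is named main; adjust the branch name if it differs.

# Download a zip archive of the repository's default branch (assumed here to be "main")
curl -L -o dnastack-client-library-worked-examples.zip https://github.com/DNAstack/dnastack-client-library-worked-examples/archive/refs/heads/main.zip

# Extract it; 02_download_collection_files.wdl is inside the unpacked directory
unzip dnastack-client-library-worked-examples.zip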

Create a workspace that will house the workflow and output files. A cloud environment is not needed to run WDL workflows and can be skipped when prompted.

[Animated screenshot: creating a workspace in Terra (create-workspace1.gif)]

Importing a workflow

  • Navigate to "Workflows" and select "Find a Workflow"

  • Select "Broad Methods Repository" and log in with your Google account

  • Choose "Create New Method"

  • Upload 02_download_collection_files.wdl or paste its contents

  • Export to your Terra workspace using Blank Configuration
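
Optionally, before uploading, you can check that the WDL parses cleanly using WOMtool, the validation utility distributed with Cromwell releases. This is a sketch that assumes Java is installed and that you have downloaded a WOMtool JAR (the exact filename depends on the release you choose).

# Validate the workflow locally before uploading it to the Broad Methods Repository
java -jar womtool.jar validate 02_download_collection_files.wdl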

Running a workflow

  • Return to your workspace and select your workflow

  • Choose "Run workflow with inputs defined by file paths"

  • Click "Run Analysis" to begin execution

  • Monitor progress in the "Job History" tab

  • Access completed files in the "Data" tab under Files

Downloading files from Terra's Google Cloud Bucket may incur charges based on file size.
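
As an alternative to downloading through the browser, files in the workspace bucket can be copied with gsutil from the Google Cloud SDK. This is a minimal sketch; the bucket name and object path below are placeholders (the actual bucket is shown on your workspace Dashboard under Cloud Information), and the same egress charges apply.

# Copy a completed output file (or a whole folder, with -r) from the workspace bucket
gsutil cp -r gs://<workspace-bucket>/<path-to-outputs> ./terra-outputs/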
