Table of Contents

  1. Preface
  2. Introduction to Big Data Management Administration
  3. Big Data Management Engines
  4. Authentication and Authorization
  5. Running Mappings on a Cluster with Kerberos Authentication
  6. Configuring Access to an SSL/TLS-Enabled Cluster
  7. Cluster Configuration
  8. Cluster Configuration Privileges and Permissions
  9. Cloud Provisioning Configuration
  10. Queuing
  11. Tuning for Big Data Processing
  12. Connections
  13. Multiple Blaze Instances on a Cluster

Big Data Management Administrator Guide

Cloud Provisioning Configuration Overview

A cloud provisioning configuration is an object in the domain that contains information about the cloud platform. The cloud provisioning configuration gives the Data Integration Service the information it needs to create a cluster on the cloud platform.
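The exact properties are defined in the Administrator tool and differ by cloud platform. Purely as a hypothetical sketch (the keys below are illustrative, not the product's actual property names), the kind of information a cloud provisioning configuration captures for an Amazon Web Services cluster might look like this:

    # Hypothetical illustration only: these keys are not the actual property
    # names in the Administrator tool. They show the kind of information a
    # cloud provisioning configuration typically captures: where to create
    # the cluster, how to authenticate, and how to size it.
    aws_cloud_provisioning_example = {
        "cloud_platform": "Amazon Web Services",
        "region": "us-east-1",
        "authentication": {
            "access_key_id": "<AWS access key ID>",       # placeholder credential
            "secret_access_key": "<AWS secret key>",      # placeholder credential
        },
        "network": {
            "vpc_id": "<VPC ID>",
            "subnet_id": "<subnet ID>",
        },
        "cluster": {
            "master_instance_type": "m5.xlarge",
            "worker_instance_type": "m5.xlarge",
            "worker_node_count": 4,
        },
    }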
Create a cloud provisioning configuration when you configure a cluster workflow. A cluster workflow automates the creation of a compute cluster, and the running of workflow tasks on that cluster, on Amazon Web Services, Microsoft Azure, or Databricks.
The Data Integration Service uses the information in the cloud provisioning configuration to establish a relationship between the workflow Create Cluster task and the cloud platform, and to run tasks on the cluster that the workflow creates. Using authentication credentials from the cloud provisioning configuration, the Data Integration Service submits jobs to the compute cluster through the cluster's REST API.
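The REST calls that the Data Integration Service issues are internal to the product. As a rough illustration of the general pattern of token-authenticated job submission over a compute cluster's REST API, the following Python sketch submits a one-time notebook run to a Databricks workspace with the Databricks Jobs API. The workspace URL, token, cluster ID, and notebook path are placeholders.

    # Illustrative sketch of REST-based job submission to a Databricks cluster.
    # This is not how the Data Integration Service is implemented; it only
    # shows the token-authenticated request/response pattern.
    import requests

    WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder
    TOKEN = "<personal-access-token>"                                 # placeholder

    payload = {
        "run_name": "example-one-time-run",
        "tasks": [
            {
                "task_key": "example_task",
                "existing_cluster_id": "<cluster-id>",                # placeholder
                "notebook_task": {"notebook_path": "/Shared/example_notebook"},
            }
        ],
    }

    response = requests.post(
        f"{WORKSPACE_URL}/api/2.1/jobs/runs/submit",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=payload,
        timeout=60,
    )
    response.raise_for_status()
    print("Submitted run_id:", response.json()["run_id"])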
The cluster connection that the cluster workflow uses contains a reference to the cloud provisioning configuration.
Consider the following high-level process for using the cloud provisioning configuration:
  1. Verify prerequisites.
  2. Create the cloud provisioning configuration.
  3. Create a cluster connection for the workflow.
After you create the cloud provisioning configuration and the cluster connection, a developer uses the Developer tool to create, deploy, and run a cluster workflow.
