Cluster Workflows

You can run a workflow to create an ephemeral cluster on a cloud platform and run mappings and other tasks on that cluster.
Create cluster workflows to run on the Amazon EMR or Microsoft Azure HDInsight cloud platforms in the Hadoop environment. Create cluster workflows to run on the Databricks cloud platform in the Databricks environment. The cluster workflow uses other elements that enable communication between the Data Integration Service and the cloud platform, such as a cloud provisioning configuration and a cluster connection.
A cluster workflow contains a Create Cluster task that you configure with information about the cluster to create. An ephemeral cluster is a cloud platform cluster that you create to run mappings and other tasks, and then terminate when the tasks are complete. To create an ephemeral cluster, include a Delete Cluster task in the workflow. Create ephemeral clusters to save cloud platform resources.
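
The following sketch is not Informatica tooling. It is a conceptual illustration, using the AWS SDK for Python (boto3), of what the Create Cluster and Delete Cluster tasks accomplish on Amazon EMR: provision a cluster, run tasks against it, and terminate it when the tasks are complete so that cloud platform resources are released. The cluster name, EMR release label, instance types, and IAM roles shown are assumptions made for the example.

    # Conceptual sketch only: the ephemeral cluster lifecycle that a cluster
    # workflow automates, expressed with boto3 against Amazon EMR.
    import boto3

    emr = boto3.client("emr", region_name="us-east-1")  # assumed region

    # Comparable to the Create Cluster task: provision a cluster for the run.
    response = emr.run_job_flow(
        Name="ephemeral-streaming-cluster",      # hypothetical cluster name
        ReleaseLabel="emr-6.10.0",               # hypothetical EMR release
        Applications=[{"Name": "Spark"}],
        Instances={
            "InstanceGroups": [
                {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
                {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
            ],
            # Keep the cluster alive while mapping and other tasks run.
            "KeepJobFlowAliveWhenNoSteps": True,
        },
        JobFlowRole="EMR_EC2_DefaultRole",       # assumed default IAM roles
        ServiceRole="EMR_DefaultRole",
    )
    cluster_id = response["JobFlowId"]

    # ... mapping and other workflow tasks run against the cluster here ...

    # Comparable to the Delete Cluster task: terminate the cluster when the
    # tasks are complete to release cloud platform resources.
    emr.terminate_job_flows(JobFlowIds=[cluster_id])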
For more information about cluster workflows, see the Data Engineering Integration User Guide.