Table of Contents

  1. Preface
  2. Introduction to Data Engineering Streaming
  3. Data Engineering Streaming Administration
  4. Sources in a Streaming Mapping
  5. Targets in a Streaming Mapping
  6. Streaming Mappings
  7. Transformation in Streaming Mappings
  8. Window Transformation
  9. Appendix A: Connections
  10. Appendix B: Monitoring REST API Reference
  11. Appendix C: Sample Files

Connections Overview

Define the connections that you want to use to access data in Kafka brokers, JMS servers, HDFS files, Hive tables, Amazon Kinesis streams, MapR streams, or HBase resources. You can create the connections using the Developer tool and infacmd.
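
For example, a connection can be created from the command line with the infacmd isp CreateConnection command. The following is an illustrative sketch only: the domain, user, and connection names are placeholders, and the valid connection type identifiers and -o option strings for each connection type are documented in the Informatica Command Reference.

    # <connection type> and the -o options are placeholders; see the Command Reference for valid values
    infacmd.sh isp CreateConnection -dn MyDomain -un Administrator -pd <password>
        -cn my_connection -cid my_connection -ct <connection type>
        -o "option1='value1' option2='value2'"
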
You can create the following types of connections:
Amazon S3
Create an Amazon S3 connection to write data to Amazon S3.
Databricks
Create a Databricks connection to run mappings in the Databricks environment. For information about Databricks Cloud Provisioning, see the Data Engineering Integration User Guide.
Hadoop
Create a Hadoop connection to run mappings on the Hadoop cluster. Select the Hadoop connection if you select the Hadoop run-time environment. You must also select the Hadoop connection to validate a mapping to run on the Hadoop cluster.
For more information about the Hadoop connection properties, see the Data Engineering Integration User Guide.
HBase
Create an HBase connection to write data to an HBase resource.
HDFS
Create an HDFS connection to write data to an HDFS binary or sequence file.
Hive
Create a Hive connection to write data to Hive tables.
For more information, see the Data Engineering Administrator Guide.
JDBC
Create a JDBC connection when you perform a lookup on a relational database using Sqoop.
For more information about the JDBC connection properties, see the Data Engineering Integration User Guide.
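
As a hedged sketch, the following command creates a JDBC connection that a Sqoop lookup can use. The connection type identifier JDBC and the option names jdbcDriverClassName and metadataConnString are assumptions shown for illustration only; verify the JDBC connection options in the Command Reference for your release.

    # option names below are illustrative assumptions; check the JDBC connection options for your release
    infacmd.sh isp CreateConnection -dn MyDomain -un Administrator -pd <password>
        -cn jdbc_lookup -cid jdbc_lookup -ct JDBC
        -o "jdbcDriverClassName='com.mysql.jdbc.Driver' metadataConnString='jdbc:mysql://dbhost:3306/sales'"
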
Microsoft Azure Data Lake Store
Create a Microsoft Azure Data Lake Store connection to write to a Microsoft Azure Data Lake Store.
Messaging
Create a Messaging connection to access data as it becomes available, and to run a streaming mapping on a Spark engine. You can create the following types of messaging connections:
  • Amazon Kinesis. Create an Amazon Kinesis connection to read from Amazon Kinesis Streams or write to Amazon Kinesis Firehose Delivery Streams.
  • Azure Event Hub. Create an Azure Event Hub connection to read from or write to Microsoft Azure Event Hubs.
  • Confluent Kafka. Create a Confluent Kafka connection to read from or write to a Confluent Kafka broker.
  • JMS. Create a JMS connection to read from or write to a JMS server.
  • Kafka. Create a Kafka connection to read from or write to a Kafka broker. A command-line sketch for creating a Kafka connection follows this list.
  • MapR Streams. Create a MapR Streams connection to read from or write to MapR Streams.
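
The following sketch creates a Kafka messaging connection from the command line. The connection type identifier Kafka and the option name kfkBrkList (the list of Kafka brokers) are assumptions; confirm the exact identifier and the remaining options in the Command Reference for your release.

    # the Kafka type identifier and the kfkBrkList option name are assumptions; see the Command Reference
    infacmd.sh isp CreateConnection -dn MyDomain -un Administrator -pd <password>
        -cn kafka_broker -cid kafka_broker -ct Kafka
        -o "kfkBrkList='kafkahost1:9092,kafkahost2:9092'"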
