Table of Contents


  1. Preface
  2. Part 1: Hadoop Integration
  3. Part 2: Databricks Integration
  4. Appendix A: Connections



Create a connection to access a non-native environment: Hadoop or Databricks. If you access HBase, HDFS, or Hive sources or targets in the Hadoop environment, you must also create those connections. You can create the connections using the Developer tool, the Administrator tool, or infacmd.
You can create the following types of connections:
Hadoop connection
Create a Hadoop connection to run mappings in the Hadoop environment.
HBase connection
Create an HBase connection to access HBase. The HBase connection is a NoSQL connection.
HDFS connection
Create an HDFS connection to read data from or write data to the HDFS file system on a Hadoop cluster.
Hive connection
Create a Hive connection to access Hive as a source or target. You can access Hive as a source if the mapping is enabled for the native or Hadoop environment. You can access Hive as a target if the mapping runs on the Blaze engine.
JDBC connection
Create a JDBC connection and configure Sqoop properties in the connection to import and export relational data through Sqoop.
Databricks connection
Create a Databricks connection to run mappings in the Databricks environment.
For information about creating connections to other sources or targets, such as social media websites or Teradata, see the respective PowerExchange adapter user guide.
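As noted above, connections can also be created with infacmd. The sketch below shows the general shape of such a command for a Hadoop connection; the domain name, user, connection name, and the connection-type identifier and option strings are illustrative assumptions, not verified values — confirm the exact syntax and connection-type codes in the infacmd command reference for your release.

```shell
# Sketch only: MyDomain, Administrator, my_hadoop_conn, and the HADOOP
# connection-type code are assumed placeholder values; verify the option
# names and type identifiers in the infacmd command reference.
infacmd.sh isp CreateConnection \
    -DomainName MyDomain \
    -UserName Administrator \
    -Password MyPassword \
    -ConnectionName my_hadoop_conn \
    -ConnectionId my_hadoop_conn \
    -ConnectionType HADOOP
```

Connections for HBase, HDFS, Hive, JDBC, and Databricks follow the same pattern with a different connection type and type-specific options.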