After you create a PowerExchange for Hadoop mapping in the Designer, you create a PowerExchange for Hadoop session in the Workflow Manager to read, transform, and write Hadoop data.
Before you create a session, configure a Hadoop HDFS application connection to connect to the HDFS host. When the Integration Service extracts or loads Hadoop data, it connects to the Hadoop cluster through the HDFS host, which runs the name node service for the cluster.
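The name node address in the connection typically takes the standard HDFS URI form. The host name below is an assumption for illustration; 8020 is a common default for the name node IPC port, but your cluster may use a different port:

```
hdfs://namenode.example.com:8020
```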
If the mapping contains a flat file source, you can configure the session to extract data from HDFS. If the mapping contains a flat file target, you can configure the session to load data to HDFS or a Hive table.
When the Integration Service loads data to a Hive table, it first loads data to HDFS. The Integration Service then generates an SQL statement to create the Hive table and load the data from HDFS to the table.
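As a rough sketch of this two-step load, the generated SQL resembles a HiveQL CREATE TABLE followed by a LOAD DATA INPATH statement. The table name, columns, delimiter, and staging path below are assumptions for illustration, not the exact statements the Integration Service generates:

```sql
-- Illustrative only: names, columns, and paths are assumed.
CREATE TABLE customer_tgt (
  id   INT,
  name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

-- Move the staged HDFS data into the Hive table.
LOAD DATA INPATH '/user/infa/staging/customer_tgt'
INTO TABLE customer_tgt;
```

Because LOAD DATA INPATH moves the files into the table's warehouse directory rather than copying them, the staged data does not remain in the intermediate HDFS location after the load completes.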