Table of Contents

  1. Preface
  2. Introduction to PowerExchange for HDFS
  3. PowerExchange for HDFS Configuration
  4. HDFS Connections
  5. HDFS Data Objects
  6. HDFS Data Extraction
  7. HDFS Data Load
  8. HDFS Mappings
  9. Data Type Reference

PowerExchange for HDFS User Guide

HDFS Mappings Overview

After you create a flat file or a complex file data object operation, you can create an HDFS mapping.
You can define the following objects in an HDFS mapping:
  • A flat file data object or a complex file data object read operation as the input to read data from HDFS
  • Transformations
  • A flat file data object or a complex file data object write operation as the output to write data to HDFS
If you use a complex file data object as a source, you must use a Data Processor transformation to parse the file. Similarly, if you use a complex file data object as a target, you must use a Data Processor transformation to convert the source data into a binary format. You can then use the binary stream to write data to the complex file.
You can use complex file sources and targets as dynamic sources and targets in a mapping. For information about dynamic mappings, see the Informatica Developer Mapping Guide.
Validate the mapping, and then run it. You can deploy the mapping and run it directly, or add the mapping to a Mapping task in a workflow and run the workflow. You can also run the mapping in a Hadoop run-time environment.
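After you deploy the mapping to an application on a Data Integration Service, you can also start it from the command line with infacmd. The following is a minimal sketch only; the domain, service, user, application, and mapping names shown here are placeholders for your environment, and the available options can vary by Informatica version, so verify them against the Informatica Command Reference.

```shell
# Hypothetical example: run a deployed HDFS mapping with infacmd.
# All names below are placeholders, not values from this guide.
#   -dn  Informatica domain name
#   -sn  Data Integration Service name
#   -un / -pd  domain user name and password
#   -a   application that contains the deployed mapping
#   -m   mapping name
infacmd.sh ms runMapping -dn MyDomain -sn MyDIS \
    -un Administrator -pd MyPassword \
    -a HDFS_App -m m_HDFS_Load
```

On Windows, use infacmd.bat instead of infacmd.sh.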
