Table of Contents

  1. Preface
  2. Introduction to Informatica Big Data Management
  3. Connections
  4. Mappings in the Hadoop Environment
  5. Mapping Objects in the Hadoop Environment
  6. Processing Hierarchical Data on the Spark Engine
  7. Stateful Computing on the Spark Engine
  8. Monitoring Mappings in the Hadoop Environment
  9. Mappings in the Native Environment
  10. Profiles
  11. Native Environment Optimization
  12. Data Type Reference
  13. Complex File Data Object Properties
  14. Function Reference
  15. Parameter Reference

Complex File Data Objects Overview

A complex file data object is a representation of a file in the Hadoop Distributed File System (HDFS). Create a complex file data object to read data from or write data to complex files, such as Avro, JSON, and Parquet, in the HDFS.
You can read from and write to complex files on a local system or in the HDFS. To access files in the HDFS, you must use an HDFS connection. When you use an HDFS connection, the complex file data object can read data from a directory of files that have the same format and properties. You can also read from and write to compressed complex files.
When you create a complex file data object, the Developer tool creates a read operation and a write operation. Use the complex file data object read operation as a source in mappings and mapplets. Use the complex file data object write operation as a target in mappings and mapplets.
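
To illustrate what the read operation does conceptually, the following sketch reads a directory of same-format Parquet files from the HDFS in Python with pyarrow. This is not part of Informatica and not how the Developer tool is configured; the host, port, and directory path are placeholder values.

    import pyarrow.dataset as ds
    from pyarrow import fs

    # Connect to the HDFS NameNode; in the Developer tool this role is
    # played by the HDFS connection. Host and port are placeholders.
    hdfs = fs.HadoopFileSystem("namenode.example.com", port=8020)

    # Treat every Parquet file under the directory as one logical source,
    # the way a complex file data object can read a directory of files
    # that share the same format and properties.
    dataset = ds.dataset("/data/events", format="parquet", filesystem=hdfs)

    # Materialize the rows; in a mapping, the read operation would
    # instead feed these rows to downstream transformations.
    table = dataset.to_table()
    print(table.num_rows, table.schema)

In a mapping, this hand-written access is replaced by the data object: the read operation supplies rows to downstream transformations, and the write operation handles the corresponding output to the target files.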


Updated November 09, 2018