Table of Contents

  1. Preface
  2. Introduction to Informatica Big Data Management
  3. Mappings
  4. Sources
  5. Targets
  6. Transformations
  7. Data Preview
  8. Cluster Workflows
  9. Profiles
  10. Monitoring
  11. Hierarchical Data Processing
  12. Hierarchical Data Processing Configuration
  13. Hierarchical Data Processing with Schema Changes
  14. Intelligent Structure Models
  15. Stateful Computing
  16. Appendix A: Connections
  17. Appendix B: Data Type Reference
  18. Appendix C: Function Reference

Complex File Targets on ADLS

You can use a PowerExchange for HDFS or a PowerExchange for Microsoft Azure Data Lake Store connection to write data to ADLS data objects. If you use a PowerExchange for Microsoft Azure Data Lake Store connection, you cannot write compressed binary files to ADLS.
The following table shows the complex files that a PowerExchange for Microsoft Azure Data Lake Store connection can process within ADLS storage in the Hadoop environment:
File Type | Supported Formats      | Supported Engines
--------- | ---------------------- | -----------------
Avro      | Flat, Hierarchical (1) | Spark
JSON      | Flat, Hierarchical (1) | Spark
Parquet   | Flat, Hierarchical (1) | Spark

(1) To run on the Spark engine, the complex file write operation must be enabled to project columns as complex data type.
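A flat file format uses only primitive-typed fields, while a hierarchical format nests records, arrays, or maps inside a field. The distinction can be sketched with plain Python dicts standing in for Avro schemas (illustrative only, not Informatica API; record and field names are hypothetical):

```python
# Illustrative sketch: a "flat" versus a "hierarchical" Avro schema,
# written as plain Python dicts. Names here are hypothetical examples.

flat_schema = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "long"},
        {"name": "customer", "type": "string"},   # primitive types only
        {"name": "total", "type": "double"},
    ],
}

hierarchical_schema = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "long"},
        # A nested record and an array make the schema hierarchical, which
        # is why the write operation must be enabled to project these
        # columns as complex data types.
        {"name": "customer", "type": {
            "type": "record",
            "name": "Customer",
            "fields": [
                {"name": "name", "type": "string"},
                {"name": "city", "type": "string"},
            ],
        }},
        {"name": "items", "type": {"type": "array", "items": "string"}},
    ],
}

def is_hierarchical(schema):
    """Return True if any field uses a complex (non-primitive) type."""
    return any(not isinstance(f["type"], str) for f in schema["fields"])

print(is_hierarchical(flat_schema))          # False
print(is_hierarchical(hierarchical_schema))  # True
```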
The following table shows the complex files that a PowerExchange for HDFS connection can process within ADLS storage in the Hadoop environment:
File Type | Supported Formats             | Supported Engines
--------- | ----------------------------- | -----------------
Avro      | Flat, Hierarchical (1)(2)     | Blaze, Spark
JSON      | Flat (1), Hierarchical (1)(2) | Blaze, Spark
ORC       | Flat                          | Spark
Parquet   | Flat, Hierarchical (1)(2)     | Blaze, Spark

(1) To run on the Blaze engine, the complex file data object must be connected to a Data Processor transformation.
(2) To run on the Spark engine, the complex file write operation must be enabled to project columns as complex data type.
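Conceptually, projecting columns as complex data types lets a single column hold an entire nested structure; the alternative is to flatten the nested fields into separate primitive columns. A generic sketch of such flattening in plain Python (illustrative only, not Informatica behavior; the record contents are hypothetical):

```python
# Illustrative sketch: flattening a hierarchical record into primitive
# columns with dotted names. The record contents are hypothetical.

def flatten(record, prefix=""):
    """Recursively flatten nested dicts into a single-level dict."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            # Descend into the nested structure, extending the column name.
            flat.update(flatten(value, prefix=name + "."))
        else:
            flat[name] = value
    return flat

hierarchical = {
    "order_id": 42,
    "customer": {"name": "Ada", "address": {"city": "Lima"}},
}

print(flatten(hierarchical))
# {'order_id': 42, 'customer.name': 'Ada', 'customer.address.city': 'Lima'}
```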


Updated July 10, 2020