Table of Contents

  1. Preface
  2. Introduction to PowerExchange for HDFS
  3. PowerExchange for HDFS Configuration
  4. HDFS Connections
  5. HDFS Data Objects
  6. HDFS Data Extraction
  7. HDFS Data Load
  8. HDFS Mappings
  9. Appendix A: Data Type Reference

PowerExchange for HDFS User Guide

Rules and Guidelines for Creating a Complex File Data Object Operation

Use the following rules and guidelines when you create a complex file data object operation:
  • When you create a data object read or write operation, you can add new columns or modify the columns directly in the Ports tab.
  • To modify the columns of a complex file, you must reconfigure the column projection properties.
  • When you create a mapping to read or write a JSON complex file, the Developer Tool uses the first record in the JSON file as a sample for projection. If the value of an attribute in the sample record is null, the Developer Tool defaults its data type to String (see the first sketch after this list). You can modify the columns under Enable Column Projection for data object operations.
  • To modify the columns of an Avro, JSON, ORC, or Parquet file, change the complex file format in the Schema field of the schema properties.
  • When you create a mapping to read or write an Avro, JSON, ORC, or Parquet file, you can copy the columns of the Source transformations, Target transformations, or any other transformations from the Ports tab. Then, you can paste the columns directly in the data object read or write operation.
  • When you copy the columns from any transformation to the data object read or write operation, you can change the data type of the columns. The Data Integration Service then resets the precision of the data type to its default value. However, the Data Integration Service does not change the precision of the String data type to the default value (see the second sketch after this list).
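
The first sketch below illustrates the JSON sampling behavior described above. It is not the Developer Tool's actual code: the function name, the projected type names, and the assumption that the file uses a one-record-per-line (JSON Lines) layout are hypothetical, chosen only to show why a null attribute in the first record ends up projected as a String column.

import json

# Hypothetical sketch of first-record projection for a JSON complex file.
# Only the first record is sampled; later records never influence the
# projected types.
def infer_projection(path):
    with open(path) as f:
        first_record = json.loads(f.readline())  # sample only the first record
    projection = {}
    for name, value in first_record.items():
        if value is None:
            projection[name] = "string"    # a null sample value defaults to String
        elif isinstance(value, bool):      # check bool before int: bool is an int subtype
            projection[name] = "boolean"
        elif isinstance(value, int):
            projection[name] = "bigint"
        elif isinstance(value, float):
            projection[name] = "double"
        else:
            projection[name] = "string"
    return projection

For example, given a file whose first record is {"id": 1, "city": null}, the city column is projected as a string even if later records hold numbers, which is why you may need to adjust the columns under Enable Column Projection.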
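
The second sketch illustrates the precision rule from the last guideline. The default precision values and the function name are assumptions for illustration only; the actual defaults depend on the data type mapping in your environment.

# Hypothetical default precisions; actual values depend on the product's
# data type mapping.
DEFAULT_PRECISION = {"decimal": 15, "bigint": 19, "double": 15}

def change_column_type(column, new_type):
    # Changing a copied column's data type resets its precision to the
    # new type's default, except for String: a String column keeps the
    # precision it already has.
    if new_type != "string":
        column["precision"] = DEFAULT_PRECISION.get(new_type, column["precision"])
    column["type"] = new_type
    return column

col = {"name": "amount", "type": "string", "precision": 4000}
change_column_type(col, "decimal")   # precision resets to 15
change_column_type(col, "string")    # precision is not reset for String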
