Snowflake Data Cloud Connector

Hierarchy Processor transformation

In advanced mode, you can configure a Hierarchy Processor transformation to read hierarchical or relational input from an Amazon S3 V2 or Microsoft Azure Data Lake Storage Gen2 source and write relational or hierarchical output to a Snowflake target.
The Hierarchy Processor transformation processes hierarchical fields that represent a struct or an array.
You can configure a Hierarchy Processor transformation with the following restrictions:
  • If the array element of the hierarchical output field contains the fixed-point Number data type, the mapping runs without SQL ELT optimization.
  • Decimal values of the Double data type are written in exponential notation.
    For example, when you write a Double data type 2341.6789 to an output field in the Snowflake target, the output appears as 2.341678900000000e+03.
  • If you select Use input group or incoming fields as the data source, and the hierarchical or relational input that you read from the source contains more than one row, Data Integration duplicates records for each row when it writes hierarchical output to a Snowflake target. To avoid writing duplicate rows to the target, either select Inherit parent's data sources as the data source or filter the child fields from the data source with a filter condition.
  • To write Integer or Bigint data from a Struct field to a Snowflake target, select advanced.custom.property from the Session Property Name list in the mapping task, and then enter the following value:
    DisableAdvancedMappingRuntimeValidation=true
  • You can't write data from an Avro or Parquet file that contains multi-level struct fields to a Snowflake target.
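The exponential notation described in the Double restriction is only a display form; the underlying value is unchanged. The following Python sketch, which is independent of the connector, verifies the round trip and shows one way to render the value back as a plain decimal string downstream:

```python
from decimal import Decimal

plain = "2341.6789"
exponential = "2.341678900000000e+03"

# Both strings parse to the same IEEE 754 double, so no precision is lost
# when Snowflake displays the value in exponential notation.
assert float(plain) == float(exponential)

# Decimal re-renders the value without the exponent; normalize() strips
# the trailing zeros that the exponential form carries.
restored = Decimal(exponential).normalize()
print(restored)  # 2341.6789
```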
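The duplicate-record behavior described above can be modeled outside the connector. This is an illustrative Python sketch of the two data-source choices, not connector code; the record shapes are hypothetical:

```python
# Three incoming rows read from the source.
rows = [{"id": 1}, {"id": 2}, {"id": 3}]

# When the hierarchical output is assembled once per incoming row
# (analogous to "Use input group or incoming fields"), an N-row input
# yields N copies of the aggregated record.
per_row_output = [{"items": rows} for _ in rows]
assert len(per_row_output) == 3  # duplicated records in the target

# Aggregating once (analogous to "Inherit parent's data sources", or to
# filtering the child fields with a filter condition) emits one record.
single_output = [{"items": rows}]
assert len(single_output) == 1
```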
