Snowflake Data Cloud Connector

Snowflake Data Cloud sources, targets, and lookups

When you configure SQL ELT optimization, refer to this list of supported Snowflake Data Cloud properties in the Source, Target, and Lookup transformations.

Source properties

You can configure the following properties in a Snowflake source transformation:
  • Source connection - Parameter, Allow parameter to be overridden at run time
  • Source type - Single object, multiple objects, query, and parameter. You can also use a parameter file to override the Snowflake source connections and objects in a mapping from the mapping task.
    When you use the query source type to read from Snowflake, you can choose to retain the field metadata and save the mapping. Even if you edit the query and run the mapping, the field metadata specified at design time is retained. For a sketch of a custom source query, see the example after this list.
  • Allow parameter to be overridden at run time.
  • Query options - Filter and Join. You can use both simple and advanced filter conditions. You can use join to join related objects based on existing relationships or you can create an advanced relationship.
  • Database override
  • Schema override
  • Warehouse override
  • Pre-SQL and Post-SQL
  • Role override
  • Table name override
  • SQL override
  • Tracing level
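For example, the following custom query is a minimal sketch of a query source type that joins two related objects and filters the rows. The database, schema, object, and column names are hypothetical placeholders:

  -- Hypothetical example: join two Snowflake objects and filter the rows.
  -- Replace the database, schema, object, and column names with your own.
  SELECT o.order_id,
         o.order_date,
         c.customer_name
  FROM   sales_db.public.orders o
  JOIN   sales_db.public.customers c
         ON o.customer_id = c.customer_id
  WHERE  o.order_date >= '2023-01-01';
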
The following source properties don't apply in mappings enabled with full SQL ELT optimization:
  • Warehouse override
  • Pre-SQL and Post-SQL
  • Role override
The following source properties don't apply in a Snowflake source transformation:
  • Query options - Sort
  • Partition

Target properties

You can add multiple Snowflake targets in a mapping. The target can be the same Snowflake target table added multiple times or different Snowflake target tables.
You can configure the following properties in a Snowflake target transformation:
  • Target connection - Parameter, Allow parameter to be overridden at run time.
  • Target type - Single object, parameter. You can also use a parameter file to override the Snowflake target connections and objects in a mapping from the mapping task.
  • Allow parameter to be overridden at run time
  • Target object - Existing target, Create new at runtime.
  • Operation - Insert, update, upsert, delete, or data driven
  • Database override
  • Schema override
  • Warehouse override
  • Role override
  • Pre-SQL and Post-SQL. For a sketch of typical pre-SQL and post-SQL statements, see the example after this list.
  • Table name override
  • Truncate Target Table
  • Additional Write Runtime Parameters
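For example, the following statements are a minimal sketch of pre-SQL and post-SQL that you might configure for a Snowflake target. The staging table, audit table, and load logic are hypothetical placeholders:

  -- Hypothetical pre-SQL: remove older rows from a staging table before the write.
  DELETE FROM sales_db.public.orders_stage
  WHERE  load_date < CURRENT_DATE();

  -- Hypothetical post-SQL: record the load in an audit table after the write.
  INSERT INTO sales_db.public.load_audit (table_name, loaded_at)
  VALUES ('ORDERS_STAGE', CURRENT_TIMESTAMP());
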
The following target properties don't apply in a Snowflake target transformation:
  • Update Mode
  • Batch row size
  • Number of local staging files
  • Rejected File Path
  • Update Override
  • Forward Rejected Rows
When you write to multiple targets in mappings in advanced mode, you can only use the Insert operation.

Lookup properties

When you enable SQL ELT optimization, you can configure the following properties for Snowflake connected and unconnected lookups:
  • Lookup connection - Parameter, Allow parameter to be overridden at run time
  • Source type - Single object, query, parameter. You can also use a parameter file to override the Snowflake lookup connections and objects in a mapping from the mapping task.
  • Multiple matches - Report Error
  • Database override
  • Schema override
  • Warehouse override
  • Role override
  • Table name override
  • SQL override
  • Tracing level
  • Lookup Data Filter
The following lookup properties don't apply in Snowflake connected and unconnected lookups:
  • Pre SQL
  • Post SQL

Guidelines for mappings

Consider the following guidelines when you configure mappings:
  • For a target created at runtime, ensure that the Snowflake source does not contain records with the Time data type.
  • When you configure filters, consider the following guidelines:
    • If a mapping contains a Filter transformation and also a filter in the Source transformation, the mapping consolidates the filter conditions from both these transformations to filter the records. However, it is recommended that you use only one of these filters at a time in a mapping.
    • You cannot use system variables in filters.
    • You cannot apply a filter for a query source type or for multiple source objects.
    • When you configure an IS_DATE function in an Expression transformation, specify the format for this function. Otherwise, the mapping populates incorrect data.
    • When you configure two Sequence Generator transformations to write to two Snowflake targets, and the sequence objects have the same sequence name in the custom properties, data populates incorrectly.
  • For mappings that read from and write to Snowflake, consider the following guidelines:
    • You cannot use a query to read from stored procedures.
    • Even if you decrease the precision of the Snowflake String data type in a Source transformation before writing to a Snowflake table, the mapping runs successfully but does not truncate the data.
    • When you configure a mapping for source or partial SQL ELT optimization, do not connect the Source transformation to more than one transformation in the mapping. However, in a mapping enabled with full SQL ELT optimization, the Source transformation can branch out to multiple transformations in the mapping pipeline.
    • You can configure a custom query in a Source transformation to read from Java or SQL user-defined functions (UDFs) in Snowflake. For a sketch of a SQL UDF and a query that calls it, see the example after this list.
    • When the mapping runs with full or source SQL ELT optimization, some of the queries in the session log are not aliased correctly. The aliases for simple queries appear correctly.
    • A mapping or mapping in advanced mode fails to read data from multiple tables joined using related objects when the table and column names contain case-sensitive, special, or Unicode characters.
    • A mapping that reads from multiple Snowflake objects that do not belong to the same database and schema fails.
    • When you use the IS_NUMBER function, the data populated for values such as inf and NaN in Snowflake differs depending on whether SQL ELT optimization is applied.
    • When you use the IS_NUMBER function in a transformation and the input data contains d or D, for example, in formats such as +3.45d+32 or +3.45D-32, the function returns False or 0.
    • When you use the IS_DATE function in a transformation, do not use the J, MM/DD/YYYY SSSSS, MM/DD/Y, and MM/DD/RR formats.
    • Mappings that read from or write to Snowflake with multibyte characters in the table or column names might fail. Before you configure a mapping to read or write data with multibyte characters, set the -DdisablePDOAdvancedAliasing property in the JVM options in the Secure Agent properties.
    • When you pass columns with Null values in a Normalizer transformation, Null values are not written to the target.
    • When you push the DATE_DIFF() function with the date1 and date2 arguments from a mapping task enabled with the Create Temporary View property, the function returns the following different values as compared to a mapping that runs without SQL ELT optimization:
      • The function returns a negative number when the value of date1 is later than the value of date2.
      • The function returns a positive number when the value of date1 is earlier than the value of date2.
      To get the correct return values, set the JVM option -DFixSnowflakeDateDiffForPDO=true for the Secure Agent in Administrator.
    • A mapping or mapping in advanced mode fails to write data to the target when the precision of incoming fields exceeds the precision of target fields.
    • A mapping enabled for SQL ELT optimization fails when you parameterize the advanced source filter and enclose the parameter value within quotes in the parameter file. A mapping with similar filter configurations that is not enabled for SQL ELT optimization runs successfully but does not filter the data.
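For example, the following statements are a minimal sketch of a Snowflake SQL user-defined function and a custom query that reads from it. The function, object, and column names are hypothetical placeholders:

  -- Hypothetical SQL UDF defined in Snowflake.
  CREATE OR REPLACE FUNCTION sales_db.public.area_of_circle(radius FLOAT)
  RETURNS FLOAT
  AS
  $$
    pi() * radius * radius
  $$;

  -- Hypothetical custom query in the Source transformation that calls the UDF.
  SELECT part_id,
         sales_db.public.area_of_circle(wheel_radius) AS wheel_area
  FROM   sales_db.public.parts;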
