Snowflake Data Cloud Connector

SQL transformation

You can use an SQL transformation only to push certain functions and shared sequences to Snowflake.

Use functions to run queries

You can include functions in an entered query in an SQL transformation and run the query against the Snowflake target endpoint.
Use only SELECT statements to push a function, and specify the column names explicitly in the select query or function. Do not push functions with statements such as SELECT * FROM TABLE.
You can use the following functions in an entered query:
  • UUID_STRING
  • RANDOM
  • RANDSTR
  • SIGN
  • CURRENT_REGION
  • CURRENT_ACCOUNT
  • CURRENT_ROLE
  • CURRENT_USER
  • CURRENT_DATABASE
  • CURRENT_SCHEMA
  • DAYNAME
  • SPLIT
  • SPLIT_PART
To use the CURRENT_ROLE, CURRENT_DATABASE, and CURRENT_SCHEMA functions in an SQL transformation, ensure that you provide the role, database, and schema names in the additional JDBC parameters field of the Snowflake Data Cloud connection. If you do not specify these values in the connection, Data Integration writes null values to the target.
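For example, an entered query that pushes several of the supported functions might look like the following sketch. The orders table and its columns are illustrative, not part of the connector:

```sql
-- Entered query in the SQL transformation; columns are named explicitly
SELECT
    UUID_STRING()                     AS row_uuid,      -- random UUID per row
    CURRENT_USER()                    AS loaded_by,     -- user running the query
    CURRENT_SCHEMA()                  AS target_schema, -- requires schema in the JDBC parameters
    DAYNAME(order_date)               AS order_day,     -- e.g. 'Mon'
    SPLIT_PART(customer_name, ' ', 1) AS first_name
FROM orders;
```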

Reuse shared sequence

You can push a mapping with a shared sequence defined in an SQL transformation to a Snowflake endpoint. Data Integration writes the sequence values to the target in the same order as they are generated in the Snowflake source.
Get the shared sequence from Snowflake and define the sequence in an entered query in an SQL transformation.
Specify the shared sequence in the entered query using the following syntax:
SELECT <Snowflake_database_name>.<Snowflake_schema_name>.<sequence_name>.NEXTVAL
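A minimal sketch of such an entered query, assuming a shared sequence named order_seq in database SALES_DB and schema PUBLIC (all names are illustrative):

```sql
-- NEXTVAL returns the next value of the shared Snowflake sequence
SELECT SALES_DB.PUBLIC.order_seq.NEXTVAL AS order_id;
```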

User-defined functions

You can configure a custom query in an SQL transformation to read from Java or SQL user-defined functions (UDF) in Snowflake.
The following guidelines apply to UDFs:
  • You cannot read UDFs that have newline characters in the UDF name.
  • If the UDF contains array parameters, the mapping runs without SQL ELT optimization.
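For instance, a custom query can call a SQL UDF directly. The area_of_circle function below is a hypothetical UDF, shown only to illustrate the pattern:

```sql
-- Assumes a SQL UDF already created in Snowflake, for example:
--   CREATE FUNCTION area_of_circle(radius FLOAT)
--   RETURNS FLOAT AS 'pi() * radius * radius';
SELECT area_of_circle(radius) AS area
FROM measurements;
```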
