Snowflake Data Cloud Connector

Source objects and operations

In a Source transformation, you can use a single object, multiple objects, query, or parameter as the source type to read data from Snowflake.
Some restrictions apply to certain source types.

Parameter source type

When you parameterize the source object and connection and enable the Allow parameter to be overridden at run time option in a transformation, you cannot override the object name with a fully qualified name such as db.schema.tablename. Instead, pass the db=<dbname>&schema=<schemaname> values in the Additional JDBC URL Parameters field in the Snowflake Data Cloud connection.
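For example, to direct the parameterized source to a specific database and schema, you might enter a value such as the following in the Additional JDBC URL Parameters field, where SALES and PUBLIC are placeholder names for your own database and schema:
db=SALES&schema=PUBLIC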

Multiple source type

You can use a single Source transformation to read from multiple Snowflake tables within the same database. Alternatively, to read from multiple Snowflake sources, you can create multiple Source transformations and then use a Joiner transformation to join the sources.
To read from multiple tables using a single Source transformation, select Multiple Objects as the source type and then configure a join to combine the tables. You can either add related objects with primary key-foreign key relationships that are already defined, or you can define a relationship condition to join the tables. To set your own conditions that define the relationship between the tables, select Advanced Relationship from the Related Objects Actions menu, and then define the relationship. When you configure a join expression, select the fields and define the join query syntax. Specify only the condition, not the type of join, in the query. The condition that you specify in the expression text box is appended to the join condition.
When you specify a join condition in the advanced relationship to join the tables, you cannot override the database and schema names from the connection. You need to manually change the database and schema names in the advanced relationship condition. If the condition includes columns with a fully qualified name such as db.schema.tablename, do not configure an override. Instead, delete the fully qualified database and schema names from the advanced relationship condition and then run the mapping.
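As an illustration, an advanced relationship condition that joins a hypothetical EMPLOYEE table to a hypothetical DEPARTMENT table on the department ID might look like the following. The expression contains only the condition, not the join type, and the table names are not qualified with database and schema names:
EMPLOYEE.DEPT_ID = DEPARTMENT.DEPT_ID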

Restrictions for the multiple source type

A multiple source type has the following restrictions:
  • A mapping fails when you read data from multiple tables joined using related objects and the table and column names are case sensitive.
  • A mapping configured with a join for one or more tables that have the same column names fails.
  • A mapping that reads from multiple Snowflake objects that do not belong to the same database and schema fails.
  • A mapping configured with a join that reads from multiple tables fails if you specify an override to the table, schema, or database in the Snowflake source advanced properties.

Query source type

When you use a custom SQL query to import Snowflake tables, specify the Snowflake database and schema names in the custom SQL query. If you do not specify the database and schema names, the agent uses the database and schema specified in the connection properties. The table name in a query that reads from Snowflake must be fully qualified. When you use a custom query to call a stored procedure, ensure that the role has access to the database and schema.
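For example, a custom query that reads from a fully qualified table might look like the following, where SALES and PUBLIC are placeholder database and schema names:
SELECT ORDER_ID, ORDER_DATE, AMOUNT FROM SALES.PUBLIC.ORDERS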
You can use a query source type with the following restrictions:
  • Do not configure an override for the database and schema names in the connection.
  • The following operations are not applicable for the query source type:
    • Filter and sort options.
    • Source partitioning.
    • Advanced properties, except for pre-SQL and post-SQL statements.
  • In advanced mode, you cannot call a stored procedure from a custom query in a Source transformation.
  • Do not configure nested in-out parameters in the query source type.
  • When you use a parameterized query source type in a mapping to read from multiple tables, and you configure a join for one or more tables that have the same column names, the mapping fails.
    For example, see the following SQL query that involves a full outer join between two tables EMPLOYEE and DEPARTMENT that are part of the SALES.PUBLIC schema, where two columns have the same name, CITY:
    SELECT EMP_ID, NAME, CITY, DEPT_ID, DEPT_NAME, CITY FROM SALES.PUBLIC.EMPLOYEE FULL OUTER JOIN SALES.PUBLIC.DEPARTMENT ON EMP_ID = DEPT_ID
    To distinguish the conflicting column names, add aliases that the database can refer to while joining the tables:
    SELECT e.EMP_ID, e.NAME, e.CITY AS ecity, d.DEPT_ID, d.DEPT_NAME, d.CITY AS dcity FROM SALES.PUBLIC.EMPLOYEE e FULL OUTER JOIN SALES.PUBLIC.DEPARTMENT d ON e.EMP_ID = d.DEPT_ID

General restrictions for the source

The Source transformation has the following general restrictions:
  • You cannot read from or write to Snowflake tables when the table or column names contain double quotes. The mapping fails with the following error:
    SQL compilation error
  • In advanced mode, mappings that read from more than 500 columns fail with the following error:
    HTTP POST request failed due to IO error
  • You cannot use system variables in filters.
  • When you configure multiple filter conditions and one of the conditions includes an IS NULL operator, ensure that you place the condition with the IS NULL operator at the end, as shown in the example after this list. Otherwise, the mapping fails.
  • When you import a Snowflake object, ensure that the schema name does not contain a slash (/) or backslash (\). Otherwise, tables in a schema whose name contains a slash (/) or backslash (\) do not appear in the user interface when you select the mapping object.
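For example, if you define two filter conditions on hypothetical EMPLOYEE fields, place the condition that uses the IS NULL operator last:
DEPT_ID = 10
MANAGER_ID IS NULL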
