Cloud Data Integration Connectors - All Products
When you run a mapping that contains a filter query for a Snowflake source table with special characters, the expression string in the filter condition is not considered valid and the mapping fails.
The Warehouse field is not shown as a mandatory parameter in the Snowflake connection properties.
When you use a Snowflake ODBC connection in a mapping enabled with pushdown optimization to read data from two Snowflake sources that have fields with the same name and you define a filter condition for one of the common fields, the mapping fails.
When you configure key range partitioning for a mapping to read data from Snowflake, the throughput does not scale linearly beyond two partitions.
Workaround: To improve the throughput, specify the Additional JDBC URL Parameters field in the Snowflake connection properties.
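As an illustration, the Additional JDBC URL Parameters field accepts ampersand-separated key=value pairs that are appended to the Snowflake JDBC connection URL. The parameter names below (warehouse, db, schema) are standard Snowflake JDBC connection properties; the values are hypothetical:

```
warehouse=MY_WH&db=MY_DB&schema=PUBLIC
```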
When you import a Snowflake object, you cannot differentiate between tables and views from the Snowflake metadata.
If the lookup condition returns multiple matches and the multiplicity option is set to Report error, the task does not fail.
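Because the Report error option does not fail the task, duplicate matches can go unnoticed. A minimal sketch of a pre-check that flags lookup key values matching more than one row, using a hypothetical in-memory representation of the lookup table:

```python
from collections import Counter

def duplicate_lookup_keys(rows, key):
    """Return lookup key values that match more than one row.

    Hypothetical pre-check: run it against the lookup source before the
    task, since the Report error option does not fail the task itself.
    """
    counts = Counter(row[key] for row in rows)
    return sorted(k for k, n in counts.items() if n > 1)

# Example lookup table with one duplicated key value.
rows = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}, {"id": 1, "v": "c"}]
print(duplicate_lookup_keys(rows, "id"))  # [1]
```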
If you select a single target and write to multiple tables, the task fails.
You cannot read from or write to a table that has special or i18n characters in column names or the table name.
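A simple way to catch this limitation before a run is to validate the table and column names up front. A minimal sketch, assuming that names made up of ASCII letters, digits, underscores, and dollar signs are safe (the regex is an assumption, not the connector's documented rule):

```python
import re

# Hypothetical safe-name rule: ASCII letter or underscore, then ASCII
# letters, digits, underscores, or dollar signs.
SAFE_IDENTIFIER = re.compile(r"^[A-Za-z_][A-Za-z0-9_$]*$")

def unsafe_identifiers(names):
    """Return the table or column names containing special or i18n characters."""
    return [n for n in names if not SAFE_IDENTIFIER.match(n)]

print(unsafe_identifiers(["ORDERS", "ORDER_ID", "bestell-nr", "naïve"]))
# ['bestell-nr', 'naïve']
```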