Cloud Data Integration Connectors All Products
When you run a mapping that contains a filter query for a Snowflake source table with special characters, the expression string in the filter condition is not considered valid and the mapping fails.
When you configure key range partitioning for a mapping to read data from Snowflake, the throughput does not scale linearly beyond two partitions. To improve the throughput, use the Additional JDBC URL Parameters field in the Snowflake connection properties.
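The Additional JDBC URL Parameters field typically accepts ampersand-separated name=value pairs. The parameters below only illustrate the format; they are examples of real Snowflake JDBC driver parameters, not a verified fix for partition throughput, so consult the Snowflake JDBC driver documentation for the parameters that apply to your environment:

```
queryTimeout=300&CLIENT_SESSION_KEEP_ALIVE=true
```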
When you import a Snowflake object, you cannot differentiate between tables and views from the Snowflake metadata.
In a pushdown optimization task, if a source field contains more than 256 characters, the task fails.
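As a pre-flight check, you can flag oversized source fields before running a pushdown optimization task. This is a minimal sketch, assuming a record is available as a dictionary of field values; the 256-character limit comes from the restriction above, but the record layout and function name are hypothetical:

```python
# Flag string fields longer than 256 characters, which would cause a
# pushdown optimization task to fail per the limitation described above.
MAX_PUSHDOWN_FIELD_LEN = 256

def oversized_fields(record: dict) -> list:
    """Return the names of string fields exceeding the pushdown limit."""
    return [
        name for name, value in record.items()
        if isinstance(value, str) and len(value) > MAX_PUSHDOWN_FIELD_LEN
    ]

# Hypothetical record: the "comment" field exceeds the limit.
record = {"id": "42", "comment": "x" * 300, "status": "ok"}
print(oversized_fields(record))
```

Fields reported by such a check can be truncated upstream or routed to a task that does not use pushdown optimization.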
If the lookup condition returns multiple matches and the multiplicity option is set to Report error, the task does not fail as expected.
If you select a single target and write to multiple tables, the task fails.
You cannot read from or write to a table that has special or i18n characters in column names or the table name.
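Because of this limitation, it can help to validate table and column names before using them in a mapping. The sketch below assumes that names restricted to ASCII letters, digits, and underscores are safe; that pattern is an assumption derived from the restriction above, not a documented rule of the connector:

```python
import re

# Assumed-safe identifier pattern: ASCII letters, digits, and underscores,
# not starting with a digit. Names outside this pattern (special or i18n
# characters) are rejected, per the connector limitation described above.
SAFE_IDENTIFIER = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def is_safe_identifier(name: str) -> bool:
    """Return True if the table or column name uses only supported characters."""
    return bool(SAFE_IDENTIFIER.match(name))

# Hypothetical names: the last two contain special or i18n characters.
for name in ["ORDERS", "order_id", "ordre_n°", "顧客"]:
    print(name, is_safe_identifier(name))
```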