Cloud Data Integration Connectors All Products
When a mapping enabled for pushdown optimization uses a Snowflake Cloud Data Warehouse V2 connection to write to a Snowflake target and contains parallel Joiner transformations configured between multiple Snowflake or Amazon S3 sources, pushdown fails. The mapping, however, runs successfully without pushdown optimization. (February 2021)
When a mapping enabled for pushdown optimization uses a Snowflake Cloud Data Warehouse V2 connection to write to a Snowflake target from an Amazon S3 source and contains more than two Joiner transformations, pushdown fails. The mapping, however, runs successfully without pushdown optimization. (February 2021)
An elastic mapping enabled with the truncate target option fails when the target contains special characters. (February 2021)
Snowflake ticket number: 00140864
When you run a Snowflake ODBC mapping enabled with pushdown optimization and a transformation in the mapping contains a REPLACECHR() function, incorrect data is written to the target.
Workaround: Disable pushdown optimization in the mapping and then run the mapping. (February 2021)
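As an illustration of the kind of expression affected, the following is a minimal sketch of a REPLACECHR() call in an Expression transformation. The port name IN_PHONE is a placeholder, not from this document; the first argument is the case-sensitivity flag:

```
REPLACECHR(0, IN_PHONE, '-', NULL)
```

An expression of this form returns correct results when the mapping runs without pushdown optimization.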
When you configure a lookup override with uncached lookup enabled, and you set the condition to determine the behavior for multiple matches, an error occurs. (February 2021)
If you configure an uncached lookup and set the behavior for multiple matches against a Snowflake source that contains case-sensitive table and column names, the mapping fails with the following error: com.informatica.cci.cloud.client.impl.CCIClientExceptionImpl: Invalid expression string for filter condition.
When you read data from multiple tables joined using the add related objects option and the table and column names are case sensitive, the mapping fails. (October 2020)
When you migrate Snowflake mappings from one environment to another, for example from test to production, then configure an override for the database and schema in the runtime properties in the production environment, and the metadata of the imported table has changed, the mapping fails.
Workaround: Specify the override for the database and schema in the Advanced JDBC URL Parameters field of the Snowflake Cloud Data Warehouse V2 connection in the following format: db=<database_name>&schema=<schema_name>. Re-run the mapping. (August 2020)
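For example, if the production tables reside in a database named SALES_DB and a schema named ANALYTICS (both names are placeholders for illustration), the Advanced JDBC URL Parameters field would contain:

```
db=SALES_DB&schema=ANALYTICS
```

Separate each parameter with an ampersand, as shown in the format above.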
If some records are rejected while writing large amounts of data to Snowflake, the specified rejected file might not display some of the rejected records even though the statistics of rejected records appear correctly in the session logs. (April 2020)
When you configure pushdown optimization for a mapping that uses a Snowflake Cloud Data Warehouse V2 connection to write data from Amazon S3 V2 to Snowflake, and the table names contain special characters, the mapping fails with a DTM error. (October 2020)
Snowflake ticket number: 00141167
When you configure pushdown optimization for a mapping that uses a Snowflake Cloud Data Warehouse V2 connection to write Avro files containing special characters from Amazon S3 V2 to Snowflake, the mapping fails. (October 2020)
Snowflake ticket number: 00143259